Jan 26 07:47:41 localhost kernel: Linux version 5.14.0-661.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026
Jan 26 07:47:41 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Jan 26 07:47:41 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 26 07:47:41 localhost kernel: BIOS-provided physical RAM map:
Jan 26 07:47:41 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 26 07:47:41 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 26 07:47:41 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 26 07:47:41 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Jan 26 07:47:41 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Jan 26 07:47:41 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 26 07:47:41 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 26 07:47:41 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000043fffffff] usable
Jan 26 07:47:41 localhost kernel: NX (Execute Disable) protection: active
Jan 26 07:47:41 localhost kernel: APIC: Static calls initialized
Jan 26 07:47:41 localhost kernel: SMBIOS 2.8 present.
Jan 26 07:47:41 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Jan 26 07:47:41 localhost kernel: Hypervisor detected: KVM
Jan 26 07:47:41 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 26 07:47:41 localhost kernel: kvm-clock: using sched offset of 3210427770 cycles
Jan 26 07:47:41 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 26 07:47:41 localhost kernel: tsc: Detected 2800.000 MHz processor
Jan 26 07:47:41 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 26 07:47:41 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 26 07:47:41 localhost kernel: last_pfn = 0x440000 max_arch_pfn = 0x400000000
Jan 26 07:47:41 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 26 07:47:41 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Jan 26 07:47:41 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Jan 26 07:47:41 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Jan 26 07:47:41 localhost kernel: Using GB pages for direct mapping
Jan 26 07:47:41 localhost kernel: RAMDISK: [mem 0x2d426000-0x32a0afff]
Jan 26 07:47:41 localhost kernel: ACPI: Early table checksum verification disabled
Jan 26 07:47:41 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Jan 26 07:47:41 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 26 07:47:41 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 26 07:47:41 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 26 07:47:41 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Jan 26 07:47:41 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 26 07:47:41 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 26 07:47:41 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Jan 26 07:47:41 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Jan 26 07:47:41 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Jan 26 07:47:41 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Jan 26 07:47:41 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Jan 26 07:47:41 localhost kernel: No NUMA configuration found
Jan 26 07:47:41 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000043fffffff]
Jan 26 07:47:41 localhost kernel: NODE_DATA(0) allocated [mem 0x43ffd3000-0x43fffdfff]
Jan 26 07:47:41 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Jan 26 07:47:41 localhost kernel: Zone ranges:
Jan 26 07:47:41 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Jan 26 07:47:41 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Jan 26 07:47:41 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000043fffffff]
Jan 26 07:47:41 localhost kernel:   Device   empty
Jan 26 07:47:41 localhost kernel: Movable zone start for each node
Jan 26 07:47:41 localhost kernel: Early memory node ranges
Jan 26 07:47:41 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Jan 26 07:47:41 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Jan 26 07:47:41 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000043fffffff]
Jan 26 07:47:41 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000043fffffff]
Jan 26 07:47:41 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 26 07:47:41 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 26 07:47:41 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Jan 26 07:47:41 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Jan 26 07:47:41 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 26 07:47:41 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 26 07:47:41 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 26 07:47:41 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 26 07:47:41 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 26 07:47:41 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 26 07:47:41 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 26 07:47:41 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 26 07:47:41 localhost kernel: TSC deadline timer available
Jan 26 07:47:41 localhost kernel: CPU topo: Max. logical packages:   8
Jan 26 07:47:41 localhost kernel: CPU topo: Max. logical dies:       8
Jan 26 07:47:41 localhost kernel: CPU topo: Max. dies per package:   1
Jan 26 07:47:41 localhost kernel: CPU topo: Max. threads per core:   1
Jan 26 07:47:41 localhost kernel: CPU topo: Num. cores per package:     1
Jan 26 07:47:41 localhost kernel: CPU topo: Num. threads per package:   1
Jan 26 07:47:41 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Jan 26 07:47:41 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 26 07:47:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Jan 26 07:47:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Jan 26 07:47:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Jan 26 07:47:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Jan 26 07:47:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Jan 26 07:47:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Jan 26 07:47:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Jan 26 07:47:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Jan 26 07:47:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Jan 26 07:47:41 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Jan 26 07:47:41 localhost kernel: Booting paravirtualized kernel on KVM
Jan 26 07:47:41 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 26 07:47:41 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Jan 26 07:47:41 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Jan 26 07:47:41 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Jan 26 07:47:41 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Jan 26 07:47:41 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Jan 26 07:47:41 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 26 07:47:41 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64", will be passed to user space.
Jan 26 07:47:41 localhost kernel: random: crng init done
Jan 26 07:47:41 localhost kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Jan 26 07:47:41 localhost kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 26 07:47:41 localhost kernel: Fallback order for Node 0: 0 
Jan 26 07:47:41 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 4128475
Jan 26 07:47:41 localhost kernel: Policy zone: Normal
Jan 26 07:47:41 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 26 07:47:41 localhost kernel: software IO TLB: area num 8.
Jan 26 07:47:41 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Jan 26 07:47:41 localhost kernel: ftrace: allocating 49417 entries in 194 pages
Jan 26 07:47:41 localhost kernel: ftrace: allocated 194 pages with 3 groups
Jan 26 07:47:41 localhost kernel: Dynamic Preempt: voluntary
Jan 26 07:47:41 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 26 07:47:41 localhost kernel: rcu:         RCU event tracing is enabled.
Jan 26 07:47:41 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Jan 26 07:47:41 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Jan 26 07:47:41 localhost kernel:         Rude variant of Tasks RCU enabled.
Jan 26 07:47:41 localhost kernel:         Tracing variant of Tasks RCU enabled.
Jan 26 07:47:41 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 26 07:47:41 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Jan 26 07:47:41 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 26 07:47:41 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 26 07:47:41 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 26 07:47:41 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Jan 26 07:47:41 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 26 07:47:41 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Jan 26 07:47:41 localhost kernel: Console: colour VGA+ 80x25
Jan 26 07:47:41 localhost kernel: printk: console [ttyS0] enabled
Jan 26 07:47:41 localhost kernel: ACPI: Core revision 20230331
Jan 26 07:47:41 localhost kernel: APIC: Switch to symmetric I/O mode setup
Jan 26 07:47:41 localhost kernel: x2apic enabled
Jan 26 07:47:41 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Jan 26 07:47:41 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Jan 26 07:47:41 localhost kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Jan 26 07:47:41 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 26 07:47:41 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jan 26 07:47:41 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jan 26 07:47:41 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 26 07:47:41 localhost kernel: Spectre V2 : Mitigation: Retpolines
Jan 26 07:47:41 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jan 26 07:47:41 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Jan 26 07:47:41 localhost kernel: RETBleed: Mitigation: untrained return thunk
Jan 26 07:47:41 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 26 07:47:41 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 26 07:47:41 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Jan 26 07:47:41 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Jan 26 07:47:41 localhost kernel: x86/bugs: return thunk changed
Jan 26 07:47:41 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Jan 26 07:47:41 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 26 07:47:41 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 26 07:47:41 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 26 07:47:41 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Jan 26 07:47:41 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jan 26 07:47:41 localhost kernel: Freeing SMP alternatives memory: 40K
Jan 26 07:47:41 localhost kernel: pid_max: default: 32768 minimum: 301
Jan 26 07:47:41 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Jan 26 07:47:41 localhost kernel: landlock: Up and running.
Jan 26 07:47:41 localhost kernel: Yama: becoming mindful.
Jan 26 07:47:41 localhost kernel: SELinux:  Initializing.
Jan 26 07:47:41 localhost kernel: LSM support for eBPF active
Jan 26 07:47:41 localhost kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Jan 26 07:47:41 localhost kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Jan 26 07:47:41 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Jan 26 07:47:41 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Jan 26 07:47:41 localhost kernel: ... version:                0
Jan 26 07:47:41 localhost kernel: ... bit width:              48
Jan 26 07:47:41 localhost kernel: ... generic registers:      6
Jan 26 07:47:41 localhost kernel: ... value mask:             0000ffffffffffff
Jan 26 07:47:41 localhost kernel: ... max period:             00007fffffffffff
Jan 26 07:47:41 localhost kernel: ... fixed-purpose events:   0
Jan 26 07:47:41 localhost kernel: ... event mask:             000000000000003f
Jan 26 07:47:41 localhost kernel: signal: max sigframe size: 1776
Jan 26 07:47:41 localhost kernel: rcu: Hierarchical SRCU implementation.
Jan 26 07:47:41 localhost kernel: rcu:         Max phase no-delay instances is 400.
Jan 26 07:47:41 localhost kernel: smp: Bringing up secondary CPUs ...
Jan 26 07:47:41 localhost kernel: smpboot: x86: Booting SMP configuration:
Jan 26 07:47:41 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Jan 26 07:47:41 localhost kernel: smp: Brought up 1 node, 8 CPUs
Jan 26 07:47:41 localhost kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Jan 26 07:47:41 localhost kernel: node 0 deferred pages initialised in 15ms
Jan 26 07:47:41 localhost kernel: Memory: 16008472K/16776676K available (16384K kernel code, 5797K rwdata, 13916K rodata, 4200K init, 7192K bss, 761736K reserved, 0K cma-reserved)
Jan 26 07:47:41 localhost kernel: devtmpfs: initialized
Jan 26 07:47:41 localhost kernel: x86/mm: Memory block size: 128MB
Jan 26 07:47:41 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 26 07:47:41 localhost kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Jan 26 07:47:41 localhost kernel: pinctrl core: initialized pinctrl subsystem
Jan 26 07:47:41 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 26 07:47:41 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Jan 26 07:47:41 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 26 07:47:41 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 26 07:47:41 localhost kernel: audit: initializing netlink subsys (disabled)
Jan 26 07:47:41 localhost kernel: audit: type=2000 audit(1769413659.712:1): state=initialized audit_enabled=0 res=1
Jan 26 07:47:41 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Jan 26 07:47:41 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 26 07:47:41 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 26 07:47:41 localhost kernel: cpuidle: using governor menu
Jan 26 07:47:41 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 26 07:47:41 localhost kernel: PCI: Using configuration type 1 for base access
Jan 26 07:47:41 localhost kernel: PCI: Using configuration type 1 for extended access
Jan 26 07:47:41 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 26 07:47:41 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 26 07:47:41 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 26 07:47:41 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 26 07:47:41 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 26 07:47:41 localhost kernel: Demotion targets for Node 0: null
Jan 26 07:47:41 localhost kernel: cryptd: max_cpu_qlen set to 1000
Jan 26 07:47:41 localhost kernel: ACPI: Added _OSI(Module Device)
Jan 26 07:47:41 localhost kernel: ACPI: Added _OSI(Processor Device)
Jan 26 07:47:41 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 26 07:47:41 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 26 07:47:41 localhost kernel: ACPI: Interpreter enabled
Jan 26 07:47:41 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Jan 26 07:47:41 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Jan 26 07:47:41 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 26 07:47:41 localhost kernel: PCI: Using E820 reservations for host bridge windows
Jan 26 07:47:41 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Jan 26 07:47:41 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 26 07:47:41 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Jan 26 07:47:41 localhost kernel: acpiphp: Slot [3] registered
Jan 26 07:47:41 localhost kernel: acpiphp: Slot [4] registered
Jan 26 07:47:41 localhost kernel: acpiphp: Slot [5] registered
Jan 26 07:47:41 localhost kernel: acpiphp: Slot [6] registered
Jan 26 07:47:41 localhost kernel: acpiphp: Slot [7] registered
Jan 26 07:47:41 localhost kernel: acpiphp: Slot [8] registered
Jan 26 07:47:41 localhost kernel: acpiphp: Slot [9] registered
Jan 26 07:47:41 localhost kernel: acpiphp: Slot [10] registered
Jan 26 07:47:41 localhost kernel: acpiphp: Slot [11] registered
Jan 26 07:47:41 localhost kernel: acpiphp: Slot [12] registered
Jan 26 07:47:41 localhost kernel: acpiphp: Slot [13] registered
Jan 26 07:47:41 localhost kernel: acpiphp: Slot [14] registered
Jan 26 07:47:41 localhost kernel: acpiphp: Slot [15] registered
Jan 26 07:47:41 localhost kernel: acpiphp: Slot [16] registered
Jan 26 07:47:41 localhost kernel: acpiphp: Slot [17] registered
Jan 26 07:47:41 localhost kernel: acpiphp: Slot [18] registered
Jan 26 07:47:41 localhost kernel: acpiphp: Slot [19] registered
Jan 26 07:47:41 localhost kernel: acpiphp: Slot [20] registered
Jan 26 07:47:41 localhost kernel: acpiphp: Slot [21] registered
Jan 26 07:47:41 localhost kernel: acpiphp: Slot [22] registered
Jan 26 07:47:41 localhost kernel: acpiphp: Slot [23] registered
Jan 26 07:47:41 localhost kernel: acpiphp: Slot [24] registered
Jan 26 07:47:41 localhost kernel: acpiphp: Slot [25] registered
Jan 26 07:47:41 localhost kernel: acpiphp: Slot [26] registered
Jan 26 07:47:41 localhost kernel: acpiphp: Slot [27] registered
Jan 26 07:47:41 localhost kernel: acpiphp: Slot [28] registered
Jan 26 07:47:41 localhost kernel: acpiphp: Slot [29] registered
Jan 26 07:47:41 localhost kernel: acpiphp: Slot [30] registered
Jan 26 07:47:41 localhost kernel: acpiphp: Slot [31] registered
Jan 26 07:47:41 localhost kernel: PCI host bridge to bus 0000:00
Jan 26 07:47:41 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Jan 26 07:47:41 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Jan 26 07:47:41 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 26 07:47:41 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 26 07:47:41 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x440000000-0x4bfffffff window]
Jan 26 07:47:41 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 26 07:47:41 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Jan 26 07:47:41 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Jan 26 07:47:41 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Jan 26 07:47:41 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Jan 26 07:47:41 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Jan 26 07:47:41 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Jan 26 07:47:41 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Jan 26 07:47:41 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Jan 26 07:47:41 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Jan 26 07:47:41 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Jan 26 07:47:41 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Jan 26 07:47:41 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Jan 26 07:47:41 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Jan 26 07:47:41 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 26 07:47:41 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Jan 26 07:47:41 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Jan 26 07:47:41 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Jan 26 07:47:41 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Jan 26 07:47:41 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 26 07:47:41 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 26 07:47:41 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Jan 26 07:47:41 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Jan 26 07:47:41 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Jan 26 07:47:41 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Jan 26 07:47:41 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Jan 26 07:47:41 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Jan 26 07:47:41 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Jan 26 07:47:41 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Jan 26 07:47:41 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Jan 26 07:47:41 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Jan 26 07:47:41 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Jan 26 07:47:41 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Jan 26 07:47:41 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Jan 26 07:47:41 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Jan 26 07:47:41 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 26 07:47:41 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 26 07:47:41 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 26 07:47:41 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 26 07:47:41 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Jan 26 07:47:41 localhost kernel: iommu: Default domain type: Translated
Jan 26 07:47:41 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 26 07:47:41 localhost kernel: SCSI subsystem initialized
Jan 26 07:47:41 localhost kernel: ACPI: bus type USB registered
Jan 26 07:47:41 localhost kernel: usbcore: registered new interface driver usbfs
Jan 26 07:47:41 localhost kernel: usbcore: registered new interface driver hub
Jan 26 07:47:41 localhost kernel: usbcore: registered new device driver usb
Jan 26 07:47:41 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Jan 26 07:47:41 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Jan 26 07:47:41 localhost kernel: PTP clock support registered
Jan 26 07:47:41 localhost kernel: EDAC MC: Ver: 3.0.0
Jan 26 07:47:41 localhost kernel: NetLabel: Initializing
Jan 26 07:47:41 localhost kernel: NetLabel:  domain hash size = 128
Jan 26 07:47:41 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Jan 26 07:47:41 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Jan 26 07:47:41 localhost kernel: PCI: Using ACPI for IRQ routing
Jan 26 07:47:41 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Jan 26 07:47:41 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Jan 26 07:47:41 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Jan 26 07:47:41 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Jan 26 07:47:41 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Jan 26 07:47:41 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 26 07:47:41 localhost kernel: vgaarb: loaded
Jan 26 07:47:41 localhost kernel: clocksource: Switched to clocksource kvm-clock
Jan 26 07:47:41 localhost kernel: VFS: Disk quotas dquot_6.6.0
Jan 26 07:47:41 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 26 07:47:41 localhost kernel: pnp: PnP ACPI init
Jan 26 07:47:41 localhost kernel: pnp 00:03: [dma 2]
Jan 26 07:47:41 localhost kernel: pnp: PnP ACPI: found 5 devices
Jan 26 07:47:41 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 26 07:47:41 localhost kernel: NET: Registered PF_INET protocol family
Jan 26 07:47:41 localhost kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Jan 26 07:47:41 localhost kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear)
Jan 26 07:47:41 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 26 07:47:41 localhost kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 26 07:47:41 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Jan 26 07:47:41 localhost kernel: TCP: Hash tables configured (established 131072 bind 65536)
Jan 26 07:47:41 localhost kernel: MPTCP token hash table entries: 16384 (order: 6, 393216 bytes, linear)
Jan 26 07:47:41 localhost kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear)
Jan 26 07:47:41 localhost kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear)
Jan 26 07:47:41 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 26 07:47:41 localhost kernel: NET: Registered PF_XDP protocol family
Jan 26 07:47:41 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Jan 26 07:47:41 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Jan 26 07:47:41 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 26 07:47:41 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Jan 26 07:47:41 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x440000000-0x4bfffffff window]
Jan 26 07:47:41 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Jan 26 07:47:41 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Jan 26 07:47:41 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Jan 26 07:47:41 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 72571 usecs
Jan 26 07:47:41 localhost kernel: PCI: CLS 0 bytes, default 64
Jan 26 07:47:41 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 26 07:47:41 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Jan 26 07:47:41 localhost kernel: ACPI: bus type thunderbolt registered
Jan 26 07:47:41 localhost kernel: Trying to unpack rootfs image as initramfs...
Jan 26 07:47:41 localhost kernel: Initialise system trusted keyrings
Jan 26 07:47:41 localhost kernel: Key type blacklist registered
Jan 26 07:47:41 localhost kernel: workingset: timestamp_bits=36 max_order=22 bucket_order=0
Jan 26 07:47:41 localhost kernel: zbud: loaded
Jan 26 07:47:41 localhost kernel: integrity: Platform Keyring initialized
Jan 26 07:47:41 localhost kernel: integrity: Machine keyring initialized
Jan 26 07:47:41 localhost kernel: Freeing initrd memory: 87956K
Jan 26 07:47:41 localhost kernel: NET: Registered PF_ALG protocol family
Jan 26 07:47:41 localhost kernel: xor: automatically using best checksumming function   avx       
Jan 26 07:47:41 localhost kernel: Key type asymmetric registered
Jan 26 07:47:41 localhost kernel: Asymmetric key parser 'x509' registered
Jan 26 07:47:41 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Jan 26 07:47:41 localhost kernel: io scheduler mq-deadline registered
Jan 26 07:47:41 localhost kernel: io scheduler kyber registered
Jan 26 07:47:41 localhost kernel: io scheduler bfq registered
Jan 26 07:47:41 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Jan 26 07:47:41 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Jan 26 07:47:41 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Jan 26 07:47:41 localhost kernel: ACPI: button: Power Button [PWRF]
Jan 26 07:47:41 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Jan 26 07:47:41 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Jan 26 07:47:41 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Jan 26 07:47:41 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 26 07:47:41 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 26 07:47:41 localhost kernel: Non-volatile memory driver v1.3
Jan 26 07:47:41 localhost kernel: rdac: device handler registered
Jan 26 07:47:41 localhost kernel: hp_sw: device handler registered
Jan 26 07:47:41 localhost kernel: emc: device handler registered
Jan 26 07:47:41 localhost kernel: alua: device handler registered
Jan 26 07:47:41 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Jan 26 07:47:41 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Jan 26 07:47:41 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Jan 26 07:47:41 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Jan 26 07:47:41 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Jan 26 07:47:41 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Jan 26 07:47:41 localhost kernel: usb usb1: Product: UHCI Host Controller
Jan 26 07:47:41 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-661.el9.x86_64 uhci_hcd
Jan 26 07:47:41 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Jan 26 07:47:41 localhost kernel: hub 1-0:1.0: USB hub found
Jan 26 07:47:41 localhost kernel: hub 1-0:1.0: 2 ports detected
Jan 26 07:47:41 localhost kernel: usbcore: registered new interface driver usbserial_generic
Jan 26 07:47:41 localhost kernel: usbserial: USB Serial support registered for generic
Jan 26 07:47:41 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 26 07:47:41 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 26 07:47:41 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 26 07:47:41 localhost kernel: mousedev: PS/2 mouse device common for all mice
Jan 26 07:47:41 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Jan 26 07:47:41 localhost kernel: rtc_cmos 00:04: registered as rtc0
Jan 26 07:47:41 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Jan 26 07:47:41 localhost kernel: rtc_cmos 00:04: setting system clock to 2026-01-26T07:47:40 UTC (1769413660)
Jan 26 07:47:41 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Jan 26 07:47:41 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Jan 26 07:47:41 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Jan 26 07:47:41 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Jan 26 07:47:41 localhost kernel: usbcore: registered new interface driver usbhid
Jan 26 07:47:41 localhost kernel: usbhid: USB HID core driver
Jan 26 07:47:41 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Jan 26 07:47:41 localhost kernel: drop_monitor: Initializing network drop monitor service
Jan 26 07:47:41 localhost kernel: Initializing XFRM netlink socket
Jan 26 07:47:41 localhost kernel: NET: Registered PF_INET6 protocol family
Jan 26 07:47:41 localhost kernel: Segment Routing with IPv6
Jan 26 07:47:41 localhost kernel: NET: Registered PF_PACKET protocol family
Jan 26 07:47:41 localhost kernel: mpls_gso: MPLS GSO support
Jan 26 07:47:41 localhost kernel: IPI shorthand broadcast: enabled
Jan 26 07:47:41 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Jan 26 07:47:41 localhost kernel: AES CTR mode by8 optimization enabled
Jan 26 07:47:41 localhost kernel: sched_clock: Marking stable (1204001700, 146862390)->(1474180729, -123316639)
Jan 26 07:47:41 localhost kernel: registered taskstats version 1
Jan 26 07:47:41 localhost kernel: Loading compiled-in X.509 certificates
Jan 26 07:47:41 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 26 07:47:41 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Jan 26 07:47:41 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Jan 26 07:47:41 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Jan 26 07:47:41 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Jan 26 07:47:41 localhost kernel: Demotion targets for Node 0: null
Jan 26 07:47:41 localhost kernel: page_owner is disabled
Jan 26 07:47:41 localhost kernel: Key type .fscrypt registered
Jan 26 07:47:41 localhost kernel: Key type fscrypt-provisioning registered
Jan 26 07:47:41 localhost kernel: Key type big_key registered
Jan 26 07:47:41 localhost kernel: Key type encrypted registered
Jan 26 07:47:41 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 26 07:47:41 localhost kernel: Loading compiled-in module X.509 certificates
Jan 26 07:47:41 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 26 07:47:41 localhost kernel: ima: Allocated hash algorithm: sha256
Jan 26 07:47:41 localhost kernel: ima: No architecture policies found
Jan 26 07:47:41 localhost kernel: evm: Initialising EVM extended attributes:
Jan 26 07:47:41 localhost kernel: evm: security.selinux
Jan 26 07:47:41 localhost kernel: evm: security.SMACK64 (disabled)
Jan 26 07:47:41 localhost kernel: evm: security.SMACK64EXEC (disabled)
Jan 26 07:47:41 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Jan 26 07:47:41 localhost kernel: evm: security.SMACK64MMAP (disabled)
Jan 26 07:47:41 localhost kernel: evm: security.apparmor (disabled)
Jan 26 07:47:41 localhost kernel: evm: security.ima
Jan 26 07:47:41 localhost kernel: evm: security.capability
Jan 26 07:47:41 localhost kernel: evm: HMAC attrs: 0x1
Jan 26 07:47:41 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Jan 26 07:47:41 localhost kernel: Running certificate verification RSA selftest
Jan 26 07:47:41 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Jan 26 07:47:41 localhost kernel: Running certificate verification ECDSA selftest
Jan 26 07:47:41 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Jan 26 07:47:41 localhost kernel: clk: Disabling unused clocks
Jan 26 07:47:41 localhost kernel: Freeing unused decrypted memory: 2028K
Jan 26 07:47:41 localhost kernel: Freeing unused kernel image (initmem) memory: 4200K
Jan 26 07:47:41 localhost kernel: Write protecting the kernel read-only data: 30720k
Jan 26 07:47:41 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 420K
Jan 26 07:47:41 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Jan 26 07:47:41 localhost kernel: Run /init as init process
Jan 26 07:47:41 localhost kernel:   with arguments:
Jan 26 07:47:41 localhost kernel:     /init
Jan 26 07:47:41 localhost kernel:   with environment:
Jan 26 07:47:41 localhost kernel:     HOME=/
Jan 26 07:47:41 localhost kernel:     TERM=linux
Jan 26 07:47:41 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64
Jan 26 07:47:41 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 26 07:47:41 localhost systemd[1]: Detected virtualization kvm.
Jan 26 07:47:41 localhost systemd[1]: Detected architecture x86-64.
Jan 26 07:47:41 localhost systemd[1]: Running in initrd.
Jan 26 07:47:41 localhost systemd[1]: No hostname configured, using default hostname.
Jan 26 07:47:41 localhost systemd[1]: Hostname set to <localhost>.
Jan 26 07:47:41 localhost systemd[1]: Initializing machine ID from VM UUID.
Jan 26 07:47:41 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Jan 26 07:47:41 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Jan 26 07:47:41 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Jan 26 07:47:41 localhost kernel: usb 1-1: Manufacturer: QEMU
Jan 26 07:47:41 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Jan 26 07:47:41 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Jan 26 07:47:41 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Jan 26 07:47:41 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Jan 26 07:47:41 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Jan 26 07:47:41 localhost systemd[1]: Reached target Local Encrypted Volumes.
Jan 26 07:47:41 localhost systemd[1]: Reached target Initrd /usr File System.
Jan 26 07:47:41 localhost systemd[1]: Reached target Local File Systems.
Jan 26 07:47:41 localhost systemd[1]: Reached target Path Units.
Jan 26 07:47:41 localhost systemd[1]: Reached target Slice Units.
Jan 26 07:47:41 localhost systemd[1]: Reached target Swaps.
Jan 26 07:47:41 localhost systemd[1]: Reached target Timer Units.
Jan 26 07:47:41 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 26 07:47:41 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Jan 26 07:47:41 localhost systemd[1]: Listening on Journal Socket.
Jan 26 07:47:41 localhost systemd[1]: Listening on udev Control Socket.
Jan 26 07:47:41 localhost systemd[1]: Listening on udev Kernel Socket.
Jan 26 07:47:41 localhost systemd[1]: Reached target Socket Units.
Jan 26 07:47:41 localhost systemd[1]: Starting Create List of Static Device Nodes...
Jan 26 07:47:41 localhost systemd[1]: Starting Journal Service...
Jan 26 07:47:41 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 26 07:47:41 localhost systemd[1]: Starting Apply Kernel Variables...
Jan 26 07:47:41 localhost systemd[1]: Starting Create System Users...
Jan 26 07:47:41 localhost systemd[1]: Starting Setup Virtual Console...
Jan 26 07:47:41 localhost systemd[1]: Finished Create List of Static Device Nodes.
Jan 26 07:47:41 localhost systemd[1]: Finished Apply Kernel Variables.
Jan 26 07:47:41 localhost systemd[1]: Finished Create System Users.
Jan 26 07:47:41 localhost systemd-journald[310]: Journal started
Jan 26 07:47:41 localhost systemd-journald[310]: Runtime Journal (/run/log/journal/99f843075f5c4de69e22fda82fab04a3) is 8.0M, max 314.6M, 306.6M free.
Jan 26 07:47:41 localhost systemd-sysusers[314]: Creating group 'users' with GID 100.
Jan 26 07:47:41 localhost systemd-sysusers[314]: Creating group 'dbus' with GID 81.
Jan 26 07:47:41 localhost systemd-sysusers[314]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Jan 26 07:47:41 localhost systemd[1]: Started Journal Service.
Jan 26 07:47:41 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 26 07:47:41 localhost systemd[1]: Starting Create Volatile Files and Directories...
Jan 26 07:47:41 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 26 07:47:41 localhost systemd[1]: Finished Create Volatile Files and Directories.
Jan 26 07:47:41 localhost systemd[1]: Finished Setup Virtual Console.
Jan 26 07:47:41 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Jan 26 07:47:41 localhost systemd[1]: Starting dracut cmdline hook...
Jan 26 07:47:41 localhost dracut-cmdline[328]: dracut-9 dracut-057-102.git20250818.el9
Jan 26 07:47:41 localhost dracut-cmdline[328]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 26 07:47:41 localhost systemd[1]: Finished dracut cmdline hook.
Jan 26 07:47:41 localhost systemd[1]: Starting dracut pre-udev hook...
Jan 26 07:47:41 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 26 07:47:41 localhost kernel: device-mapper: uevent: version 1.0.3
Jan 26 07:47:41 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Jan 26 07:47:41 localhost kernel: RPC: Registered named UNIX socket transport module.
Jan 26 07:47:41 localhost kernel: RPC: Registered udp transport module.
Jan 26 07:47:41 localhost kernel: RPC: Registered tcp transport module.
Jan 26 07:47:41 localhost kernel: RPC: Registered tcp-with-tls transport module.
Jan 26 07:47:41 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Jan 26 07:47:41 localhost rpc.statd[446]: Version 2.5.4 starting
Jan 26 07:47:41 localhost rpc.statd[446]: Initializing NSM state
Jan 26 07:47:41 localhost rpc.idmapd[451]: Setting log level to 0
Jan 26 07:47:41 localhost systemd[1]: Finished dracut pre-udev hook.
Jan 26 07:47:41 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 26 07:47:42 localhost systemd-udevd[464]: Using default interface naming scheme 'rhel-9.0'.
Jan 26 07:47:42 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 26 07:47:42 localhost systemd[1]: Starting dracut pre-trigger hook...
Jan 26 07:47:42 localhost systemd[1]: Finished dracut pre-trigger hook.
Jan 26 07:47:42 localhost systemd[1]: Starting Coldplug All udev Devices...
Jan 26 07:47:42 localhost systemd[1]: Created slice Slice /system/modprobe.
Jan 26 07:47:42 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 26 07:47:42 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 26 07:47:42 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 26 07:47:42 localhost systemd[1]: Mounting Kernel Configuration File System...
Jan 26 07:47:42 localhost systemd[1]: Finished Coldplug All udev Devices.
Jan 26 07:47:42 localhost systemd[1]: Mounted Kernel Configuration File System.
Jan 26 07:47:42 localhost systemd[1]: Reached target System Initialization.
Jan 26 07:47:42 localhost systemd[1]: Reached target Basic System.
Jan 26 07:47:42 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 26 07:47:42 localhost systemd[1]: Reached target Network.
Jan 26 07:47:42 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 26 07:47:42 localhost systemd[1]: Starting dracut initqueue hook...
Jan 26 07:47:42 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Jan 26 07:47:42 localhost kernel: virtio_blk virtio2: [vda] 251658240 512-byte logical blocks (129 GB/120 GiB)
Jan 26 07:47:42 localhost kernel: libata version 3.00 loaded.
Jan 26 07:47:42 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Jan 26 07:47:42 localhost kernel:  vda: vda1
Jan 26 07:47:42 localhost kernel: scsi host0: ata_piix
Jan 26 07:47:42 localhost kernel: scsi host1: ata_piix
Jan 26 07:47:42 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Jan 26 07:47:42 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Jan 26 07:47:42 localhost systemd-udevd[480]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 07:47:42 localhost systemd[1]: Found device /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 26 07:47:42 localhost systemd[1]: Reached target Initrd Root Device.
Jan 26 07:47:42 localhost kernel: ata1: found unknown device (class 0)
Jan 26 07:47:42 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Jan 26 07:47:42 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Jan 26 07:47:42 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Jan 26 07:47:42 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Jan 26 07:47:42 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 26 07:47:42 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Jan 26 07:47:42 localhost systemd[1]: Finished dracut initqueue hook.
Jan 26 07:47:42 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Jan 26 07:47:42 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Jan 26 07:47:42 localhost systemd[1]: Reached target Remote File Systems.
Jan 26 07:47:42 localhost systemd[1]: Starting dracut pre-mount hook...
Jan 26 07:47:42 localhost systemd[1]: Finished dracut pre-mount hook.
Jan 26 07:47:42 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40...
Jan 26 07:47:42 localhost systemd-fsck[559]: /usr/sbin/fsck.xfs: XFS file system.
Jan 26 07:47:42 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 26 07:47:42 localhost systemd[1]: Mounting /sysroot...
Jan 26 07:47:43 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Jan 26 07:47:43 localhost kernel: XFS (vda1): Mounting V5 Filesystem 22ac9141-3960-4912-b20e-19fc8a328d40
Jan 26 07:47:43 localhost kernel: XFS (vda1): Ending clean mount
Jan 26 07:47:43 localhost systemd[1]: Mounted /sysroot.
Jan 26 07:47:43 localhost systemd[1]: Reached target Initrd Root File System.
Jan 26 07:47:43 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Jan 26 07:47:43 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 26 07:47:43 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Jan 26 07:47:43 localhost systemd[1]: Reached target Initrd File Systems.
Jan 26 07:47:43 localhost systemd[1]: Reached target Initrd Default Target.
Jan 26 07:47:43 localhost systemd[1]: Starting dracut mount hook...
Jan 26 07:47:43 localhost systemd[1]: Finished dracut mount hook.
Jan 26 07:47:43 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Jan 26 07:47:43 localhost rpc.idmapd[451]: exiting on signal 15
Jan 26 07:47:43 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Jan 26 07:47:43 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Jan 26 07:47:43 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Jan 26 07:47:43 localhost systemd[1]: Stopped target Network.
Jan 26 07:47:43 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Jan 26 07:47:43 localhost systemd[1]: Stopped target Timer Units.
Jan 26 07:47:43 localhost systemd[1]: dbus.socket: Deactivated successfully.
Jan 26 07:47:43 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Jan 26 07:47:43 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 26 07:47:43 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Jan 26 07:47:43 localhost systemd[1]: Stopped target Initrd Default Target.
Jan 26 07:47:43 localhost systemd[1]: Stopped target Basic System.
Jan 26 07:47:43 localhost systemd[1]: Stopped target Initrd Root Device.
Jan 26 07:47:43 localhost systemd[1]: Stopped target Initrd /usr File System.
Jan 26 07:47:43 localhost systemd[1]: Stopped target Path Units.
Jan 26 07:47:43 localhost systemd[1]: Stopped target Remote File Systems.
Jan 26 07:47:43 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Jan 26 07:47:43 localhost systemd[1]: Stopped target Slice Units.
Jan 26 07:47:43 localhost systemd[1]: Stopped target Socket Units.
Jan 26 07:47:43 localhost systemd[1]: Stopped target System Initialization.
Jan 26 07:47:43 localhost systemd[1]: Stopped target Local File Systems.
Jan 26 07:47:43 localhost systemd[1]: Stopped target Swaps.
Jan 26 07:47:43 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Jan 26 07:47:43 localhost systemd[1]: Stopped dracut mount hook.
Jan 26 07:47:43 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 26 07:47:43 localhost systemd[1]: Stopped dracut pre-mount hook.
Jan 26 07:47:43 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Jan 26 07:47:43 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 26 07:47:43 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Jan 26 07:47:43 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 26 07:47:43 localhost systemd[1]: Stopped dracut initqueue hook.
Jan 26 07:47:43 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 26 07:47:43 localhost systemd[1]: Stopped Apply Kernel Variables.
Jan 26 07:47:43 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 26 07:47:43 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Jan 26 07:47:43 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 26 07:47:43 localhost systemd[1]: Stopped Coldplug All udev Devices.
Jan 26 07:47:43 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 26 07:47:43 localhost systemd[1]: Stopped dracut pre-trigger hook.
Jan 26 07:47:43 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Jan 26 07:47:43 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 26 07:47:43 localhost systemd[1]: Stopped Setup Virtual Console.
Jan 26 07:47:43 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Jan 26 07:47:43 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 26 07:47:43 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 26 07:47:43 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Jan 26 07:47:43 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 26 07:47:43 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Jan 26 07:47:43 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 26 07:47:43 localhost systemd[1]: Closed udev Control Socket.
Jan 26 07:47:43 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 26 07:47:43 localhost systemd[1]: Closed udev Kernel Socket.
Jan 26 07:47:43 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 26 07:47:43 localhost systemd[1]: Stopped dracut pre-udev hook.
Jan 26 07:47:43 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 26 07:47:43 localhost systemd[1]: Stopped dracut cmdline hook.
Jan 26 07:47:43 localhost systemd[1]: Starting Cleanup udev Database...
Jan 26 07:47:43 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 26 07:47:43 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Jan 26 07:47:43 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 26 07:47:43 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Jan 26 07:47:43 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Jan 26 07:47:43 localhost systemd[1]: Stopped Create System Users.
Jan 26 07:47:43 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Jan 26 07:47:43 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Jan 26 07:47:43 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 26 07:47:43 localhost systemd[1]: Finished Cleanup udev Database.
Jan 26 07:47:43 localhost systemd[1]: Reached target Switch Root.
Jan 26 07:47:43 localhost systemd[1]: Starting Switch Root...
Jan 26 07:47:43 localhost systemd[1]: Switching root.
Jan 26 07:47:43 localhost systemd-journald[310]: Journal stopped
Jan 26 07:47:44 localhost systemd-journald[310]: Received SIGTERM from PID 1 (systemd).
Jan 26 07:47:44 localhost kernel: audit: type=1404 audit(1769413663.618:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Jan 26 07:47:44 localhost kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 07:47:44 localhost kernel: SELinux:  policy capability open_perms=1
Jan 26 07:47:44 localhost kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 07:47:44 localhost kernel: SELinux:  policy capability always_check_network=0
Jan 26 07:47:44 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 07:47:44 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 07:47:44 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 07:47:44 localhost kernel: audit: type=1403 audit(1769413663.751:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 26 07:47:44 localhost systemd[1]: Successfully loaded SELinux policy in 137.460ms.
Jan 26 07:47:44 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 37.915ms.
Jan 26 07:47:44 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 26 07:47:44 localhost systemd[1]: Detected virtualization kvm.
Jan 26 07:47:44 localhost systemd[1]: Detected architecture x86-64.
Jan 26 07:47:44 localhost systemd-rc-local-generator[641]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 07:47:44 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Jan 26 07:47:44 localhost systemd[1]: Stopped Switch Root.
Jan 26 07:47:44 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 26 07:47:44 localhost systemd[1]: Created slice Slice /system/getty.
Jan 26 07:47:44 localhost systemd[1]: Created slice Slice /system/serial-getty.
Jan 26 07:47:44 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Jan 26 07:47:44 localhost systemd[1]: Created slice User and Session Slice.
Jan 26 07:47:44 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Jan 26 07:47:44 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Jan 26 07:47:44 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Jan 26 07:47:44 localhost systemd[1]: Reached target Local Encrypted Volumes.
Jan 26 07:47:44 localhost systemd[1]: Stopped target Switch Root.
Jan 26 07:47:44 localhost systemd[1]: Stopped target Initrd File Systems.
Jan 26 07:47:44 localhost systemd[1]: Stopped target Initrd Root File System.
Jan 26 07:47:44 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Jan 26 07:47:44 localhost systemd[1]: Reached target Path Units.
Jan 26 07:47:44 localhost systemd[1]: Reached target rpc_pipefs.target.
Jan 26 07:47:44 localhost systemd[1]: Reached target Slice Units.
Jan 26 07:47:44 localhost systemd[1]: Reached target Swaps.
Jan 26 07:47:44 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Jan 26 07:47:44 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Jan 26 07:47:44 localhost systemd[1]: Reached target RPC Port Mapper.
Jan 26 07:47:44 localhost systemd[1]: Listening on Process Core Dump Socket.
Jan 26 07:47:44 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Jan 26 07:47:44 localhost systemd[1]: Listening on udev Control Socket.
Jan 26 07:47:44 localhost systemd[1]: Listening on udev Kernel Socket.
Jan 26 07:47:44 localhost systemd[1]: Mounting Huge Pages File System...
Jan 26 07:47:44 localhost systemd[1]: Mounting POSIX Message Queue File System...
Jan 26 07:47:44 localhost systemd[1]: Mounting Kernel Debug File System...
Jan 26 07:47:44 localhost systemd[1]: Mounting Kernel Trace File System...
Jan 26 07:47:44 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 26 07:47:44 localhost systemd[1]: Starting Create List of Static Device Nodes...
Jan 26 07:47:44 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 26 07:47:44 localhost systemd[1]: Starting Load Kernel Module drm...
Jan 26 07:47:44 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Jan 26 07:47:44 localhost systemd[1]: Starting Load Kernel Module fuse...
Jan 26 07:47:44 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Jan 26 07:47:44 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Jan 26 07:47:44 localhost systemd[1]: Stopped File System Check on Root Device.
Jan 26 07:47:44 localhost systemd[1]: Stopped Journal Service.
Jan 26 07:47:44 localhost kernel: fuse: init (API version 7.37)
Jan 26 07:47:44 localhost systemd[1]: Starting Journal Service...
Jan 26 07:47:44 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 26 07:47:44 localhost systemd[1]: Starting Generate network units from Kernel command line...
Jan 26 07:47:44 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 26 07:47:44 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Jan 26 07:47:44 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 26 07:47:44 localhost systemd[1]: Starting Apply Kernel Variables...
Jan 26 07:47:44 localhost systemd[1]: Starting Coldplug All udev Devices...
Jan 26 07:47:44 localhost kernel: ACPI: bus type drm_connector registered
Jan 26 07:47:44 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Jan 26 07:47:44 localhost systemd[1]: Mounted Huge Pages File System.
Jan 26 07:47:44 localhost systemd[1]: Mounted POSIX Message Queue File System.
Jan 26 07:47:44 localhost systemd[1]: Mounted Kernel Debug File System.
Jan 26 07:47:44 localhost systemd[1]: Mounted Kernel Trace File System.
Jan 26 07:47:44 localhost systemd[1]: Finished Create List of Static Device Nodes.
Jan 26 07:47:44 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 26 07:47:44 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 26 07:47:44 localhost systemd-journald[682]: Journal started
Jan 26 07:47:44 localhost systemd-journald[682]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 314.6M, 306.6M free.
Jan 26 07:47:44 localhost systemd[1]: Queued start job for default target Multi-User System.
Jan 26 07:47:44 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 26 07:47:44 localhost systemd[1]: Started Journal Service.
Jan 26 07:47:44 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 26 07:47:44 localhost systemd[1]: Finished Load Kernel Module drm.
Jan 26 07:47:44 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 26 07:47:44 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Jan 26 07:47:44 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 26 07:47:44 localhost systemd[1]: Finished Load Kernel Module fuse.
Jan 26 07:47:44 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Jan 26 07:47:44 localhost systemd[1]: Finished Generate network units from Kernel command line.
Jan 26 07:47:44 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Jan 26 07:47:44 localhost systemd[1]: Finished Apply Kernel Variables.
Jan 26 07:47:44 localhost systemd[1]: Mounting FUSE Control File System...
Jan 26 07:47:44 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 26 07:47:44 localhost systemd[1]: Starting Rebuild Hardware Database...
Jan 26 07:47:44 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Jan 26 07:47:44 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 26 07:47:44 localhost systemd[1]: Starting Load/Save OS Random Seed...
Jan 26 07:47:44 localhost systemd[1]: Starting Create System Users...
Jan 26 07:47:44 localhost systemd-journald[682]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 314.6M, 306.6M free.
Jan 26 07:47:44 localhost systemd-journald[682]: Received client request to flush runtime journal.
Jan 26 07:47:44 localhost systemd[1]: Mounted FUSE Control File System.
Jan 26 07:47:44 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Jan 26 07:47:44 localhost systemd[1]: Finished Load/Save OS Random Seed.
Jan 26 07:47:44 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 26 07:47:44 localhost systemd[1]: Finished Create System Users.
Jan 26 07:47:44 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 26 07:47:44 localhost systemd[1]: Finished Coldplug All udev Devices.
Jan 26 07:47:44 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 26 07:47:44 localhost systemd[1]: Reached target Preparation for Local File Systems.
Jan 26 07:47:44 localhost systemd[1]: Reached target Local File Systems.
Jan 26 07:47:44 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Jan 26 07:47:44 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Jan 26 07:47:44 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 26 07:47:44 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Jan 26 07:47:44 localhost systemd[1]: Starting Automatic Boot Loader Update...
Jan 26 07:47:44 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Jan 26 07:47:44 localhost systemd[1]: Starting Create Volatile Files and Directories...
Jan 26 07:47:44 localhost bootctl[699]: Couldn't find EFI system partition, skipping.
Jan 26 07:47:44 localhost systemd[1]: Finished Automatic Boot Loader Update.
Jan 26 07:47:44 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Jan 26 07:47:44 localhost systemd[1]: Finished Create Volatile Files and Directories.
Jan 26 07:47:44 localhost systemd[1]: Starting Security Auditing Service...
Jan 26 07:47:44 localhost systemd[1]: Starting RPC Bind...
Jan 26 07:47:44 localhost systemd[1]: Starting Rebuild Journal Catalog...
Jan 26 07:47:44 localhost auditd[705]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Jan 26 07:47:44 localhost auditd[705]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Jan 26 07:47:44 localhost systemd[1]: Finished Rebuild Journal Catalog.
Jan 26 07:47:44 localhost augenrules[710]: /sbin/augenrules: No change
Jan 26 07:47:44 localhost systemd[1]: Started RPC Bind.
Jan 26 07:47:44 localhost augenrules[725]: No rules
Jan 26 07:47:44 localhost augenrules[725]: enabled 1
Jan 26 07:47:44 localhost augenrules[725]: failure 1
Jan 26 07:47:44 localhost augenrules[725]: pid 705
Jan 26 07:47:44 localhost augenrules[725]: rate_limit 0
Jan 26 07:47:44 localhost augenrules[725]: backlog_limit 8192
Jan 26 07:47:44 localhost augenrules[725]: lost 0
Jan 26 07:47:44 localhost augenrules[725]: backlog 0
Jan 26 07:47:44 localhost augenrules[725]: backlog_wait_time 60000
Jan 26 07:47:44 localhost augenrules[725]: backlog_wait_time_actual 0
Jan 26 07:47:44 localhost systemd[1]: Started Security Auditing Service.
Jan 26 07:47:44 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Jan 26 07:47:44 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Jan 26 07:47:45 localhost systemd[1]: Finished Rebuild Hardware Database.
Jan 26 07:47:45 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 26 07:47:45 localhost systemd[1]: Starting Update is Completed...
Jan 26 07:47:45 localhost systemd[1]: Finished Update is Completed.
Jan 26 07:47:45 localhost systemd-udevd[733]: Using default interface naming scheme 'rhel-9.0'.
Jan 26 07:47:45 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 26 07:47:45 localhost systemd[1]: Reached target System Initialization.
Jan 26 07:47:45 localhost systemd[1]: Started dnf makecache --timer.
Jan 26 07:47:45 localhost systemd[1]: Started Daily rotation of log files.
Jan 26 07:47:45 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Jan 26 07:47:45 localhost systemd[1]: Reached target Timer Units.
Jan 26 07:47:45 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 26 07:47:45 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Jan 26 07:47:45 localhost systemd[1]: Reached target Socket Units.
Jan 26 07:47:45 localhost systemd[1]: Starting D-Bus System Message Bus...
Jan 26 07:47:45 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 26 07:47:45 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Jan 26 07:47:45 localhost systemd-udevd[736]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 07:47:45 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 26 07:47:45 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 26 07:47:45 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 26 07:47:45 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Jan 26 07:47:45 localhost systemd[1]: Started D-Bus System Message Bus.
Jan 26 07:47:45 localhost systemd[1]: Reached target Basic System.
Jan 26 07:47:45 localhost dbus-broker-lau[767]: Ready
Jan 26 07:47:45 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Jan 26 07:47:45 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Jan 26 07:47:45 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Jan 26 07:47:45 localhost systemd[1]: Starting NTP client/server...
Jan 26 07:47:45 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Jan 26 07:47:45 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Jan 26 07:47:45 localhost systemd[1]: Starting IPv4 firewall with iptables...
Jan 26 07:47:45 localhost systemd[1]: Started irqbalance daemon.
Jan 26 07:47:45 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Jan 26 07:47:45 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 07:47:45 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 07:47:45 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 07:47:45 localhost systemd[1]: Reached target sshd-keygen.target.
Jan 26 07:47:45 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Jan 26 07:47:45 localhost systemd[1]: Reached target User and Group Name Lookups.
Jan 26 07:47:45 localhost systemd[1]: Starting User Login Management...
Jan 26 07:47:45 localhost chronyd[790]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 26 07:47:45 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Jan 26 07:47:45 localhost chronyd[790]: Loaded 0 symmetric keys
Jan 26 07:47:45 localhost chronyd[790]: Using right/UTC timezone to obtain leap second data
Jan 26 07:47:45 localhost chronyd[790]: Loaded seccomp filter (level 2)
Jan 26 07:47:45 localhost systemd[1]: Started NTP client/server.
Jan 26 07:47:45 localhost systemd-logind[788]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 26 07:47:45 localhost systemd-logind[788]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 26 07:47:45 localhost systemd-logind[788]: New seat seat0.
Jan 26 07:47:45 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Jan 26 07:47:45 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Jan 26 07:47:45 localhost kernel: Console: switching to colour dummy device 80x25
Jan 26 07:47:45 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Jan 26 07:47:45 localhost kernel: [drm] features: -context_init
Jan 26 07:47:45 localhost systemd[1]: Started User Login Management.
Jan 26 07:47:45 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Jan 26 07:47:45 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Jan 26 07:47:45 localhost kernel: [drm] number of scanouts: 1
Jan 26 07:47:45 localhost kernel: [drm] number of cap sets: 0
Jan 26 07:47:45 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Jan 26 07:47:45 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Jan 26 07:47:45 localhost kernel: Console: switching to colour frame buffer device 128x48
Jan 26 07:47:45 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Jan 26 07:47:45 localhost kernel: kvm_amd: TSC scaling supported
Jan 26 07:47:45 localhost kernel: kvm_amd: Nested Virtualization enabled
Jan 26 07:47:45 localhost kernel: kvm_amd: Nested Paging enabled
Jan 26 07:47:45 localhost kernel: kvm_amd: LBR virtualization supported
Jan 26 07:47:45 localhost iptables.init[781]: iptables: Applying firewall rules: [  OK  ]
Jan 26 07:47:45 localhost systemd[1]: Finished IPv4 firewall with iptables.
Jan 26 07:47:45 localhost cloud-init[841]: Cloud-init v. 24.4-8.el9 running 'init-local' at Mon, 26 Jan 2026 07:47:45 +0000. Up 6.48 seconds.
Jan 26 07:47:46 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Jan 26 07:47:46 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Jan 26 07:47:46 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpjkrvuvko.mount: Deactivated successfully.
Jan 26 07:47:46 localhost systemd[1]: Starting Hostname Service...
Jan 26 07:47:46 localhost systemd[1]: Started Hostname Service.
Jan 26 07:47:46 np0005595389.novalocal systemd-hostnamed[855]: Hostname set to <np0005595389.novalocal> (static)
Jan 26 07:47:46 np0005595389.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Jan 26 07:47:46 np0005595389.novalocal systemd[1]: Reached target Preparation for Network.
Jan 26 07:47:46 np0005595389.novalocal systemd[1]: Starting Network Manager...
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.3652] NetworkManager (version 1.54.3-2.el9) is starting... (boot:ac323890-fec5-4361-852a-4b7b8dc1d6fe)
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.3660] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.3786] manager[0x560d87e8d000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.3849] hostname: hostname: using hostnamed
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.3850] hostname: static hostname changed from (none) to "np0005595389.novalocal"
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.3861] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.3975] manager[0x560d87e8d000]: rfkill: Wi-Fi hardware radio set enabled
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.3975] manager[0x560d87e8d000]: rfkill: WWAN hardware radio set enabled
Jan 26 07:47:46 np0005595389.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.4049] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.4049] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.4051] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.4052] manager: Networking is enabled by state file
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.4059] settings: Loaded settings plugin: keyfile (internal)
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.4086] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.4123] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.4142] dhcp: init: Using DHCP client 'internal'
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.4147] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.4174] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.4188] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.4204] device (lo): Activation: starting connection 'lo' (b8f0ada8-b516-476a-9a84-27149369f44b)
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.4222] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.4228] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.4275] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.4283] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 26 07:47:46 np0005595389.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.4287] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.4292] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.4295] device (eth0): carrier: link connected
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.4302] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.4314] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.4325] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.4332] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.4333] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 07:47:46 np0005595389.novalocal systemd[1]: Started Network Manager.
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.4337] manager: NetworkManager state is now CONNECTING
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.4342] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.4359] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.4365] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 07:47:46 np0005595389.novalocal systemd[1]: Reached target Network.
Jan 26 07:47:46 np0005595389.novalocal systemd[1]: Starting Network Manager Wait Online...
Jan 26 07:47:46 np0005595389.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.4535] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.4539] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 26 07:47:46 np0005595389.novalocal NetworkManager[859]: <info>  [1769413666.4549] device (lo): Activation: successful, device activated.
Jan 26 07:47:46 np0005595389.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 07:47:46 np0005595389.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Jan 26 07:47:46 np0005595389.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 26 07:47:46 np0005595389.novalocal systemd[1]: Reached target NFS client services.
Jan 26 07:47:46 np0005595389.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Jan 26 07:47:46 np0005595389.novalocal systemd[1]: Reached target Remote File Systems.
Jan 26 07:47:46 np0005595389.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 26 07:47:48 np0005595389.novalocal NetworkManager[859]: <info>  [1769413668.1233] dhcp4 (eth0): state changed new lease, address=38.102.83.74
Jan 26 07:47:48 np0005595389.novalocal NetworkManager[859]: <info>  [1769413668.1253] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 26 07:47:48 np0005595389.novalocal NetworkManager[859]: <info>  [1769413668.1292] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 07:47:48 np0005595389.novalocal NetworkManager[859]: <info>  [1769413668.1332] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 07:47:48 np0005595389.novalocal NetworkManager[859]: <info>  [1769413668.1335] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 07:47:48 np0005595389.novalocal NetworkManager[859]: <info>  [1769413668.1339] manager: NetworkManager state is now CONNECTED_SITE
Jan 26 07:47:48 np0005595389.novalocal NetworkManager[859]: <info>  [1769413668.1344] device (eth0): Activation: successful, device activated.
Jan 26 07:47:48 np0005595389.novalocal NetworkManager[859]: <info>  [1769413668.1353] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 26 07:47:48 np0005595389.novalocal NetworkManager[859]: <info>  [1769413668.1358] manager: startup complete
Jan 26 07:47:48 np0005595389.novalocal systemd[1]: Finished Network Manager Wait Online.
Jan 26 07:47:48 np0005595389.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Jan 26 07:47:48 np0005595389.novalocal cloud-init[922]: Cloud-init v. 24.4-8.el9 running 'init' at Mon, 26 Jan 2026 07:47:48 +0000. Up 9.15 seconds.
Jan 26 07:47:48 np0005595389.novalocal cloud-init[922]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Jan 26 07:47:48 np0005595389.novalocal cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 26 07:47:48 np0005595389.novalocal cloud-init[922]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Jan 26 07:47:48 np0005595389.novalocal cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 26 07:47:48 np0005595389.novalocal cloud-init[922]: ci-info: |  eth0  | True |         38.102.83.74         | 255.255.255.0 | global | fa:16:3e:fc:69:df |
Jan 26 07:47:48 np0005595389.novalocal cloud-init[922]: ci-info: |  eth0  | True | fe80::f816:3eff:fefc:69df/64 |       .       |  link  | fa:16:3e:fc:69:df |
Jan 26 07:47:48 np0005595389.novalocal cloud-init[922]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Jan 26 07:47:48 np0005595389.novalocal cloud-init[922]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Jan 26 07:47:48 np0005595389.novalocal cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 26 07:47:48 np0005595389.novalocal cloud-init[922]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Jan 26 07:47:48 np0005595389.novalocal cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 26 07:47:48 np0005595389.novalocal cloud-init[922]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Jan 26 07:47:48 np0005595389.novalocal cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 26 07:47:48 np0005595389.novalocal cloud-init[922]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Jan 26 07:47:48 np0005595389.novalocal cloud-init[922]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Jan 26 07:47:48 np0005595389.novalocal cloud-init[922]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Jan 26 07:47:48 np0005595389.novalocal cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 26 07:47:48 np0005595389.novalocal cloud-init[922]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Jan 26 07:47:48 np0005595389.novalocal cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 26 07:47:48 np0005595389.novalocal cloud-init[922]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Jan 26 07:47:48 np0005595389.novalocal cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 26 07:47:48 np0005595389.novalocal cloud-init[922]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Jan 26 07:47:48 np0005595389.novalocal cloud-init[922]: ci-info: |   3   |    local    |    ::   |    eth0   |   U   |
Jan 26 07:47:48 np0005595389.novalocal cloud-init[922]: ci-info: |   4   |  multicast  |    ::   |    eth0   |   U   |
Jan 26 07:47:48 np0005595389.novalocal cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 26 07:47:49 np0005595389.novalocal useradd[989]: new group: name=cloud-user, GID=1001
Jan 26 07:47:49 np0005595389.novalocal useradd[989]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Jan 26 07:47:49 np0005595389.novalocal useradd[989]: add 'cloud-user' to group 'adm'
Jan 26 07:47:49 np0005595389.novalocal useradd[989]: add 'cloud-user' to group 'systemd-journal'
Jan 26 07:47:49 np0005595389.novalocal useradd[989]: add 'cloud-user' to shadow group 'adm'
Jan 26 07:47:49 np0005595389.novalocal useradd[989]: add 'cloud-user' to shadow group 'systemd-journal'
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: Generating public/private rsa key pair.
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: The key fingerprint is:
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: SHA256:rchLnkN04bcFI2rbm3OoMJO6M0MQetnpgupw5jeEXqI root@np0005595389.novalocal
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: The key's randomart image is:
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: +---[RSA 3072]----+
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: |                 |
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: |.       o o      |
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: |.. o . o o o     |
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: |o o o + o.. .    |
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: | + o o +S..o     |
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: |. = +oo....      |
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: |o+o==.+ .+       |
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: |E+=.o*.o= .      |
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: |..+* .=o o       |
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: +----[SHA256]-----+
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: Generating public/private ecdsa key pair.
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: The key fingerprint is:
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: SHA256:5rCxs+in0vL6V1sh2ZKe34atogPBLcyN1jHrbC4SUhQ root@np0005595389.novalocal
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: The key's randomart image is:
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: +---[ECDSA 256]---+
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: |  E.             |
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: |  .   o          |
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: | . + = + +       |
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: |  . O = = o      |
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: | . . =o.S+ .     |
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: |. . . +B+ .      |
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: | . o ++..+ +     |
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: |  + o.=oo o +    |
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: |  .O*=oo ..o     |
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: +----[SHA256]-----+
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: Generating public/private ed25519 key pair.
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: The key fingerprint is:
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: SHA256:YINlmu6smExT9pbtzBQoWTRsK4fsvkTrQs5efuZ6Ssw root@np0005595389.novalocal
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: The key's randomart image is:
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: +--[ED25519 256]--+
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: |    .oo          |
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: |    .O.          |
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: |  . *.=          |
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: |   =o+.o         |
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: |  .*+. .S        |
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: | .*++ o .        |
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: |+o.Eo+ o         |
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: |o*Bo.+=          |
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: |+oo*O. +         |
Jan 26 07:47:50 np0005595389.novalocal cloud-init[922]: +----[SHA256]-----+
Jan 26 07:47:50 np0005595389.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Jan 26 07:47:50 np0005595389.novalocal systemd[1]: Reached target Cloud-config availability.
Jan 26 07:47:50 np0005595389.novalocal systemd[1]: Reached target Network is Online.
Jan 26 07:47:50 np0005595389.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Jan 26 07:47:50 np0005595389.novalocal systemd[1]: Starting Crash recovery kernel arming...
Jan 26 07:47:50 np0005595389.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Jan 26 07:47:50 np0005595389.novalocal systemd[1]: Starting System Logging Service...
Jan 26 07:47:50 np0005595389.novalocal sm-notify[1005]: Version 2.5.4 starting
Jan 26 07:47:50 np0005595389.novalocal systemd[1]: Starting OpenSSH server daemon...
Jan 26 07:47:50 np0005595389.novalocal systemd[1]: Starting Permit User Sessions...
Jan 26 07:47:50 np0005595389.novalocal systemd[1]: Started Notify NFS peers of a restart.
Jan 26 07:47:50 np0005595389.novalocal sshd[1007]: Server listening on 0.0.0.0 port 22.
Jan 26 07:47:50 np0005595389.novalocal sshd[1007]: Server listening on :: port 22.
Jan 26 07:47:50 np0005595389.novalocal systemd[1]: Started OpenSSH server daemon.
Jan 26 07:47:50 np0005595389.novalocal systemd[1]: Finished Permit User Sessions.
Jan 26 07:47:50 np0005595389.novalocal systemd[1]: Started Command Scheduler.
Jan 26 07:47:50 np0005595389.novalocal systemd[1]: Started Getty on tty1.
Jan 26 07:47:50 np0005595389.novalocal systemd[1]: Started Serial Getty on ttyS0.
Jan 26 07:47:50 np0005595389.novalocal crond[1010]: (CRON) STARTUP (1.5.7)
Jan 26 07:47:50 np0005595389.novalocal crond[1010]: (CRON) INFO (Syslog will be used instead of sendmail.)
Jan 26 07:47:50 np0005595389.novalocal crond[1010]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 25% if used.)
Jan 26 07:47:50 np0005595389.novalocal crond[1010]: (CRON) INFO (running with inotify support)
Jan 26 07:47:50 np0005595389.novalocal systemd[1]: Reached target Login Prompts.
Jan 26 07:47:50 np0005595389.novalocal rsyslogd[1006]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1006" x-info="https://www.rsyslog.com"] start
Jan 26 07:47:50 np0005595389.novalocal rsyslogd[1006]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Jan 26 07:47:50 np0005595389.novalocal systemd[1]: Started System Logging Service.
Jan 26 07:47:50 np0005595389.novalocal systemd[1]: Reached target Multi-User System.
Jan 26 07:47:50 np0005595389.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Jan 26 07:47:50 np0005595389.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Jan 26 07:47:50 np0005595389.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Jan 26 07:47:50 np0005595389.novalocal rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 07:47:50 np0005595389.novalocal kdumpctl[1015]: kdump: No kdump initial ramdisk found.
Jan 26 07:47:50 np0005595389.novalocal kdumpctl[1015]: kdump: Rebuilding /boot/initramfs-5.14.0-661.el9.x86_64kdump.img
Jan 26 07:47:50 np0005595389.novalocal cloud-init[1134]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Mon, 26 Jan 2026 07:47:50 +0000. Up 11.30 seconds.
Jan 26 07:47:50 np0005595389.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Jan 26 07:47:50 np0005595389.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Jan 26 07:47:51 np0005595389.novalocal dracut[1268]: dracut-057-102.git20250818.el9
Jan 26 07:47:51 np0005595389.novalocal cloud-init[1283]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Mon, 26 Jan 2026 07:47:51 +0000. Up 11.72 seconds.
Jan 26 07:47:51 np0005595389.novalocal cloud-init[1286]: #############################################################
Jan 26 07:47:51 np0005595389.novalocal cloud-init[1287]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Jan 26 07:47:51 np0005595389.novalocal cloud-init[1289]: 256 SHA256:5rCxs+in0vL6V1sh2ZKe34atogPBLcyN1jHrbC4SUhQ root@np0005595389.novalocal (ECDSA)
Jan 26 07:47:51 np0005595389.novalocal cloud-init[1291]: 256 SHA256:YINlmu6smExT9pbtzBQoWTRsK4fsvkTrQs5efuZ6Ssw root@np0005595389.novalocal (ED25519)
Jan 26 07:47:51 np0005595389.novalocal cloud-init[1293]: 3072 SHA256:rchLnkN04bcFI2rbm3OoMJO6M0MQetnpgupw5jeEXqI root@np0005595389.novalocal (RSA)
Jan 26 07:47:51 np0005595389.novalocal cloud-init[1294]: -----END SSH HOST KEY FINGERPRINTS-----
Jan 26 07:47:51 np0005595389.novalocal cloud-init[1295]: #############################################################
Jan 26 07:47:51 np0005595389.novalocal cloud-init[1283]: Cloud-init v. 24.4-8.el9 finished at Mon, 26 Jan 2026 07:47:51 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 11.94 seconds
Jan 26 07:47:51 np0005595389.novalocal dracut[1270]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-661.el9.x86_64kdump.img 5.14.0-661.el9.x86_64
Jan 26 07:47:51 np0005595389.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Jan 26 07:47:51 np0005595389.novalocal systemd[1]: Reached target Cloud-init target.
Jan 26 07:47:51 np0005595389.novalocal sshd-session[1334]: Unable to negotiate with 38.102.83.114 port 41608: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Jan 26 07:47:51 np0005595389.novalocal sshd-session[1354]: Unable to negotiate with 38.102.83.114 port 41614: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Jan 26 07:47:51 np0005595389.novalocal sshd-session[1359]: Unable to negotiate with 38.102.83.114 port 41620: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Jan 26 07:47:51 np0005595389.novalocal sshd-session[1322]: Connection closed by 38.102.83.114 port 41600 [preauth]
Jan 26 07:47:51 np0005595389.novalocal sshd-session[1361]: Connection closed by 38.102.83.114 port 41624 [preauth]
Jan 26 07:47:51 np0005595389.novalocal sshd-session[1346]: Connection closed by 38.102.83.114 port 41610 [preauth]
Jan 26 07:47:51 np0005595389.novalocal sshd-session[1371]: Unable to negotiate with 38.102.83.114 port 41638: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Jan 26 07:47:51 np0005595389.novalocal sshd-session[1378]: Unable to negotiate with 38.102.83.114 port 41654: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Jan 26 07:47:51 np0005595389.novalocal sshd-session[1368]: Connection closed by 38.102.83.114 port 41626 [preauth]
Jan 26 07:47:51 np0005595389.novalocal dracut[1270]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Jan 26 07:47:51 np0005595389.novalocal dracut[1270]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Jan 26 07:47:51 np0005595389.novalocal dracut[1270]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Jan 26 07:47:51 np0005595389.novalocal dracut[1270]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 26 07:47:51 np0005595389.novalocal dracut[1270]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 26 07:47:51 np0005595389.novalocal dracut[1270]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 26 07:47:51 np0005595389.novalocal dracut[1270]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 26 07:47:51 np0005595389.novalocal dracut[1270]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 26 07:47:51 np0005595389.novalocal dracut[1270]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 26 07:47:51 np0005595389.novalocal dracut[1270]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 26 07:47:51 np0005595389.novalocal dracut[1270]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 26 07:47:51 np0005595389.novalocal dracut[1270]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 26 07:47:51 np0005595389.novalocal dracut[1270]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 26 07:47:51 np0005595389.novalocal dracut[1270]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 26 07:47:51 np0005595389.novalocal dracut[1270]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: Module 'resume' will not be installed, because it's in the list to be omitted!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: memstrack is not available
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: memstrack is not available
Jan 26 07:47:52 np0005595389.novalocal dracut[1270]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 26 07:47:53 np0005595389.novalocal chronyd[790]: Selected source 167.160.187.12 (2.centos.pool.ntp.org)
Jan 26 07:47:53 np0005595389.novalocal chronyd[790]: System clock TAI offset set to 37 seconds
Jan 26 07:47:53 np0005595389.novalocal dracut[1270]: *** Including module: systemd ***
Jan 26 07:47:53 np0005595389.novalocal dracut[1270]: *** Including module: fips ***
Jan 26 07:47:53 np0005595389.novalocal dracut[1270]: *** Including module: systemd-initrd ***
Jan 26 07:47:53 np0005595389.novalocal dracut[1270]: *** Including module: i18n ***
Jan 26 07:47:54 np0005595389.novalocal dracut[1270]: *** Including module: drm ***
Jan 26 07:47:54 np0005595389.novalocal dracut[1270]: *** Including module: prefixdevname ***
Jan 26 07:47:54 np0005595389.novalocal dracut[1270]: *** Including module: kernel-modules ***
Jan 26 07:47:54 np0005595389.novalocal kernel: block vda: the capability attribute has been deprecated.
Jan 26 07:47:55 np0005595389.novalocal irqbalance[785]: Cannot change IRQ 25 affinity: Operation not permitted
Jan 26 07:47:55 np0005595389.novalocal irqbalance[785]: IRQ 25 affinity is now unmanaged
Jan 26 07:47:55 np0005595389.novalocal dracut[1270]: *** Including module: kernel-modules-extra ***
Jan 26 07:47:55 np0005595389.novalocal irqbalance[785]: Cannot change IRQ 31 affinity: Operation not permitted
Jan 26 07:47:55 np0005595389.novalocal irqbalance[785]: IRQ 31 affinity is now unmanaged
Jan 26 07:47:55 np0005595389.novalocal irqbalance[785]: Cannot change IRQ 28 affinity: Operation not permitted
Jan 26 07:47:55 np0005595389.novalocal irqbalance[785]: IRQ 28 affinity is now unmanaged
Jan 26 07:47:55 np0005595389.novalocal irqbalance[785]: Cannot change IRQ 32 affinity: Operation not permitted
Jan 26 07:47:55 np0005595389.novalocal irqbalance[785]: IRQ 32 affinity is now unmanaged
Jan 26 07:47:55 np0005595389.novalocal irqbalance[785]: Cannot change IRQ 30 affinity: Operation not permitted
Jan 26 07:47:55 np0005595389.novalocal irqbalance[785]: IRQ 30 affinity is now unmanaged
Jan 26 07:47:55 np0005595389.novalocal irqbalance[785]: Cannot change IRQ 29 affinity: Operation not permitted
Jan 26 07:47:55 np0005595389.novalocal irqbalance[785]: IRQ 29 affinity is now unmanaged
Jan 26 07:47:55 np0005595389.novalocal dracut[1270]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Jan 26 07:47:55 np0005595389.novalocal dracut[1270]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Jan 26 07:47:55 np0005595389.novalocal dracut[1270]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Jan 26 07:47:55 np0005595389.novalocal dracut[1270]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Jan 26 07:47:55 np0005595389.novalocal dracut[1270]: *** Including module: qemu ***
Jan 26 07:47:55 np0005595389.novalocal dracut[1270]: *** Including module: fstab-sys ***
Jan 26 07:47:55 np0005595389.novalocal dracut[1270]: *** Including module: rootfs-block ***
Jan 26 07:47:55 np0005595389.novalocal dracut[1270]: *** Including module: terminfo ***
Jan 26 07:47:55 np0005595389.novalocal dracut[1270]: *** Including module: udev-rules ***
Jan 26 07:47:56 np0005595389.novalocal dracut[1270]: Skipping udev rule: 91-permissions.rules
Jan 26 07:47:56 np0005595389.novalocal dracut[1270]: Skipping udev rule: 80-drivers-modprobe.rules
Jan 26 07:47:56 np0005595389.novalocal dracut[1270]: *** Including module: virtiofs ***
Jan 26 07:47:56 np0005595389.novalocal dracut[1270]: *** Including module: dracut-systemd ***
Jan 26 07:47:56 np0005595389.novalocal dracut[1270]: *** Including module: usrmount ***
Jan 26 07:47:56 np0005595389.novalocal dracut[1270]: *** Including module: base ***
Jan 26 07:47:56 np0005595389.novalocal dracut[1270]: *** Including module: fs-lib ***
Jan 26 07:47:56 np0005595389.novalocal dracut[1270]: *** Including module: kdumpbase ***
Jan 26 07:47:57 np0005595389.novalocal dracut[1270]: *** Including module: microcode_ctl-fw_dir_override ***
Jan 26 07:47:57 np0005595389.novalocal dracut[1270]:   microcode_ctl module: mangling fw_dir
Jan 26 07:47:57 np0005595389.novalocal dracut[1270]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Jan 26 07:47:57 np0005595389.novalocal dracut[1270]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Jan 26 07:47:57 np0005595389.novalocal dracut[1270]:     microcode_ctl: configuration "intel" is ignored
Jan 26 07:47:57 np0005595389.novalocal dracut[1270]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Jan 26 07:47:57 np0005595389.novalocal dracut[1270]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Jan 26 07:47:57 np0005595389.novalocal dracut[1270]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Jan 26 07:47:57 np0005595389.novalocal dracut[1270]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Jan 26 07:47:57 np0005595389.novalocal dracut[1270]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Jan 26 07:47:57 np0005595389.novalocal dracut[1270]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Jan 26 07:47:57 np0005595389.novalocal dracut[1270]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Jan 26 07:47:57 np0005595389.novalocal dracut[1270]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Jan 26 07:47:57 np0005595389.novalocal dracut[1270]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Jan 26 07:47:57 np0005595389.novalocal dracut[1270]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Jan 26 07:47:57 np0005595389.novalocal dracut[1270]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Jan 26 07:47:57 np0005595389.novalocal dracut[1270]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Jan 26 07:47:57 np0005595389.novalocal dracut[1270]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Jan 26 07:47:57 np0005595389.novalocal dracut[1270]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Jan 26 07:47:57 np0005595389.novalocal dracut[1270]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Jan 26 07:47:57 np0005595389.novalocal dracut[1270]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Jan 26 07:47:57 np0005595389.novalocal dracut[1270]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Jan 26 07:47:57 np0005595389.novalocal dracut[1270]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Jan 26 07:47:57 np0005595389.novalocal dracut[1270]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Jan 26 07:47:58 np0005595389.novalocal dracut[1270]: *** Including module: openssl ***
Jan 26 07:47:58 np0005595389.novalocal dracut[1270]: *** Including module: shutdown ***
Jan 26 07:47:58 np0005595389.novalocal dracut[1270]: *** Including module: squash ***
Jan 26 07:47:58 np0005595389.novalocal dracut[1270]: *** Including modules done ***
Jan 26 07:47:58 np0005595389.novalocal dracut[1270]: *** Installing kernel module dependencies ***
Jan 26 07:47:58 np0005595389.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 07:47:59 np0005595389.novalocal dracut[1270]: *** Installing kernel module dependencies done ***
Jan 26 07:47:59 np0005595389.novalocal dracut[1270]: *** Resolving executable dependencies ***
Jan 26 07:48:00 np0005595389.novalocal dracut[1270]: *** Resolving executable dependencies done ***
Jan 26 07:48:00 np0005595389.novalocal dracut[1270]: *** Generating early-microcode cpio image ***
Jan 26 07:48:00 np0005595389.novalocal dracut[1270]: *** Store current command line parameters ***
Jan 26 07:48:00 np0005595389.novalocal dracut[1270]: Stored kernel commandline:
Jan 26 07:48:00 np0005595389.novalocal dracut[1270]: No dracut internal kernel commandline stored in the initramfs
Jan 26 07:48:00 np0005595389.novalocal dracut[1270]: *** Install squash loader ***
Jan 26 07:48:01 np0005595389.novalocal dracut[1270]: *** Squashing the files inside the initramfs ***
Jan 26 07:48:02 np0005595389.novalocal dracut[1270]: *** Squashing the files inside the initramfs done ***
Jan 26 07:48:02 np0005595389.novalocal dracut[1270]: *** Creating image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' ***
Jan 26 07:48:03 np0005595389.novalocal dracut[1270]: *** Hardlinking files ***
Jan 26 07:48:03 np0005595389.novalocal dracut[1270]: Mode:           real
Jan 26 07:48:03 np0005595389.novalocal dracut[1270]: Files:          50
Jan 26 07:48:03 np0005595389.novalocal dracut[1270]: Linked:         0 files
Jan 26 07:48:03 np0005595389.novalocal dracut[1270]: Compared:       0 xattrs
Jan 26 07:48:03 np0005595389.novalocal dracut[1270]: Compared:       0 files
Jan 26 07:48:03 np0005595389.novalocal dracut[1270]: Saved:          0 B
Jan 26 07:48:03 np0005595389.novalocal dracut[1270]: Duration:       0.000521 seconds
Jan 26 07:48:03 np0005595389.novalocal dracut[1270]: *** Hardlinking files done ***
Jan 26 07:48:03 np0005595389.novalocal dracut[1270]: *** Creating initramfs image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' done ***
Jan 26 07:48:03 np0005595389.novalocal kdumpctl[1015]: kdump: kexec: loaded kdump kernel
Jan 26 07:48:03 np0005595389.novalocal kdumpctl[1015]: kdump: Starting kdump: [OK]
Jan 26 07:48:03 np0005595389.novalocal systemd[1]: Finished Crash recovery kernel arming.
Jan 26 07:48:03 np0005595389.novalocal systemd[1]: Startup finished in 1.609s (kernel) + 2.668s (initrd) + 20.362s (userspace) = 24.639s.
Jan 26 07:48:10 np0005595389.novalocal sshd-session[4303]: Accepted publickey for zuul from 38.102.83.114 port 35498 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Jan 26 07:48:10 np0005595389.novalocal systemd[1]: Created slice User Slice of UID 1000.
Jan 26 07:48:10 np0005595389.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Jan 26 07:48:10 np0005595389.novalocal systemd-logind[788]: New session 1 of user zuul.
Jan 26 07:48:10 np0005595389.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Jan 26 07:48:10 np0005595389.novalocal systemd[1]: Starting User Manager for UID 1000...
Jan 26 07:48:10 np0005595389.novalocal systemd[4308]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 07:48:11 np0005595389.novalocal systemd[4308]: Queued start job for default target Main User Target.
Jan 26 07:48:11 np0005595389.novalocal systemd[4308]: Created slice User Application Slice.
Jan 26 07:48:11 np0005595389.novalocal systemd[4308]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 26 07:48:11 np0005595389.novalocal systemd[4308]: Started Daily Cleanup of User's Temporary Directories.
Jan 26 07:48:11 np0005595389.novalocal systemd[4308]: Reached target Paths.
Jan 26 07:48:11 np0005595389.novalocal systemd[4308]: Reached target Timers.
Jan 26 07:48:11 np0005595389.novalocal systemd[4308]: Starting D-Bus User Message Bus Socket...
Jan 26 07:48:11 np0005595389.novalocal systemd[4308]: Starting Create User's Volatile Files and Directories...
Jan 26 07:48:11 np0005595389.novalocal systemd[4308]: Listening on D-Bus User Message Bus Socket.
Jan 26 07:48:11 np0005595389.novalocal systemd[4308]: Reached target Sockets.
Jan 26 07:48:11 np0005595389.novalocal systemd[4308]: Finished Create User's Volatile Files and Directories.
Jan 26 07:48:11 np0005595389.novalocal systemd[4308]: Reached target Basic System.
Jan 26 07:48:11 np0005595389.novalocal systemd[4308]: Reached target Main User Target.
Jan 26 07:48:11 np0005595389.novalocal systemd[4308]: Startup finished in 157ms.
Jan 26 07:48:11 np0005595389.novalocal systemd[1]: Started User Manager for UID 1000.
Jan 26 07:48:11 np0005595389.novalocal systemd[1]: Started Session 1 of User zuul.
Jan 26 07:48:11 np0005595389.novalocal sshd-session[4303]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 07:48:11 np0005595389.novalocal python3[4391]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 07:48:16 np0005595389.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 26 07:48:53 np0005595389.novalocal python3[4421]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 07:48:59 np0005595389.novalocal python3[4479]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 07:49:08 np0005595389.novalocal python3[4519]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Jan 26 07:49:10 np0005595389.novalocal python3[4545]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCjD3RNZcDlfgsHphh91DiH8AHWeKOvhPtPRZCKMtcYGm9lQe0Gp3kcpOQ2LMUMqJPeB4f2UvkPN07/E6b3aq9eH0wu0XFQ0H2VB9kSLlrQajPYM2b9s+ho4XhWrPVqz2i+32CrsMKRpyTluk/VK3OqO10gGioITVIv1Ydq5rrB2vDnG8jWisLvARngdvxhRIqElV870cvhpTnqKgkHkXUioIhJmpO35kYJQdnZwuTP3fZclbx2QRn25ZMXAvo6pqLu28mVgIj6QGVMuZ2zBxDcQEzTTuprRCM2Jcqo86k6C21387T0TEuP28QBJSar6aiP0nKeuwbFZHIP0nsKWUTfWxiPYtj71/JO+DFKngjbk+YjcGlBAn7c4lX8B01nbFVoWR3PjIN3P2AuvPMnFthWa1Zfe0/vHMlfZmD/+QMutgNg7wdUyjlrof1ckQjb07yAWxVvMzbLvZ7WLqyafxP82BYTTciydauAbmFpnFhRDxrGCrKLFjAYECdnG8FFFfU= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 07:49:10 np0005595389.novalocal python3[4569]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 07:49:11 np0005595389.novalocal python3[4668]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 07:49:11 np0005595389.novalocal python3[4739]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769413750.9036257-230-105063185819232/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=bbcbaf2713494f84a0c268a634b27182_id_rsa follow=False checksum=a284ff9abc4c5198dc7309d03946a619bc8fded0 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 07:49:12 np0005595389.novalocal python3[4862]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 07:49:12 np0005595389.novalocal python3[4933]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769413751.9817908-274-117429885517517/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=bbcbaf2713494f84a0c268a634b27182_id_rsa.pub follow=False checksum=95ab566be4a735de6aeb5c5c270c74913aa5b268 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 07:49:13 np0005595389.novalocal python3[4981]: ansible-ping Invoked with data=pong
Jan 26 07:49:14 np0005595389.novalocal python3[5005]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 07:49:16 np0005595389.novalocal python3[5063]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Jan 26 07:49:18 np0005595389.novalocal python3[5095]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 07:49:18 np0005595389.novalocal python3[5119]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 07:49:18 np0005595389.novalocal python3[5143]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 07:49:19 np0005595389.novalocal python3[5167]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 07:49:19 np0005595389.novalocal python3[5191]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 07:49:19 np0005595389.novalocal python3[5215]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 07:49:21 np0005595389.novalocal sudo[5239]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ismkiwkddajxuxcsdczeyaxyswidbjhl ; /usr/bin/python3'
Jan 26 07:49:21 np0005595389.novalocal sudo[5239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:49:21 np0005595389.novalocal python3[5241]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 07:49:21 np0005595389.novalocal sudo[5239]: pam_unix(sudo:session): session closed for user root
Jan 26 07:49:21 np0005595389.novalocal sudo[5317]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drjkazrzvhpphhysbcnazciodbsoagrg ; /usr/bin/python3'
Jan 26 07:49:21 np0005595389.novalocal sudo[5317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:49:21 np0005595389.novalocal python3[5319]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 07:49:21 np0005595389.novalocal sudo[5317]: pam_unix(sudo:session): session closed for user root
Jan 26 07:49:22 np0005595389.novalocal sudo[5390]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frclineunwihadamkpzrytsjcnxfswle ; /usr/bin/python3'
Jan 26 07:49:22 np0005595389.novalocal sudo[5390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:49:22 np0005595389.novalocal python3[5392]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769413761.4618683-27-192237933663016/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 07:49:22 np0005595389.novalocal sudo[5390]: pam_unix(sudo:session): session closed for user root
Jan 26 07:49:23 np0005595389.novalocal python3[5440]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 07:49:23 np0005595389.novalocal python3[5464]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 07:49:23 np0005595389.novalocal python3[5488]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 07:49:23 np0005595389.novalocal python3[5512]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 07:49:24 np0005595389.novalocal python3[5536]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 07:49:24 np0005595389.novalocal python3[5560]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 07:49:24 np0005595389.novalocal python3[5584]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 07:49:24 np0005595389.novalocal python3[5608]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 07:49:25 np0005595389.novalocal python3[5632]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 07:49:25 np0005595389.novalocal python3[5656]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 07:49:25 np0005595389.novalocal python3[5680]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 07:49:26 np0005595389.novalocal python3[5704]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 07:49:26 np0005595389.novalocal python3[5728]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 07:49:26 np0005595389.novalocal python3[5752]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 07:49:27 np0005595389.novalocal python3[5776]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 07:49:27 np0005595389.novalocal python3[5800]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 07:49:27 np0005595389.novalocal python3[5824]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 07:49:27 np0005595389.novalocal python3[5848]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 07:49:28 np0005595389.novalocal python3[5872]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 07:49:28 np0005595389.novalocal python3[5896]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 07:49:28 np0005595389.novalocal python3[5920]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 07:49:29 np0005595389.novalocal python3[5944]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 07:49:29 np0005595389.novalocal python3[5968]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 07:49:29 np0005595389.novalocal python3[5992]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 07:49:29 np0005595389.novalocal python3[6016]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 07:49:30 np0005595389.novalocal python3[6040]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 07:49:32 np0005595389.novalocal sudo[6064]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkbzzerblfzjypvgfngyqbhcmgqicfyc ; /usr/bin/python3'
Jan 26 07:49:32 np0005595389.novalocal sudo[6064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:49:32 np0005595389.novalocal python3[6066]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 26 07:49:32 np0005595389.novalocal systemd[1]: Starting Time & Date Service...
Jan 26 07:49:32 np0005595389.novalocal systemd[1]: Started Time & Date Service.
Jan 26 07:49:32 np0005595389.novalocal systemd-timedated[6068]: Changed time zone to 'UTC' (UTC).
Jan 26 07:49:32 np0005595389.novalocal sudo[6064]: pam_unix(sudo:session): session closed for user root
Jan 26 07:49:33 np0005595389.novalocal sudo[6095]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmvpzdjlvcgpdagvwcywzmjuhojjwhaa ; /usr/bin/python3'
Jan 26 07:49:33 np0005595389.novalocal sudo[6095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:49:33 np0005595389.novalocal python3[6097]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 07:49:33 np0005595389.novalocal sudo[6095]: pam_unix(sudo:session): session closed for user root
Jan 26 07:49:33 np0005595389.novalocal python3[6173]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 07:49:34 np0005595389.novalocal python3[6244]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1769413773.4779918-203-266124208501741/source _original_basename=tmpojd6ddkr follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 07:49:34 np0005595389.novalocal python3[6344]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 07:49:34 np0005595389.novalocal python3[6415]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769413774.3820593-243-221259344351324/source _original_basename=tmplu53vpgm follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 07:49:35 np0005595389.novalocal sudo[6515]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxevouyelthgjfbawgowopsnvkhvkplc ; /usr/bin/python3'
Jan 26 07:49:35 np0005595389.novalocal sudo[6515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:49:35 np0005595389.novalocal python3[6517]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 07:49:35 np0005595389.novalocal sudo[6515]: pam_unix(sudo:session): session closed for user root
Jan 26 07:49:36 np0005595389.novalocal sudo[6588]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuqflxrhzyegdfjlakwiiwagzkfpckib ; /usr/bin/python3'
Jan 26 07:49:36 np0005595389.novalocal sudo[6588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:49:36 np0005595389.novalocal python3[6590]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769413775.6485453-307-49306735062895/source _original_basename=tmpuuowx5rt follow=False checksum=6c462e10cf6b935fb22f4386c31d576dcf4d4133 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 07:49:36 np0005595389.novalocal sudo[6588]: pam_unix(sudo:session): session closed for user root
Jan 26 07:49:36 np0005595389.novalocal python3[6638]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 07:49:37 np0005595389.novalocal python3[6664]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 07:49:37 np0005595389.novalocal sudo[6742]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txkxfteddrsgzvuljnkgyrdqtnwycjig ; /usr/bin/python3'
Jan 26 07:49:37 np0005595389.novalocal sudo[6742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:49:37 np0005595389.novalocal python3[6744]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 07:49:37 np0005595389.novalocal sudo[6742]: pam_unix(sudo:session): session closed for user root
Jan 26 07:49:38 np0005595389.novalocal sudo[6815]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpqzegiffsvqetmttkfkkdmblprsooom ; /usr/bin/python3'
Jan 26 07:49:38 np0005595389.novalocal sudo[6815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:49:38 np0005595389.novalocal python3[6817]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1769413777.4326975-363-280159394465191/source _original_basename=tmprwd4jbj5 follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 07:49:38 np0005595389.novalocal sudo[6815]: pam_unix(sudo:session): session closed for user root
Jan 26 07:49:38 np0005595389.novalocal sudo[6866]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owajwwmbnzzdjqwynrekovtocqqppiph ; /usr/bin/python3'
Jan 26 07:49:38 np0005595389.novalocal sudo[6866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:49:38 np0005595389.novalocal python3[6868]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-4264-ab01-00000000001e-1-compute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 07:49:38 np0005595389.novalocal sudo[6866]: pam_unix(sudo:session): session closed for user root
Jan 26 07:49:39 np0005595389.novalocal python3[6896]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-4264-ab01-00000000001f-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Jan 26 07:49:40 np0005595389.novalocal python3[6924]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 07:49:59 np0005595389.novalocal sudo[6948]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqfsnfwrgasrnhmymfpbyebmxzhbvlvz ; /usr/bin/python3'
Jan 26 07:49:59 np0005595389.novalocal sudo[6948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:49:59 np0005595389.novalocal python3[6950]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 07:49:59 np0005595389.novalocal sudo[6948]: pam_unix(sudo:session): session closed for user root
Jan 26 07:50:02 np0005595389.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 26 07:50:29 np0005595389.novalocal sshd-session[6953]: banner exchange: Connection from 65.49.20.67 port 27030: invalid format
Jan 26 07:50:57 np0005595389.novalocal systemd[4308]: Starting Mark boot as successful...
Jan 26 07:50:57 np0005595389.novalocal systemd[4308]: Finished Mark boot as successful.
Jan 26 07:50:59 np0005595389.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 26 07:50:59 np0005595389.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Jan 26 07:50:59 np0005595389.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Jan 26 07:50:59 np0005595389.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Jan 26 07:50:59 np0005595389.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Jan 26 07:50:59 np0005595389.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Jan 26 07:50:59 np0005595389.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x440000000-0x440003fff 64bit pref]: assigned
Jan 26 07:50:59 np0005595389.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Jan 26 07:50:59 np0005595389.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Jan 26 07:50:59 np0005595389.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Jan 26 07:50:59 np0005595389.novalocal NetworkManager[859]: <info>  [1769413859.4311] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 26 07:50:59 np0005595389.novalocal systemd-udevd[6955]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 07:50:59 np0005595389.novalocal NetworkManager[859]: <info>  [1769413859.4464] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 07:50:59 np0005595389.novalocal NetworkManager[859]: <info>  [1769413859.4511] settings: (eth1): created default wired connection 'Wired connection 1'
Jan 26 07:50:59 np0005595389.novalocal NetworkManager[859]: <info>  [1769413859.4519] device (eth1): carrier: link connected
Jan 26 07:50:59 np0005595389.novalocal NetworkManager[859]: <info>  [1769413859.4526] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 26 07:50:59 np0005595389.novalocal NetworkManager[859]: <info>  [1769413859.4540] policy: auto-activating connection 'Wired connection 1' (b992380d-d503-34bf-95a3-75c144beb661)
Jan 26 07:50:59 np0005595389.novalocal NetworkManager[859]: <info>  [1769413859.4549] device (eth1): Activation: starting connection 'Wired connection 1' (b992380d-d503-34bf-95a3-75c144beb661)
Jan 26 07:50:59 np0005595389.novalocal NetworkManager[859]: <info>  [1769413859.4552] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 07:50:59 np0005595389.novalocal NetworkManager[859]: <info>  [1769413859.4561] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 07:50:59 np0005595389.novalocal NetworkManager[859]: <info>  [1769413859.4572] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 07:50:59 np0005595389.novalocal NetworkManager[859]: <info>  [1769413859.4583] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 26 07:50:59 np0005595389.novalocal sshd-session[4318]: Received disconnect from 38.102.83.114 port 35498:11: disconnected by user
Jan 26 07:50:59 np0005595389.novalocal sshd-session[4318]: Disconnected from user zuul 38.102.83.114 port 35498
Jan 26 07:50:59 np0005595389.novalocal sshd-session[4303]: pam_unix(sshd:session): session closed for user zuul
Jan 26 07:50:59 np0005595389.novalocal systemd-logind[788]: Session 1 logged out. Waiting for processes to exit.
Jan 26 07:51:00 np0005595389.novalocal sshd-session[6959]: Accepted publickey for zuul from 38.102.83.114 port 35358 ssh2: RSA SHA256:5iqlXoVGZ9p23AimikXJoJK2JC1HSNh9KX8pshdVcYk
Jan 26 07:51:00 np0005595389.novalocal systemd-logind[788]: New session 3 of user zuul.
Jan 26 07:51:00 np0005595389.novalocal systemd[1]: Started Session 3 of User zuul.
Jan 26 07:51:00 np0005595389.novalocal sshd-session[6959]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 07:51:00 np0005595389.novalocal python3[6986]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-c96b-4623-000000000173-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 07:51:07 np0005595389.novalocal sudo[7064]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-taicoqwpufgbqgbeahfhpsoylkgolrcu ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 26 07:51:07 np0005595389.novalocal sudo[7064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:51:07 np0005595389.novalocal python3[7066]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 07:51:07 np0005595389.novalocal sudo[7064]: pam_unix(sudo:session): session closed for user root
Jan 26 07:51:07 np0005595389.novalocal sudo[7137]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhjrnlrqwjhftvzltakhzbibcupbswin ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 26 07:51:07 np0005595389.novalocal sudo[7137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:51:07 np0005595389.novalocal python3[7139]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769413867.0688765-154-9524736223973/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=ce6aada1dbc0f7aa611f9b1c65f522064026f349 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 07:51:07 np0005595389.novalocal sudo[7137]: pam_unix(sudo:session): session closed for user root
Jan 26 07:51:08 np0005595389.novalocal sudo[7187]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmicurugvctniuvnqahbhrbsczscisso ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 26 07:51:08 np0005595389.novalocal sudo[7187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:51:08 np0005595389.novalocal python3[7189]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 07:51:08 np0005595389.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 26 07:51:08 np0005595389.novalocal systemd[1]: Stopped Network Manager Wait Online.
Jan 26 07:51:08 np0005595389.novalocal systemd[1]: Stopping Network Manager Wait Online...
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[859]: <info>  [1769413868.3257] caught SIGTERM, shutting down normally.
Jan 26 07:51:08 np0005595389.novalocal systemd[1]: Stopping Network Manager...
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[859]: <info>  [1769413868.3265] dhcp4 (eth0): canceled DHCP transaction
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[859]: <info>  [1769413868.3265] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[859]: <info>  [1769413868.3265] dhcp4 (eth0): state changed no lease
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[859]: <info>  [1769413868.3267] manager: NetworkManager state is now CONNECTING
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[859]: <info>  [1769413868.3441] dhcp4 (eth1): canceled DHCP transaction
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[859]: <info>  [1769413868.3441] dhcp4 (eth1): state changed no lease
Jan 26 07:51:08 np0005595389.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[859]: <info>  [1769413868.3507] exiting (success)
Jan 26 07:51:08 np0005595389.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 07:51:08 np0005595389.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 26 07:51:08 np0005595389.novalocal systemd[1]: Stopped Network Manager.
Jan 26 07:51:08 np0005595389.novalocal systemd[1]: NetworkManager.service: Consumed 1.345s CPU time, 9.8M memory peak.
Jan 26 07:51:08 np0005595389.novalocal systemd[1]: Starting Network Manager...
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.4316] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:ac323890-fec5-4361-852a-4b7b8dc1d6fe)
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.4319] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.4412] manager[0x55911bb9d000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 26 07:51:08 np0005595389.novalocal systemd[1]: Starting Hostname Service...
Jan 26 07:51:08 np0005595389.novalocal systemd[1]: Started Hostname Service.
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5596] hostname: hostname: using hostnamed
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5597] hostname: static hostname changed from (none) to "np0005595389.novalocal"
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5605] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5612] manager[0x55911bb9d000]: rfkill: Wi-Fi hardware radio set enabled
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5614] manager[0x55911bb9d000]: rfkill: WWAN hardware radio set enabled
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5662] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5663] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5665] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5667] manager: Networking is enabled by state file
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5672] settings: Loaded settings plugin: keyfile (internal)
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5679] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5720] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5735] dhcp: init: Using DHCP client 'internal'
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5740] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5750] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5758] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5772] device (lo): Activation: starting connection 'lo' (b8f0ada8-b516-476a-9a84-27149369f44b)
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5783] device (eth0): carrier: link connected
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5791] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5799] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5801] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5813] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5823] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5829] device (eth1): carrier: link connected
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5835] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5840] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (b992380d-d503-34bf-95a3-75c144beb661) (indicated)
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5841] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5846] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5854] device (eth1): Activation: starting connection 'Wired connection 1' (b992380d-d503-34bf-95a3-75c144beb661)
Jan 26 07:51:08 np0005595389.novalocal systemd[1]: Started Network Manager.
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5860] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5867] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5872] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5876] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5879] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5884] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5888] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5892] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5897] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5905] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5910] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5922] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5928] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5947] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5956] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5962] device (lo): Activation: successful, device activated.
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5970] dhcp4 (eth0): state changed new lease, address=38.102.83.74
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.5978] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 26 07:51:08 np0005595389.novalocal systemd[1]: Starting Network Manager Wait Online...
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.6054] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.6077] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.6080] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.6085] manager: NetworkManager state is now CONNECTED_SITE
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.6088] device (eth0): Activation: successful, device activated.
Jan 26 07:51:08 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413868.6095] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 26 07:51:08 np0005595389.novalocal sudo[7187]: pam_unix(sudo:session): session closed for user root
Jan 26 07:51:09 np0005595389.novalocal python3[7273]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-c96b-4623-0000000000bd-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 07:51:18 np0005595389.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 07:51:38 np0005595389.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 26 07:51:54 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413914.3409] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 26 07:51:54 np0005595389.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 07:51:54 np0005595389.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 07:51:54 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413914.3782] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 26 07:51:54 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413914.3786] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 26 07:51:54 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413914.3795] device (eth1): Activation: successful, device activated.
Jan 26 07:51:54 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413914.3804] manager: startup complete
Jan 26 07:51:54 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413914.3806] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Jan 26 07:51:54 np0005595389.novalocal NetworkManager[7201]: <warn>  [1769413914.3809] device (eth1): Activation: failed for connection 'Wired connection 1'
Jan 26 07:51:54 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413914.3816] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Jan 26 07:51:54 np0005595389.novalocal systemd[1]: Finished Network Manager Wait Online.
Jan 26 07:51:54 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413914.3931] dhcp4 (eth1): canceled DHCP transaction
Jan 26 07:51:54 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413914.3932] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 26 07:51:54 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413914.3932] dhcp4 (eth1): state changed no lease
Jan 26 07:51:54 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413914.3951] policy: auto-activating connection 'ci-private-network' (5afdbf09-b387-55a7-bf0b-677a9446200b)
Jan 26 07:51:54 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413914.3955] device (eth1): Activation: starting connection 'ci-private-network' (5afdbf09-b387-55a7-bf0b-677a9446200b)
Jan 26 07:51:54 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413914.3957] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 07:51:54 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413914.3960] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 07:51:54 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413914.3968] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 07:51:54 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413914.3978] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 07:51:54 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413914.4022] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 07:51:54 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413914.4023] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 07:51:54 np0005595389.novalocal NetworkManager[7201]: <info>  [1769413914.4032] device (eth1): Activation: successful, device activated.
Jan 26 07:52:04 np0005595389.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 07:52:09 np0005595389.novalocal sshd-session[6962]: Received disconnect from 38.102.83.114 port 35358:11: disconnected by user
Jan 26 07:52:09 np0005595389.novalocal sshd-session[6962]: Disconnected from user zuul 38.102.83.114 port 35358
Jan 26 07:52:09 np0005595389.novalocal sshd-session[6959]: pam_unix(sshd:session): session closed for user zuul
Jan 26 07:52:09 np0005595389.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Jan 26 07:52:09 np0005595389.novalocal systemd[1]: session-3.scope: Consumed 1.640s CPU time.
Jan 26 07:52:09 np0005595389.novalocal systemd-logind[788]: Session 3 logged out. Waiting for processes to exit.
Jan 26 07:52:09 np0005595389.novalocal systemd-logind[788]: Removed session 3.
Jan 26 07:52:12 np0005595389.novalocal sshd-session[7302]: Accepted publickey for zuul from 38.102.83.114 port 32912 ssh2: RSA SHA256:5iqlXoVGZ9p23AimikXJoJK2JC1HSNh9KX8pshdVcYk
Jan 26 07:52:12 np0005595389.novalocal systemd-logind[788]: New session 4 of user zuul.
Jan 26 07:52:12 np0005595389.novalocal systemd[1]: Started Session 4 of User zuul.
Jan 26 07:52:12 np0005595389.novalocal sshd-session[7302]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 07:52:12 np0005595389.novalocal sudo[7381]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhgbvnystyhihqtbdmuggcsqerexexlx ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 26 07:52:12 np0005595389.novalocal sudo[7381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:52:12 np0005595389.novalocal python3[7383]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 07:52:12 np0005595389.novalocal sudo[7381]: pam_unix(sudo:session): session closed for user root
Jan 26 07:52:12 np0005595389.novalocal sudo[7454]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbvdczflwdbbzuucdjnshefbsbyurwcs ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 26 07:52:12 np0005595389.novalocal sudo[7454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:52:13 np0005595389.novalocal python3[7456]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769413932.5242782-312-87618190534505/source _original_basename=tmpn2uuriuq follow=False checksum=5cc69e32546a5a4cfb5fa02e854e02dce40bfe84 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 07:52:13 np0005595389.novalocal sudo[7454]: pam_unix(sudo:session): session closed for user root
Jan 26 07:52:15 np0005595389.novalocal sshd-session[7305]: Connection closed by 38.102.83.114 port 32912
Jan 26 07:52:15 np0005595389.novalocal sshd-session[7302]: pam_unix(sshd:session): session closed for user zuul
Jan 26 07:52:15 np0005595389.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Jan 26 07:52:15 np0005595389.novalocal systemd-logind[788]: Session 4 logged out. Waiting for processes to exit.
Jan 26 07:52:15 np0005595389.novalocal systemd-logind[788]: Removed session 4.
Jan 26 07:53:57 np0005595389.novalocal systemd[4308]: Created slice User Background Tasks Slice.
Jan 26 07:53:57 np0005595389.novalocal systemd[4308]: Starting Cleanup of User's Temporary Files and Directories...
Jan 26 07:53:57 np0005595389.novalocal systemd[4308]: Finished Cleanup of User's Temporary Files and Directories.
Jan 26 07:56:44 np0005595389.novalocal sshd-session[7485]: userauth_pubkey: signature algorithm ssh-rsa not in PubkeyAcceptedAlgorithms [preauth]
Jan 26 07:56:54 np0005595389.novalocal sshd-session[7485]: Connection closed by authenticating user root 139.19.117.130 port 42852 [preauth]
Jan 26 07:58:26 np0005595389.novalocal sshd-session[7488]: Accepted publickey for zuul from 38.102.83.114 port 56962 ssh2: RSA SHA256:5iqlXoVGZ9p23AimikXJoJK2JC1HSNh9KX8pshdVcYk
Jan 26 07:58:26 np0005595389.novalocal systemd-logind[788]: New session 5 of user zuul.
Jan 26 07:58:27 np0005595389.novalocal systemd[1]: Started Session 5 of User zuul.
Jan 26 07:58:27 np0005595389.novalocal sshd-session[7488]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 07:58:27 np0005595389.novalocal sudo[7515]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vavqwmlckpldqigpyyyxajppxcjxofzo ; /usr/bin/python3'
Jan 26 07:58:27 np0005595389.novalocal sudo[7515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:58:27 np0005595389.novalocal python3[7517]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-aad1-334a-00000000216e-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 07:58:27 np0005595389.novalocal sudo[7515]: pam_unix(sudo:session): session closed for user root
Jan 26 07:58:27 np0005595389.novalocal sudo[7544]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msahozacunvacjcoovxcipobqdduxsjn ; /usr/bin/python3'
Jan 26 07:58:27 np0005595389.novalocal sudo[7544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:58:27 np0005595389.novalocal python3[7546]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 07:58:27 np0005595389.novalocal sudo[7544]: pam_unix(sudo:session): session closed for user root
Jan 26 07:58:27 np0005595389.novalocal sudo[7570]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqmwozdbibssrtsuldpevgvammwjfjbu ; /usr/bin/python3'
Jan 26 07:58:27 np0005595389.novalocal sudo[7570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:58:27 np0005595389.novalocal python3[7572]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 07:58:27 np0005595389.novalocal sudo[7570]: pam_unix(sudo:session): session closed for user root
Jan 26 07:58:28 np0005595389.novalocal sudo[7596]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvetsvbwzsttuuqipqudooywipdgbzhz ; /usr/bin/python3'
Jan 26 07:58:28 np0005595389.novalocal sudo[7596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:58:28 np0005595389.novalocal python3[7598]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 07:58:28 np0005595389.novalocal sudo[7596]: pam_unix(sudo:session): session closed for user root
Jan 26 07:58:28 np0005595389.novalocal sudo[7622]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhkpthpwvoqinimkunurongcqetpqnhy ; /usr/bin/python3'
Jan 26 07:58:28 np0005595389.novalocal sudo[7622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:58:28 np0005595389.novalocal python3[7624]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 07:58:28 np0005595389.novalocal sudo[7622]: pam_unix(sudo:session): session closed for user root
Jan 26 07:58:28 np0005595389.novalocal sudo[7648]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpiqpewkydgwxmztvfhydvbowyhnxflb ; /usr/bin/python3'
Jan 26 07:58:28 np0005595389.novalocal sudo[7648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:58:28 np0005595389.novalocal python3[7650]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 07:58:28 np0005595389.novalocal sudo[7648]: pam_unix(sudo:session): session closed for user root
Jan 26 07:58:29 np0005595389.novalocal sudo[7726]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sunhvflhjizutkiemwkvgiwkkeqbjiqg ; /usr/bin/python3'
Jan 26 07:58:29 np0005595389.novalocal sudo[7726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:58:29 np0005595389.novalocal python3[7728]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 07:58:29 np0005595389.novalocal sudo[7726]: pam_unix(sudo:session): session closed for user root
Jan 26 07:58:29 np0005595389.novalocal sudo[7799]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsnhuicnbcptczdskauogmeqskfobvbe ; /usr/bin/python3'
Jan 26 07:58:29 np0005595389.novalocal sudo[7799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:58:29 np0005595389.novalocal python3[7801]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769414309.083326-524-7704583000309/source _original_basename=tmpmplgpjhn follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 07:58:29 np0005595389.novalocal sudo[7799]: pam_unix(sudo:session): session closed for user root
Jan 26 07:58:30 np0005595389.novalocal sudo[7849]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmywcxxhlbydmnbgqnqnkmqkwvlkqwkg ; /usr/bin/python3'
Jan 26 07:58:30 np0005595389.novalocal sudo[7849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:58:30 np0005595389.novalocal python3[7851]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 07:58:30 np0005595389.novalocal systemd[1]: Reloading.
Jan 26 07:58:30 np0005595389.novalocal systemd-rc-local-generator[7868]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 07:58:30 np0005595389.novalocal sudo[7849]: pam_unix(sudo:session): session closed for user root
Jan 26 07:58:32 np0005595389.novalocal sudo[7904]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikhicoyblarvheltditchilphxsiwvgy ; /usr/bin/python3'
Jan 26 07:58:32 np0005595389.novalocal sudo[7904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:58:32 np0005595389.novalocal python3[7906]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Jan 26 07:58:32 np0005595389.novalocal sudo[7904]: pam_unix(sudo:session): session closed for user root
Jan 26 07:58:32 np0005595389.novalocal sudo[7930]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umrdglvyxjvcdpjdsdselsxlauwbyelm ; /usr/bin/python3'
Jan 26 07:58:32 np0005595389.novalocal sudo[7930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:58:32 np0005595389.novalocal python3[7932]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 07:58:32 np0005595389.novalocal sudo[7930]: pam_unix(sudo:session): session closed for user root
Jan 26 07:58:32 np0005595389.novalocal sudo[7958]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nediqqnaojdzshlnptuqkprjdpfjmavt ; /usr/bin/python3'
Jan 26 07:58:32 np0005595389.novalocal sudo[7958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:58:32 np0005595389.novalocal python3[7960]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 07:58:32 np0005595389.novalocal sudo[7958]: pam_unix(sudo:session): session closed for user root
Jan 26 07:58:33 np0005595389.novalocal sudo[7986]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivbzrsasolcygoncfaufmyhpwiismzvk ; /usr/bin/python3'
Jan 26 07:58:33 np0005595389.novalocal sudo[7986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:58:33 np0005595389.novalocal python3[7988]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 07:58:33 np0005595389.novalocal sudo[7986]: pam_unix(sudo:session): session closed for user root
Jan 26 07:58:33 np0005595389.novalocal sudo[8014]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edpkrnnphgtbgzsyositlslsqcuvqegu ; /usr/bin/python3'
Jan 26 07:58:33 np0005595389.novalocal sudo[8014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:58:33 np0005595389.novalocal python3[8016]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 07:58:33 np0005595389.novalocal sudo[8014]: pam_unix(sudo:session): session closed for user root
Jan 26 07:58:33 np0005595389.novalocal python3[8043]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-aad1-334a-000000002175-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 07:58:34 np0005595389.novalocal python3[8073]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 26 07:58:37 np0005595389.novalocal sshd-session[7491]: Connection closed by 38.102.83.114 port 56962
Jan 26 07:58:37 np0005595389.novalocal sshd-session[7488]: pam_unix(sshd:session): session closed for user zuul
Jan 26 07:58:37 np0005595389.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Jan 26 07:58:37 np0005595389.novalocal systemd[1]: session-5.scope: Consumed 4.250s CPU time.
Jan 26 07:58:37 np0005595389.novalocal systemd-logind[788]: Session 5 logged out. Waiting for processes to exit.
Jan 26 07:58:37 np0005595389.novalocal systemd-logind[788]: Removed session 5.
Jan 26 07:58:38 np0005595389.novalocal sshd-session[8080]: Accepted publickey for zuul from 38.102.83.114 port 44258 ssh2: RSA SHA256:5iqlXoVGZ9p23AimikXJoJK2JC1HSNh9KX8pshdVcYk
Jan 26 07:58:38 np0005595389.novalocal systemd-logind[788]: New session 6 of user zuul.
Jan 26 07:58:38 np0005595389.novalocal systemd[1]: Started Session 6 of User zuul.
Jan 26 07:58:38 np0005595389.novalocal sshd-session[8080]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 07:58:39 np0005595389.novalocal sudo[8107]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bczqeypfrfzaarsyfbciqgbqgzljttzw ; /usr/bin/python3'
Jan 26 07:58:39 np0005595389.novalocal sudo[8107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:58:39 np0005595389.novalocal python3[8109]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 26 07:58:44 np0005595389.novalocal setsebool[8153]: The virt_use_nfs policy boolean was changed to 1 by root
Jan 26 07:58:44 np0005595389.novalocal setsebool[8153]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Jan 26 07:58:55 np0005595389.novalocal kernel: SELinux:  Converting 385 SID table entries...
Jan 26 07:58:55 np0005595389.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 07:58:55 np0005595389.novalocal kernel: SELinux:  policy capability open_perms=1
Jan 26 07:58:55 np0005595389.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 07:58:55 np0005595389.novalocal kernel: SELinux:  policy capability always_check_network=0
Jan 26 07:58:55 np0005595389.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 07:58:55 np0005595389.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 07:58:55 np0005595389.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 07:59:04 np0005595389.novalocal kernel: SELinux:  Converting 388 SID table entries...
Jan 26 07:59:04 np0005595389.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 07:59:04 np0005595389.novalocal kernel: SELinux:  policy capability open_perms=1
Jan 26 07:59:04 np0005595389.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 07:59:04 np0005595389.novalocal kernel: SELinux:  policy capability always_check_network=0
Jan 26 07:59:04 np0005595389.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 07:59:04 np0005595389.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 07:59:04 np0005595389.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 07:59:21 np0005595389.novalocal dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 26 07:59:21 np0005595389.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 07:59:21 np0005595389.novalocal systemd[1]: Starting man-db-cache-update.service...
Jan 26 07:59:21 np0005595389.novalocal systemd[1]: Reloading.
Jan 26 07:59:22 np0005595389.novalocal systemd-rc-local-generator[8924]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 07:59:22 np0005595389.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 07:59:23 np0005595389.novalocal sudo[8107]: pam_unix(sudo:session): session closed for user root
Jan 26 07:59:30 np0005595389.novalocal python3[14762]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-6d46-1ad1-00000000000b-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 07:59:31 np0005595389.novalocal kernel: evm: overlay not supported
Jan 26 07:59:31 np0005595389.novalocal systemd[4308]: Starting D-Bus User Message Bus...
Jan 26 07:59:31 np0005595389.novalocal dbus-broker-launch[15273]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Jan 26 07:59:31 np0005595389.novalocal dbus-broker-launch[15273]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Jan 26 07:59:31 np0005595389.novalocal systemd[4308]: Started D-Bus User Message Bus.
Jan 26 07:59:31 np0005595389.novalocal dbus-broker-lau[15273]: Ready
Jan 26 07:59:31 np0005595389.novalocal systemd[4308]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 26 07:59:31 np0005595389.novalocal systemd[4308]: Created slice Slice /user.
Jan 26 07:59:31 np0005595389.novalocal systemd[4308]: podman-15200.scope: unit configures an IP firewall, but not running as root.
Jan 26 07:59:31 np0005595389.novalocal systemd[4308]: (This warning is only shown for the first unit using IP firewalling.)
Jan 26 07:59:31 np0005595389.novalocal systemd[4308]: Started podman-15200.scope.
Jan 26 07:59:32 np0005595389.novalocal systemd[4308]: Started podman-pause-2438c25c.scope.
Jan 26 07:59:32 np0005595389.novalocal sudo[15556]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqglbgclzrppzpclrdupyqdoqfqjbpcs ; /usr/bin/python3'
Jan 26 07:59:32 np0005595389.novalocal sudo[15556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:59:32 np0005595389.novalocal python3[15569]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.46:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.46:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 07:59:32 np0005595389.novalocal python3[15569]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Jan 26 07:59:32 np0005595389.novalocal sudo[15556]: pam_unix(sudo:session): session closed for user root
Jan 26 07:59:32 np0005595389.novalocal sshd-session[8083]: Connection closed by 38.102.83.114 port 44258
Jan 26 07:59:32 np0005595389.novalocal sshd-session[8080]: pam_unix(sshd:session): session closed for user zuul
Jan 26 07:59:32 np0005595389.novalocal systemd[1]: session-6.scope: Deactivated successfully.
Jan 26 07:59:32 np0005595389.novalocal systemd[1]: session-6.scope: Consumed 41.472s CPU time.
Jan 26 07:59:32 np0005595389.novalocal systemd-logind[788]: Session 6 logged out. Waiting for processes to exit.
Jan 26 07:59:32 np0005595389.novalocal systemd-logind[788]: Removed session 6.
Jan 26 07:59:45 np0005595389.novalocal irqbalance[785]: Cannot change IRQ 27 affinity: Operation not permitted
Jan 26 07:59:45 np0005595389.novalocal irqbalance[785]: IRQ 27 affinity is now unmanaged
Jan 26 07:59:53 np0005595389.novalocal sshd-session[23922]: Unable to negotiate with 38.102.83.107 port 60080: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Jan 26 07:59:53 np0005595389.novalocal sshd-session[23928]: Connection closed by 38.102.83.107 port 60054 [preauth]
Jan 26 07:59:53 np0005595389.novalocal sshd-session[23929]: Connection closed by 38.102.83.107 port 60068 [preauth]
Jan 26 07:59:53 np0005595389.novalocal sshd-session[23926]: Unable to negotiate with 38.102.83.107 port 60084: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Jan 26 07:59:53 np0005595389.novalocal sshd-session[23923]: Unable to negotiate with 38.102.83.107 port 60086: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Jan 26 07:59:56 np0005595389.novalocal sshd-session[25376]: Accepted publickey for zuul from 38.102.83.114 port 53020 ssh2: RSA SHA256:5iqlXoVGZ9p23AimikXJoJK2JC1HSNh9KX8pshdVcYk
Jan 26 07:59:57 np0005595389.novalocal systemd-logind[788]: New session 7 of user zuul.
Jan 26 07:59:57 np0005595389.novalocal systemd[1]: Started Session 7 of User zuul.
Jan 26 07:59:57 np0005595389.novalocal sshd-session[25376]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 07:59:57 np0005595389.novalocal python3[25481]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGQDuQ67cRHe7SRGG/zoIaBQ9IV1R/MoILTNKSydLhYrJyLGxcpJiWr249ItZJpf2CKi6GC8HHj/uFQvueMepLM= zuul@np0005595387.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 07:59:57 np0005595389.novalocal sudo[25632]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zycmlaxwuxeljcnooljtmwndztabznaj ; /usr/bin/python3'
Jan 26 07:59:57 np0005595389.novalocal sudo[25632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:59:57 np0005595389.novalocal python3[25640]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGQDuQ67cRHe7SRGG/zoIaBQ9IV1R/MoILTNKSydLhYrJyLGxcpJiWr249ItZJpf2CKi6GC8HHj/uFQvueMepLM= zuul@np0005595387.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 07:59:57 np0005595389.novalocal sudo[25632]: pam_unix(sudo:session): session closed for user root
Jan 26 07:59:58 np0005595389.novalocal sudo[25914]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upqdhvnuxdunzsytyruylkinitbkfymp ; /usr/bin/python3'
Jan 26 07:59:58 np0005595389.novalocal sudo[25914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:59:58 np0005595389.novalocal python3[25925]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005595389.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Jan 26 07:59:58 np0005595389.novalocal useradd[26000]: new group: name=cloud-admin, GID=1002
Jan 26 07:59:58 np0005595389.novalocal useradd[26000]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Jan 26 07:59:58 np0005595389.novalocal sudo[25914]: pam_unix(sudo:session): session closed for user root
Jan 26 07:59:58 np0005595389.novalocal sudo[26131]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxjbfuwcusfawigqbiikwgjfprsovvpu ; /usr/bin/python3'
Jan 26 07:59:58 np0005595389.novalocal sudo[26131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:59:59 np0005595389.novalocal python3[26144]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGQDuQ67cRHe7SRGG/zoIaBQ9IV1R/MoILTNKSydLhYrJyLGxcpJiWr249ItZJpf2CKi6GC8HHj/uFQvueMepLM= zuul@np0005595387.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 07:59:59 np0005595389.novalocal sudo[26131]: pam_unix(sudo:session): session closed for user root
Jan 26 07:59:59 np0005595389.novalocal sudo[26366]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwkchgikedonboiartvhelcwhltcufns ; /usr/bin/python3'
Jan 26 07:59:59 np0005595389.novalocal sudo[26366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 07:59:59 np0005595389.novalocal python3[26375]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 07:59:59 np0005595389.novalocal sudo[26366]: pam_unix(sudo:session): session closed for user root
Jan 26 07:59:59 np0005595389.novalocal sudo[26557]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbbbwfblyjpehgtcaycpiegtsddcldxz ; /usr/bin/python3'
Jan 26 07:59:59 np0005595389.novalocal sudo[26557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:00:00 np0005595389.novalocal python3[26567]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769414399.2774937-152-252823985230864/source _original_basename=tmp1tpqvjfl follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:00:00 np0005595389.novalocal sudo[26557]: pam_unix(sudo:session): session closed for user root
Jan 26 08:00:00 np0005595389.novalocal sudo[26843]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbtzjuoqofpedhihpmnxsmmdvvlhsgiw ; /usr/bin/python3'
Jan 26 08:00:00 np0005595389.novalocal sudo[26843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:00:00 np0005595389.novalocal python3[26850]: ansible-ansible.builtin.hostname Invoked with name=compute-1 use=systemd
Jan 26 08:00:00 np0005595389.novalocal systemd[1]: Starting Hostname Service...
Jan 26 08:00:01 np0005595389.novalocal systemd[1]: Started Hostname Service.
Jan 26 08:00:01 np0005595389.novalocal systemd-hostnamed[26929]: Changed pretty hostname to 'compute-1'
Jan 26 08:00:01 compute-1 systemd-hostnamed[26929]: Hostname set to <compute-1> (static)
Jan 26 08:00:01 compute-1 NetworkManager[7201]: <info>  [1769414401.0992] hostname: static hostname changed from "np0005595389.novalocal" to "compute-1"
Jan 26 08:00:01 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 08:00:01 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 08:00:01 compute-1 sudo[26843]: pam_unix(sudo:session): session closed for user root
Jan 26 08:00:01 compute-1 sshd-session[25419]: Connection closed by 38.102.83.114 port 53020
Jan 26 08:00:01 compute-1 sshd-session[25376]: pam_unix(sshd:session): session closed for user zuul
Jan 26 08:00:01 compute-1 systemd[1]: session-7.scope: Deactivated successfully.
Jan 26 08:00:01 compute-1 systemd[1]: session-7.scope: Consumed 2.552s CPU time.
Jan 26 08:00:01 compute-1 systemd-logind[788]: Session 7 logged out. Waiting for processes to exit.
Jan 26 08:00:01 compute-1 systemd-logind[788]: Removed session 7.
Jan 26 08:00:10 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 08:00:10 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 26 08:00:10 compute-1 systemd[1]: man-db-cache-update.service: Consumed 57.400s CPU time.
Jan 26 08:00:10 compute-1 systemd[1]: run-rdf265c6c745045e69e2e2580f97e4b59.service: Deactivated successfully.
Jan 26 08:00:11 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 08:00:31 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 26 08:01:01 compute-1 CROND[29964]: (root) CMD (run-parts /etc/cron.hourly)
Jan 26 08:01:01 compute-1 run-parts[29967]: (/etc/cron.hourly) starting 0anacron
Jan 26 08:01:01 compute-1 anacron[29975]: Anacron started on 2026-01-26
Jan 26 08:01:01 compute-1 anacron[29975]: Will run job `cron.daily' in 42 min.
Jan 26 08:01:01 compute-1 anacron[29975]: Will run job `cron.weekly' in 62 min.
Jan 26 08:01:01 compute-1 anacron[29975]: Will run job `cron.monthly' in 82 min.
Jan 26 08:01:01 compute-1 anacron[29975]: Jobs will be executed sequentially
Jan 26 08:01:01 compute-1 run-parts[29977]: (/etc/cron.hourly) finished 0anacron
Jan 26 08:01:01 compute-1 CROND[29963]: (root) CMDEND (run-parts /etc/cron.hourly)
Jan 26 08:01:14 compute-1 sshd-session[29978]: Connection closed by 47.236.2.142 port 51546
Jan 26 08:02:57 compute-1 systemd[1]: Starting Cleanup of Temporary Directories...
Jan 26 08:02:57 compute-1 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Jan 26 08:02:57 compute-1 systemd[1]: Finished Cleanup of Temporary Directories.
Jan 26 08:02:57 compute-1 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Jan 26 08:03:42 compute-1 sshd-session[29983]: Accepted publickey for zuul from 38.102.83.107 port 56342 ssh2: RSA SHA256:5iqlXoVGZ9p23AimikXJoJK2JC1HSNh9KX8pshdVcYk
Jan 26 08:03:42 compute-1 systemd-logind[788]: New session 8 of user zuul.
Jan 26 08:03:42 compute-1 systemd[1]: Started Session 8 of User zuul.
Jan 26 08:03:42 compute-1 sshd-session[29983]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 08:03:42 compute-1 python3[30059]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 08:03:44 compute-1 sudo[30173]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxnirkncrxxlwodrujjofpzylbmrswtx ; /usr/bin/python3'
Jan 26 08:03:44 compute-1 sudo[30173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:03:44 compute-1 python3[30175]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 08:03:44 compute-1 sudo[30173]: pam_unix(sudo:session): session closed for user root
Jan 26 08:03:45 compute-1 sudo[30246]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgeixlvokplatnqsvostdkqzwrlmcwpk ; /usr/bin/python3'
Jan 26 08:03:45 compute-1 sudo[30246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:03:45 compute-1 python3[30248]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769414624.4097228-34106-122902241409382/source mode=0755 _original_basename=delorean.repo follow=False checksum=0f7c85cc67bf467c48edf98d5acc63e62d808324 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:03:45 compute-1 sudo[30246]: pam_unix(sudo:session): session closed for user root
Jan 26 08:03:45 compute-1 sudo[30272]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krocrodlagljmwuzsgrxpncaeicmuuma ; /usr/bin/python3'
Jan 26 08:03:45 compute-1 sudo[30272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:03:45 compute-1 python3[30274]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 08:03:45 compute-1 sudo[30272]: pam_unix(sudo:session): session closed for user root
Jan 26 08:03:45 compute-1 sudo[30345]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhoacqhqghcmogsrcbfugnugxevkvtrn ; /usr/bin/python3'
Jan 26 08:03:45 compute-1 sudo[30345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:03:45 compute-1 python3[30347]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769414624.4097228-34106-122902241409382/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:03:45 compute-1 sudo[30345]: pam_unix(sudo:session): session closed for user root
Jan 26 08:03:45 compute-1 sudo[30371]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aewmwgcydlmiksfejhnlfeiqboeibdzg ; /usr/bin/python3'
Jan 26 08:03:45 compute-1 sudo[30371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:03:46 compute-1 python3[30373]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 08:03:46 compute-1 sudo[30371]: pam_unix(sudo:session): session closed for user root
Jan 26 08:03:46 compute-1 sudo[30444]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnmlzkskcmpezaibrlwckbtgszvvmukm ; /usr/bin/python3'
Jan 26 08:03:46 compute-1 sudo[30444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:03:47 compute-1 python3[30446]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769414624.4097228-34106-122902241409382/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:03:47 compute-1 sudo[30444]: pam_unix(sudo:session): session closed for user root
Jan 26 08:03:47 compute-1 sudo[30470]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdbxcfavmlfsctbkybhncagbvcqljbqs ; /usr/bin/python3'
Jan 26 08:03:47 compute-1 sudo[30470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:03:47 compute-1 python3[30472]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 08:03:47 compute-1 sudo[30470]: pam_unix(sudo:session): session closed for user root
Jan 26 08:03:47 compute-1 sudo[30543]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhthpkehezwlpbkqvhposhldwwdpkmrp ; /usr/bin/python3'
Jan 26 08:03:47 compute-1 sudo[30543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:03:47 compute-1 python3[30545]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769414624.4097228-34106-122902241409382/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:03:47 compute-1 sudo[30543]: pam_unix(sudo:session): session closed for user root
Jan 26 08:03:47 compute-1 sudo[30569]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-foffjmxvjudzdlvytdtidfxhykavpfxy ; /usr/bin/python3'
Jan 26 08:03:47 compute-1 sudo[30569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:03:48 compute-1 python3[30571]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 08:03:48 compute-1 sudo[30569]: pam_unix(sudo:session): session closed for user root
Jan 26 08:03:48 compute-1 sudo[30642]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-subytmscnuxcuwpjlbwzudltsciuncbf ; /usr/bin/python3'
Jan 26 08:03:48 compute-1 sudo[30642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:03:48 compute-1 python3[30644]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769414624.4097228-34106-122902241409382/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:03:48 compute-1 sudo[30642]: pam_unix(sudo:session): session closed for user root
Jan 26 08:03:48 compute-1 sudo[30668]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbbcchoonmknvlvaxgpwukvypwyyblux ; /usr/bin/python3'
Jan 26 08:03:48 compute-1 sudo[30668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:03:48 compute-1 python3[30670]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 08:03:48 compute-1 sudo[30668]: pam_unix(sudo:session): session closed for user root
Jan 26 08:03:48 compute-1 sudo[30741]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fenoodtpmkvubkrccdaxmcokbpjqnwco ; /usr/bin/python3'
Jan 26 08:03:48 compute-1 sudo[30741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:03:49 compute-1 python3[30743]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769414624.4097228-34106-122902241409382/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:03:49 compute-1 sudo[30741]: pam_unix(sudo:session): session closed for user root
Jan 26 08:03:49 compute-1 sudo[30767]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtsggzljtulgjmagzysjfmrrhlhdakrv ; /usr/bin/python3'
Jan 26 08:03:49 compute-1 sudo[30767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:03:49 compute-1 python3[30769]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 08:03:49 compute-1 sudo[30767]: pam_unix(sudo:session): session closed for user root
Jan 26 08:03:49 compute-1 sudo[30840]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mealtjalgxrxgryhkkagmfchrtrjanxm ; /usr/bin/python3'
Jan 26 08:03:49 compute-1 sudo[30840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:03:49 compute-1 python3[30842]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769414624.4097228-34106-122902241409382/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=2583a70b3ee76a9837350b0837bc004a8e52405c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:03:49 compute-1 sudo[30840]: pam_unix(sudo:session): session closed for user root
Jan 26 08:04:02 compute-1 python3[30890]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:09:02 compute-1 sshd-session[29986]: Received disconnect from 38.102.83.107 port 56342:11: disconnected by user
Jan 26 08:09:02 compute-1 sshd-session[29986]: Disconnected from user zuul 38.102.83.107 port 56342
Jan 26 08:09:02 compute-1 sshd-session[29983]: pam_unix(sshd:session): session closed for user zuul
Jan 26 08:09:02 compute-1 systemd[1]: session-8.scope: Deactivated successfully.
Jan 26 08:09:02 compute-1 systemd[1]: session-8.scope: Consumed 5.184s CPU time.
Jan 26 08:09:02 compute-1 systemd-logind[788]: Session 8 logged out. Waiting for processes to exit.
Jan 26 08:09:02 compute-1 systemd-logind[788]: Removed session 8.
Jan 26 08:16:15 compute-1 sshd-session[30900]: Accepted publickey for zuul from 192.168.122.30 port 39030 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 08:16:15 compute-1 systemd-logind[788]: New session 9 of user zuul.
Jan 26 08:16:15 compute-1 systemd[1]: Started Session 9 of User zuul.
Jan 26 08:16:15 compute-1 sshd-session[30900]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 08:16:16 compute-1 python3.9[31053]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 08:16:18 compute-1 sudo[31232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwkfkkgcjlprxvbkmyoivdorizoxguge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415377.5611868-40-160517976748009/AnsiballZ_command.py'
Jan 26 08:16:18 compute-1 sudo[31232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:16:18 compute-1 python3.9[31234]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:16:27 compute-1 sudo[31232]: pam_unix(sudo:session): session closed for user root
Jan 26 08:16:27 compute-1 sshd-session[30903]: Connection closed by 192.168.122.30 port 39030
Jan 26 08:16:27 compute-1 sshd-session[30900]: pam_unix(sshd:session): session closed for user zuul
Jan 26 08:16:27 compute-1 systemd[1]: session-9.scope: Deactivated successfully.
Jan 26 08:16:27 compute-1 systemd[1]: session-9.scope: Consumed 8.152s CPU time.
Jan 26 08:16:27 compute-1 systemd-logind[788]: Session 9 logged out. Waiting for processes to exit.
Jan 26 08:16:27 compute-1 systemd-logind[788]: Removed session 9.
Jan 26 08:16:32 compute-1 sshd-session[31291]: Accepted publickey for zuul from 192.168.122.30 port 39510 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 08:16:32 compute-1 systemd-logind[788]: New session 10 of user zuul.
Jan 26 08:16:32 compute-1 systemd[1]: Started Session 10 of User zuul.
Jan 26 08:16:32 compute-1 sshd-session[31291]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 08:16:33 compute-1 python3.9[31444]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 08:16:33 compute-1 sshd-session[31294]: Connection closed by 192.168.122.30 port 39510
Jan 26 08:16:33 compute-1 sshd-session[31291]: pam_unix(sshd:session): session closed for user zuul
Jan 26 08:16:33 compute-1 systemd[1]: session-10.scope: Deactivated successfully.
Jan 26 08:16:33 compute-1 systemd-logind[788]: Session 10 logged out. Waiting for processes to exit.
Jan 26 08:16:33 compute-1 systemd-logind[788]: Removed session 10.
Jan 26 08:16:49 compute-1 sshd-session[31471]: Accepted publickey for zuul from 192.168.122.30 port 38120 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 08:16:49 compute-1 systemd-logind[788]: New session 11 of user zuul.
Jan 26 08:16:49 compute-1 systemd[1]: Started Session 11 of User zuul.
Jan 26 08:16:49 compute-1 sshd-session[31471]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 08:16:50 compute-1 python3.9[31624]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 26 08:16:51 compute-1 python3.9[31798]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 08:16:52 compute-1 sudo[31948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntklpgdwmqtzsvlaquizbtvimqpinaov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415411.6786158-65-28348321663635/AnsiballZ_command.py'
Jan 26 08:16:52 compute-1 sudo[31948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:16:52 compute-1 python3.9[31950]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:16:52 compute-1 sudo[31948]: pam_unix(sudo:session): session closed for user root
Jan 26 08:16:53 compute-1 sudo[32101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjdfnhunkoabnawoghunszvhsjhrixkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415412.6251662-89-121018215977136/AnsiballZ_stat.py'
Jan 26 08:16:53 compute-1 sudo[32101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:16:53 compute-1 python3.9[32103]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:16:53 compute-1 sudo[32101]: pam_unix(sudo:session): session closed for user root
Jan 26 08:16:53 compute-1 sudo[32253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xeysztdpxrkbbkxaskkimvzfgmuzhclg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415413.3664098-105-257828643225431/AnsiballZ_file.py'
Jan 26 08:16:53 compute-1 sudo[32253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:16:54 compute-1 python3.9[32255]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:16:54 compute-1 sudo[32253]: pam_unix(sudo:session): session closed for user root
Jan 26 08:16:54 compute-1 sudo[32405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzhoueiktjpkirvsxgjbyqtvvvixmfzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415414.266262-121-99331989202097/AnsiballZ_stat.py'
Jan 26 08:16:54 compute-1 sudo[32405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:16:54 compute-1 python3.9[32407]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:16:54 compute-1 sudo[32405]: pam_unix(sudo:session): session closed for user root
Jan 26 08:16:55 compute-1 sudo[32528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evawovhtvbwsqipvqzdpbcnydwrzhnrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415414.266262-121-99331989202097/AnsiballZ_copy.py'
Jan 26 08:16:55 compute-1 sudo[32528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:16:55 compute-1 python3.9[32530]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769415414.266262-121-99331989202097/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:16:55 compute-1 sudo[32528]: pam_unix(sudo:session): session closed for user root
Jan 26 08:16:55 compute-1 sudo[32680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umaziehhdceuxgavtdcotytceqkbtxlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415415.6824577-151-107871043138611/AnsiballZ_setup.py'
Jan 26 08:16:55 compute-1 sudo[32680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:16:56 compute-1 python3.9[32682]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 08:16:56 compute-1 sudo[32680]: pam_unix(sudo:session): session closed for user root
Jan 26 08:16:56 compute-1 sudo[32836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlujtgshqutmmwalbolpqwwfohssywzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415416.6137364-167-150284064965565/AnsiballZ_file.py'
Jan 26 08:16:56 compute-1 sudo[32836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:16:57 compute-1 python3.9[32838]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:16:57 compute-1 sudo[32836]: pam_unix(sudo:session): session closed for user root
Jan 26 08:16:57 compute-1 sudo[32988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrlabcqcjdjafjruvftxgcgcppvxdylk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415417.3598723-185-236286304468797/AnsiballZ_file.py'
Jan 26 08:16:57 compute-1 sudo[32988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:16:57 compute-1 python3.9[32990]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:16:57 compute-1 sudo[32988]: pam_unix(sudo:session): session closed for user root
Jan 26 08:16:58 compute-1 python3.9[33140]: ansible-ansible.builtin.service_facts Invoked
Jan 26 08:17:02 compute-1 python3.9[33393]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:17:02 compute-1 python3.9[33543]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 08:17:04 compute-1 python3.9[33697]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 08:17:04 compute-1 sudo[33853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihoxzmcqrgfpxhrrsxcmkhvawotniiew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415424.508493-281-84732612553709/AnsiballZ_setup.py'
Jan 26 08:17:04 compute-1 sudo[33853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:17:05 compute-1 python3.9[33855]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 08:17:05 compute-1 sudo[33853]: pam_unix(sudo:session): session closed for user root
Jan 26 08:17:05 compute-1 sudo[33937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqlgiqjkruchkirezaxqqzrzixldaevl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415424.508493-281-84732612553709/AnsiballZ_dnf.py'
Jan 26 08:17:05 compute-1 sudo[33937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:17:05 compute-1 python3.9[33939]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 08:17:52 compute-1 systemd[1]: Reloading.
Jan 26 08:17:52 compute-1 systemd-rc-local-generator[34134]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:17:52 compute-1 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Jan 26 08:17:52 compute-1 systemd[1]: Reloading.
Jan 26 08:17:52 compute-1 systemd-rc-local-generator[34174]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:17:53 compute-1 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Jan 26 08:17:53 compute-1 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Jan 26 08:17:53 compute-1 systemd[1]: Reloading.
Jan 26 08:17:53 compute-1 systemd-rc-local-generator[34214]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:17:53 compute-1 systemd[1]: Starting dnf makecache...
Jan 26 08:17:53 compute-1 systemd[1]: Listening on LVM2 poll daemon socket.
Jan 26 08:17:53 compute-1 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Jan 26 08:17:53 compute-1 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Jan 26 08:17:53 compute-1 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Jan 26 08:17:53 compute-1 dnf[34227]: Failed determining last makecache time.
Jan 26 08:17:53 compute-1 dnf[34227]: delorean-openstack-barbican-42b4c41831408a8e323 148 kB/s | 3.0 kB     00:00
Jan 26 08:17:53 compute-1 dnf[34227]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 181 kB/s | 3.0 kB     00:00
Jan 26 08:17:53 compute-1 dnf[34227]: delorean-openstack-cinder-1c00d6490d88e436f26ef 174 kB/s | 3.0 kB     00:00
Jan 26 08:17:53 compute-1 dnf[34227]: delorean-python-stevedore-c4acc5639fd2329372142 183 kB/s | 3.0 kB     00:00
Jan 26 08:17:53 compute-1 dnf[34227]: delorean-python-cloudkitty-tests-tempest-2c80f8 147 kB/s | 3.0 kB     00:00
Jan 26 08:17:54 compute-1 dnf[34227]: delorean-os-refresh-config-9bfc52b5049be2d8de61 6.9 kB/s | 3.0 kB     00:00
Jan 26 08:17:54 compute-1 dnf[34227]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 150 kB/s | 3.0 kB     00:00
Jan 26 08:17:54 compute-1 dnf[34227]: delorean-python-designate-tests-tempest-347fdbc 193 kB/s | 3.0 kB     00:00
Jan 26 08:17:54 compute-1 dnf[34227]: delorean-openstack-glance-1fd12c29b339f30fe823e 186 kB/s | 3.0 kB     00:00
Jan 26 08:17:54 compute-1 dnf[34227]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 192 kB/s | 3.0 kB     00:00
Jan 26 08:17:54 compute-1 dnf[34227]: delorean-openstack-manila-3c01b7181572c95dac462 198 kB/s | 3.0 kB     00:00
Jan 26 08:17:54 compute-1 dnf[34227]: delorean-python-whitebox-neutron-tests-tempest- 196 kB/s | 3.0 kB     00:00
Jan 26 08:17:54 compute-1 dnf[34227]: delorean-openstack-octavia-ba397f07a7331190208c 175 kB/s | 3.0 kB     00:00
Jan 26 08:17:54 compute-1 dnf[34227]: delorean-openstack-watcher-c014f81a8647287f6dcc 187 kB/s | 3.0 kB     00:00
Jan 26 08:17:54 compute-1 dnf[34227]: delorean-ansible-config_template-5ccaa22121a7ff 189 kB/s | 3.0 kB     00:00
Jan 26 08:17:54 compute-1 dnf[34227]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 197 kB/s | 3.0 kB     00:00
Jan 26 08:17:54 compute-1 dnf[34227]: delorean-openstack-swift-dc98a8463506ac520c469a 187 kB/s | 3.0 kB     00:00
Jan 26 08:17:54 compute-1 dnf[34227]: delorean-python-tempestconf-8515371b7cceebd4282 197 kB/s | 3.0 kB     00:00
Jan 26 08:17:54 compute-1 dnf[34227]: delorean-openstack-heat-ui-013accbfd179753bc3f0 183 kB/s | 3.0 kB     00:00
Jan 26 08:17:54 compute-1 dnf[34227]: CentOS Stream 9 - BaseOS                         65 kB/s | 6.7 kB     00:00
Jan 26 08:17:54 compute-1 dnf[34227]: CentOS Stream 9 - AppStream                      67 kB/s | 6.8 kB     00:00
Jan 26 08:17:54 compute-1 dnf[34227]: CentOS Stream 9 - CRB                            64 kB/s | 6.6 kB     00:00
Jan 26 08:17:55 compute-1 dnf[34227]: CentOS Stream 9 - Extras packages                31 kB/s | 7.3 kB     00:00
Jan 26 08:17:55 compute-1 dnf[34227]: dlrn-antelope-testing                           121 kB/s | 3.0 kB     00:00
Jan 26 08:17:55 compute-1 dnf[34227]: dlrn-antelope-build-deps                         89 kB/s | 3.0 kB     00:00
Jan 26 08:17:55 compute-1 dnf[34227]: centos9-rabbitmq                                112 kB/s | 3.0 kB     00:00
Jan 26 08:17:55 compute-1 dnf[34227]: centos9-storage                                 109 kB/s | 3.0 kB     00:00
Jan 26 08:17:55 compute-1 dnf[34227]: centos9-opstools                                 76 kB/s | 3.0 kB     00:00
Jan 26 08:17:55 compute-1 dnf[34227]: NFV SIG OpenvSwitch                             122 kB/s | 3.0 kB     00:00
Jan 26 08:17:55 compute-1 dnf[34227]: repo-setup-centos-appstream                     195 kB/s | 4.4 kB     00:00
Jan 26 08:17:55 compute-1 dnf[34227]: repo-setup-centos-baseos                        191 kB/s | 3.9 kB     00:00
Jan 26 08:17:55 compute-1 dnf[34227]: repo-setup-centos-highavailability              150 kB/s | 3.9 kB     00:00
Jan 26 08:17:55 compute-1 dnf[34227]: repo-setup-centos-powertools                    179 kB/s | 4.3 kB     00:00
Jan 26 08:17:55 compute-1 dnf[34227]: Extra Packages for Enterprise Linux 9 - x86_64   34 kB/s |  10 kB     00:00
Jan 26 08:17:56 compute-1 dnf[34227]: Metadata cache created.
Jan 26 08:17:56 compute-1 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 26 08:17:56 compute-1 systemd[1]: Finished dnf makecache.
Jan 26 08:17:56 compute-1 systemd[1]: dnf-makecache.service: Consumed 1.798s CPU time.
Jan 26 08:19:02 compute-1 kernel: SELinux:  Converting 2724 SID table entries...
Jan 26 08:19:02 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 08:19:02 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 26 08:19:02 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 08:19:02 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 26 08:19:02 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 08:19:02 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 08:19:02 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 08:19:03 compute-1 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Jan 26 08:19:03 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 08:19:03 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 26 08:19:03 compute-1 systemd[1]: Reloading.
Jan 26 08:19:03 compute-1 systemd-rc-local-generator[34590]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:19:03 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 08:19:03 compute-1 sudo[33937]: pam_unix(sudo:session): session closed for user root
Jan 26 08:19:04 compute-1 sudo[35503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uoxmnvofcsbduhcngveasynxevdsslpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415544.1886241-305-9877748103706/AnsiballZ_command.py'
Jan 26 08:19:04 compute-1 sudo[35503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:19:04 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 08:19:04 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 26 08:19:04 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1.303s CPU time.
Jan 26 08:19:04 compute-1 systemd[1]: run-rde10f2cff7d44a1ca6ef65c4f75ad147.service: Deactivated successfully.
Jan 26 08:19:04 compute-1 python3.9[35505]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:19:05 compute-1 sudo[35503]: pam_unix(sudo:session): session closed for user root
Jan 26 08:19:06 compute-1 sudo[35785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-haticiyfmnkaygjexcwuqfoesriognqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415545.9095566-321-225044152288036/AnsiballZ_selinux.py'
Jan 26 08:19:06 compute-1 sudo[35785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:19:06 compute-1 python3.9[35787]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 26 08:19:06 compute-1 sudo[35785]: pam_unix(sudo:session): session closed for user root
Jan 26 08:19:07 compute-1 sudo[35937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syuvoqzuxtelbvezapayntiutrwfhjwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415547.2779331-343-70787006583192/AnsiballZ_command.py'
Jan 26 08:19:07 compute-1 sudo[35937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:19:07 compute-1 python3.9[35939]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 26 08:19:09 compute-1 sudo[35937]: pam_unix(sudo:session): session closed for user root
Jan 26 08:19:10 compute-1 sudo[36090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oorcynvyelnjzuibioqniyikhucnfdbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415549.812355-359-25897177420841/AnsiballZ_file.py'
Jan 26 08:19:10 compute-1 sudo[36090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:19:10 compute-1 python3.9[36092]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:19:10 compute-1 sudo[36090]: pam_unix(sudo:session): session closed for user root
Jan 26 08:19:11 compute-1 sudo[36242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evklzhszqnesghpaffqikfjmfozniqwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415550.8052638-376-271424059422582/AnsiballZ_mount.py'
Jan 26 08:19:11 compute-1 sudo[36242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:19:11 compute-1 python3.9[36244]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 26 08:19:11 compute-1 sudo[36242]: pam_unix(sudo:session): session closed for user root
Jan 26 08:19:12 compute-1 sudo[36394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drciakslurlwastmxycuhwccjqfulnfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415552.4938347-431-83908607076380/AnsiballZ_file.py'
Jan 26 08:19:12 compute-1 sudo[36394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:19:12 compute-1 python3.9[36396]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:19:12 compute-1 sudo[36394]: pam_unix(sudo:session): session closed for user root
Jan 26 08:19:13 compute-1 sudo[36546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwrzsmmbuotxffoamgsbjaxgaxcjcksw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415553.3512025-447-243633104186732/AnsiballZ_stat.py'
Jan 26 08:19:13 compute-1 sudo[36546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:19:13 compute-1 python3.9[36548]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:19:14 compute-1 sudo[36546]: pam_unix(sudo:session): session closed for user root
Jan 26 08:19:14 compute-1 sudo[36669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hisieozxaaaikxvimfasnvwfoielckkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415553.3512025-447-243633104186732/AnsiballZ_copy.py'
Jan 26 08:19:14 compute-1 sudo[36669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:19:14 compute-1 python3.9[36671]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769415553.3512025-447-243633104186732/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=70539153ba8394cf6e23efa475ecc78b911f2b37 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:19:14 compute-1 sudo[36669]: pam_unix(sudo:session): session closed for user root
Jan 26 08:19:15 compute-1 sudo[36821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odneskyjawdbxupueijspxyvtmmihjqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415555.1379788-495-208127149642906/AnsiballZ_stat.py'
Jan 26 08:19:15 compute-1 sudo[36821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:19:15 compute-1 python3.9[36823]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:19:15 compute-1 sudo[36821]: pam_unix(sudo:session): session closed for user root
Jan 26 08:19:16 compute-1 sudo[36973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybpkplhdonwxbjkzzlkytgpkhcpezezs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415555.805315-511-59537401838066/AnsiballZ_command.py'
Jan 26 08:19:16 compute-1 sudo[36973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:19:16 compute-1 python3.9[36975]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:19:16 compute-1 sudo[36973]: pam_unix(sudo:session): session closed for user root
Jan 26 08:19:16 compute-1 sudo[37126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgyxnaqxtcbpvtaobxgqhboejcevumxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415556.5551426-527-88028391567960/AnsiballZ_file.py'
Jan 26 08:19:16 compute-1 sudo[37126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:19:17 compute-1 python3.9[37128]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:19:17 compute-1 sudo[37126]: pam_unix(sudo:session): session closed for user root
Jan 26 08:19:17 compute-1 sudo[37278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovzpoqzvbqzybzefufoqmedfyhzsyyxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415557.4632995-549-232742921875990/AnsiballZ_getent.py'
Jan 26 08:19:17 compute-1 sudo[37278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:19:18 compute-1 python3.9[37280]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 26 08:19:18 compute-1 sudo[37278]: pam_unix(sudo:session): session closed for user root
Jan 26 08:19:18 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 08:19:18 compute-1 sudo[37432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhmfkkfpwxfezyoioqepmigfhjknbsoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415558.396246-565-145317740082284/AnsiballZ_group.py'
Jan 26 08:19:18 compute-1 sudo[37432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:19:19 compute-1 python3.9[37434]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 08:19:19 compute-1 groupadd[37435]: group added to /etc/group: name=qemu, GID=107
Jan 26 08:19:19 compute-1 groupadd[37435]: group added to /etc/gshadow: name=qemu
Jan 26 08:19:19 compute-1 groupadd[37435]: new group: name=qemu, GID=107
Jan 26 08:19:19 compute-1 sudo[37432]: pam_unix(sudo:session): session closed for user root
Jan 26 08:19:20 compute-1 sudo[37590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvyuaxbipiqjonlwdymlodeplrbvwxko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415559.5686042-581-75783797536878/AnsiballZ_user.py'
Jan 26 08:19:20 compute-1 sudo[37590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:19:20 compute-1 python3.9[37592]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 26 08:19:20 compute-1 useradd[37594]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Jan 26 08:19:20 compute-1 sudo[37590]: pam_unix(sudo:session): session closed for user root
Jan 26 08:19:20 compute-1 sudo[37750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sczxakvpiwawonqhjrxvjyjltnrrjbwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415560.551567-597-158101444782484/AnsiballZ_getent.py'
Jan 26 08:19:20 compute-1 sudo[37750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:19:21 compute-1 python3.9[37752]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 26 08:19:21 compute-1 sudo[37750]: pam_unix(sudo:session): session closed for user root
Jan 26 08:19:21 compute-1 sudo[37903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbojvezfcqqngbzalgyykoxsvdhmatjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415561.284621-613-146014519675842/AnsiballZ_group.py'
Jan 26 08:19:21 compute-1 sudo[37903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:19:21 compute-1 python3.9[37905]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 08:19:21 compute-1 groupadd[37906]: group added to /etc/group: name=hugetlbfs, GID=42477
Jan 26 08:19:21 compute-1 groupadd[37906]: group added to /etc/gshadow: name=hugetlbfs
Jan 26 08:19:21 compute-1 groupadd[37906]: new group: name=hugetlbfs, GID=42477
Jan 26 08:19:21 compute-1 sudo[37903]: pam_unix(sudo:session): session closed for user root
Jan 26 08:19:22 compute-1 sudo[38061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jeshrtfyxxhqgdgmgeqqcwwxvuwxmljt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415562.1821089-631-187150612568732/AnsiballZ_file.py'
Jan 26 08:19:22 compute-1 sudo[38061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:19:22 compute-1 python3.9[38063]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 26 08:19:22 compute-1 sudo[38061]: pam_unix(sudo:session): session closed for user root
Jan 26 08:19:23 compute-1 sudo[38213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxvkjmxjbhreuhsbvpipxniadvkpvtkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415563.0187974-653-44824933595755/AnsiballZ_dnf.py'
Jan 26 08:19:23 compute-1 sudo[38213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:19:23 compute-1 python3.9[38215]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 08:19:31 compute-1 sudo[38213]: pam_unix(sudo:session): session closed for user root
Jan 26 08:19:32 compute-1 sudo[38368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iatahyauvraikamgazjumrtpivzsftlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415571.911472-669-264079327960312/AnsiballZ_file.py'
Jan 26 08:19:32 compute-1 sudo[38368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:19:32 compute-1 python3.9[38370]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:19:32 compute-1 sudo[38368]: pam_unix(sudo:session): session closed for user root
Jan 26 08:19:32 compute-1 sudo[38520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upcluphnxwxbahwebkggshiwbkulgyvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415572.606126-685-210205422725498/AnsiballZ_stat.py'
Jan 26 08:19:32 compute-1 sudo[38520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:19:33 compute-1 python3.9[38522]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:19:33 compute-1 sudo[38520]: pam_unix(sudo:session): session closed for user root
Jan 26 08:19:33 compute-1 sudo[38643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlfsfichnhbtsfitostmfsfidqlpzfqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415572.606126-685-210205422725498/AnsiballZ_copy.py'
Jan 26 08:19:33 compute-1 sudo[38643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:19:33 compute-1 python3.9[38645]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769415572.606126-685-210205422725498/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:19:33 compute-1 sudo[38643]: pam_unix(sudo:session): session closed for user root
Jan 26 08:19:34 compute-1 sudo[38795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bztzonumrobtfilhgruhteovengrqafv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415573.9843338-715-148886101507768/AnsiballZ_systemd.py'
Jan 26 08:19:34 compute-1 sudo[38795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:19:35 compute-1 python3.9[38797]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 08:19:35 compute-1 systemd[1]: Starting Load Kernel Modules...
Jan 26 08:19:35 compute-1 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 26 08:19:35 compute-1 kernel: Bridge firewalling registered
Jan 26 08:19:35 compute-1 systemd-modules-load[38801]: Inserted module 'br_netfilter'
Jan 26 08:19:35 compute-1 systemd[1]: Finished Load Kernel Modules.
Jan 26 08:19:35 compute-1 sudo[38795]: pam_unix(sudo:session): session closed for user root
Jan 26 08:19:35 compute-1 sudo[38954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kywynmvffedlcyyrlrsyumehazpdglbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415575.6296318-731-61581224764870/AnsiballZ_stat.py'
Jan 26 08:19:35 compute-1 sudo[38954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:19:36 compute-1 python3.9[38956]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:19:36 compute-1 sudo[38954]: pam_unix(sudo:session): session closed for user root
Jan 26 08:19:36 compute-1 sudo[39077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zavaxpqrlmjuccqeqhjhiqczbccowqii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415575.6296318-731-61581224764870/AnsiballZ_copy.py'
Jan 26 08:19:36 compute-1 sudo[39077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:19:36 compute-1 python3.9[39079]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769415575.6296318-731-61581224764870/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:19:36 compute-1 sudo[39077]: pam_unix(sudo:session): session closed for user root
Jan 26 08:19:37 compute-1 sudo[39229]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qczvddwzicqdlyiegovptdvxpqlfwmme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415577.0514784-767-211907311556191/AnsiballZ_dnf.py'
Jan 26 08:19:37 compute-1 sudo[39229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:19:37 compute-1 python3.9[39231]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 08:19:40 compute-1 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Jan 26 08:19:40 compute-1 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Jan 26 08:19:40 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 08:19:40 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 26 08:19:40 compute-1 systemd[1]: Reloading.
Jan 26 08:19:40 compute-1 systemd-rc-local-generator[39291]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:19:40 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 08:19:41 compute-1 sudo[39229]: pam_unix(sudo:session): session closed for user root
Jan 26 08:19:42 compute-1 python3.9[40766]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:19:43 compute-1 python3.9[41633]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 26 08:19:43 compute-1 python3.9[42334]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:19:44 compute-1 sudo[43083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jotspljevhhfwwwsinlbjjshejpsibls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415584.2711618-845-153946119513729/AnsiballZ_command.py'
Jan 26 08:19:44 compute-1 sudo[43083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:19:44 compute-1 python3.9[43111]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:19:45 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 26 08:19:45 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 08:19:45 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 26 08:19:45 compute-1 systemd[1]: man-db-cache-update.service: Consumed 5.671s CPU time.
Jan 26 08:19:45 compute-1 systemd[1]: run-r76d3f94dfcce4c94b25650dcf83f315c.service: Deactivated successfully.
Jan 26 08:19:45 compute-1 systemd[1]: Starting Authorization Manager...
Jan 26 08:19:45 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 26 08:19:45 compute-1 polkitd[43618]: Started polkitd version 0.117
Jan 26 08:19:45 compute-1 polkitd[43618]: Loading rules from directory /etc/polkit-1/rules.d
Jan 26 08:19:45 compute-1 polkitd[43618]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 26 08:19:45 compute-1 polkitd[43618]: Finished loading, compiling and executing 2 rules
Jan 26 08:19:45 compute-1 systemd[1]: Started Authorization Manager.
Jan 26 08:19:45 compute-1 polkitd[43618]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Jan 26 08:19:45 compute-1 sudo[43083]: pam_unix(sudo:session): session closed for user root
Jan 26 08:19:46 compute-1 sudo[43786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzcoyudmgfaykuxygqlnztcmonwpeozm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415586.623861-863-219126017642159/AnsiballZ_systemd.py'
Jan 26 08:19:47 compute-1 sudo[43786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:19:47 compute-1 python3.9[43788]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:19:47 compute-1 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 26 08:19:47 compute-1 systemd[1]: tuned.service: Deactivated successfully.
Jan 26 08:19:47 compute-1 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 26 08:19:47 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 26 08:19:47 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 26 08:19:47 compute-1 sudo[43786]: pam_unix(sudo:session): session closed for user root
Jan 26 08:19:48 compute-1 python3.9[43950]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 26 08:19:51 compute-1 sudo[44100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qipkltvncrgrzwcjenmaxdmnrplizzxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415590.7576349-977-196198171771573/AnsiballZ_systemd.py'
Jan 26 08:19:51 compute-1 sudo[44100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:19:51 compute-1 python3.9[44102]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:19:51 compute-1 systemd[1]: Reloading.
Jan 26 08:19:51 compute-1 systemd-rc-local-generator[44131]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:19:51 compute-1 sudo[44100]: pam_unix(sudo:session): session closed for user root
Jan 26 08:19:52 compute-1 sudo[44289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-toypcnbjidalwpwvbfyhkotszzgcaamn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415591.8971972-977-80750969505062/AnsiballZ_systemd.py'
Jan 26 08:19:52 compute-1 sudo[44289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:19:52 compute-1 python3.9[44291]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:19:52 compute-1 systemd[1]: Reloading.
Jan 26 08:19:52 compute-1 systemd-rc-local-generator[44322]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:19:52 compute-1 sudo[44289]: pam_unix(sudo:session): session closed for user root
Jan 26 08:19:53 compute-1 sudo[44479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frfhzhcbkfnjyjnjzbrpladhzrkwhjrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415592.9721973-1009-244178570710479/AnsiballZ_command.py'
Jan 26 08:19:53 compute-1 sudo[44479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:19:53 compute-1 python3.9[44481]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:19:53 compute-1 sudo[44479]: pam_unix(sudo:session): session closed for user root
Jan 26 08:19:53 compute-1 sudo[44632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnixtyuzcsywtgouprfdpchvltegtxkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415593.648639-1025-217931114083305/AnsiballZ_command.py'
Jan 26 08:19:53 compute-1 sudo[44632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:19:54 compute-1 python3.9[44634]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:19:54 compute-1 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Jan 26 08:19:54 compute-1 sudo[44632]: pam_unix(sudo:session): session closed for user root
Jan 26 08:19:54 compute-1 sudo[44785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbbpsboobvevndyaisfmwuclitobgfjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415594.4204047-1041-117503005710629/AnsiballZ_command.py'
Jan 26 08:19:54 compute-1 sudo[44785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:19:54 compute-1 python3.9[44787]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:19:56 compute-1 sudo[44785]: pam_unix(sudo:session): session closed for user root
Jan 26 08:19:57 compute-1 sudo[44947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-huvxlpqwddsalardrjhgqdhvpfvvxira ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415596.7080665-1057-245899452062315/AnsiballZ_command.py'
Jan 26 08:19:57 compute-1 sudo[44947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:19:57 compute-1 python3.9[44949]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:19:57 compute-1 sudo[44947]: pam_unix(sudo:session): session closed for user root
Jan 26 08:19:57 compute-1 sudo[45100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urueoibwcapooqaewnyiwdkzxmkfqgig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415597.4414244-1073-232218688655150/AnsiballZ_systemd.py'
Jan 26 08:19:57 compute-1 sudo[45100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:19:58 compute-1 python3.9[45102]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 08:19:58 compute-1 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 26 08:19:58 compute-1 systemd[1]: Stopped Apply Kernel Variables.
Jan 26 08:19:58 compute-1 systemd[1]: Stopping Apply Kernel Variables...
Jan 26 08:19:58 compute-1 systemd[1]: Starting Apply Kernel Variables...
Jan 26 08:19:58 compute-1 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 26 08:19:58 compute-1 systemd[1]: Finished Apply Kernel Variables.
Jan 26 08:19:58 compute-1 sudo[45100]: pam_unix(sudo:session): session closed for user root
Jan 26 08:19:58 compute-1 sshd-session[31474]: Connection closed by 192.168.122.30 port 38120
Jan 26 08:19:58 compute-1 sshd-session[31471]: pam_unix(sshd:session): session closed for user zuul
Jan 26 08:19:58 compute-1 systemd[1]: session-11.scope: Deactivated successfully.
Jan 26 08:19:58 compute-1 systemd[1]: session-11.scope: Consumed 2min 22.319s CPU time.
Jan 26 08:19:58 compute-1 systemd-logind[788]: Session 11 logged out. Waiting for processes to exit.
Jan 26 08:19:58 compute-1 systemd-logind[788]: Removed session 11.
Jan 26 08:20:03 compute-1 sshd-session[45133]: Accepted publickey for zuul from 192.168.122.30 port 57326 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 08:20:03 compute-1 systemd-logind[788]: New session 12 of user zuul.
Jan 26 08:20:03 compute-1 systemd[1]: Started Session 12 of User zuul.
Jan 26 08:20:03 compute-1 sshd-session[45133]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 08:20:04 compute-1 python3.9[45286]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 08:20:05 compute-1 python3.9[45440]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 08:20:06 compute-1 sudo[45594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfdxverwxvmxwvwyhbkwldquyqxyebmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415606.4436426-76-44391921208912/AnsiballZ_command.py'
Jan 26 08:20:06 compute-1 sudo[45594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:20:07 compute-1 python3.9[45596]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:20:07 compute-1 sudo[45594]: pam_unix(sudo:session): session closed for user root
Jan 26 08:20:08 compute-1 python3.9[45747]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 08:20:08 compute-1 sudo[45901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txebaforrsormrikxkxoityrfrsrvuyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415608.648298-116-84032095964849/AnsiballZ_setup.py'
Jan 26 08:20:08 compute-1 sudo[45901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:20:09 compute-1 python3.9[45903]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 08:20:09 compute-1 sudo[45901]: pam_unix(sudo:session): session closed for user root
Jan 26 08:20:09 compute-1 sudo[45985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzbxfmpdparyutvsasxihwkoafbilnfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415608.648298-116-84032095964849/AnsiballZ_dnf.py'
Jan 26 08:20:09 compute-1 sudo[45985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:20:10 compute-1 python3.9[45987]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 08:20:11 compute-1 sudo[45985]: pam_unix(sudo:session): session closed for user root
Jan 26 08:20:12 compute-1 sudo[46138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjggepzsbnynssvjlcdegfbtwhhqoihw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415611.7183619-140-27705574573321/AnsiballZ_setup.py'
Jan 26 08:20:12 compute-1 sudo[46138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:20:12 compute-1 python3.9[46140]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 08:20:12 compute-1 sudo[46138]: pam_unix(sudo:session): session closed for user root
Jan 26 08:20:13 compute-1 sudo[46309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdilnyweqpszjjrjqedceliaguvszjmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415612.9103413-162-141188466504963/AnsiballZ_file.py'
Jan 26 08:20:13 compute-1 sudo[46309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:20:13 compute-1 python3.9[46311]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:20:13 compute-1 sudo[46309]: pam_unix(sudo:session): session closed for user root
Jan 26 08:20:14 compute-1 sudo[46461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pelpfxspuxdxddhihveygtlhayrnqfui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415613.8814075-178-66165841992886/AnsiballZ_command.py'
Jan 26 08:20:14 compute-1 sudo[46461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:20:14 compute-1 python3.9[46463]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:20:14 compute-1 podman[46464]: 2026-01-26 08:20:14.530453024 +0000 UTC m=+0.051425971 system refresh
Jan 26 08:20:14 compute-1 sudo[46461]: pam_unix(sudo:session): session closed for user root
Jan 26 08:20:15 compute-1 sudo[46624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jntbpkophjchvmstofryvrbpqwqvmcno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415614.8168874-194-15328868924998/AnsiballZ_stat.py'
Jan 26 08:20:15 compute-1 sudo[46624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:20:15 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 08:20:15 compute-1 python3.9[46626]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:20:15 compute-1 sudo[46624]: pam_unix(sudo:session): session closed for user root
Jan 26 08:20:16 compute-1 sudo[46747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdhcuwixivszhqfbbsnctrioovtsctce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415614.8168874-194-15328868924998/AnsiballZ_copy.py'
Jan 26 08:20:16 compute-1 sudo[46747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:20:16 compute-1 python3.9[46749]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769415614.8168874-194-15328868924998/.source.json follow=False _original_basename=podman_network_config.j2 checksum=d8df7e2c4bf6906bbb47f268807dab502200f369 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:20:16 compute-1 sudo[46747]: pam_unix(sudo:session): session closed for user root
Jan 26 08:20:17 compute-1 sudo[46899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzpamamnwpnkhfvzhiuheaxsbxztcuct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415616.6885383-224-219898643257368/AnsiballZ_stat.py'
Jan 26 08:20:17 compute-1 sudo[46899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:20:17 compute-1 python3.9[46901]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:20:17 compute-1 sudo[46899]: pam_unix(sudo:session): session closed for user root
Jan 26 08:20:17 compute-1 sudo[47022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsgpqzhvqtgtdclywblrkdykeowlhpzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415616.6885383-224-219898643257368/AnsiballZ_copy.py'
Jan 26 08:20:17 compute-1 sudo[47022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:20:17 compute-1 python3.9[47024]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769415616.6885383-224-219898643257368/.source.conf follow=False _original_basename=registries.conf.j2 checksum=afa1df2f20df99cadae6785e2dec481dcc7ded84 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:20:17 compute-1 sudo[47022]: pam_unix(sudo:session): session closed for user root
Jan 26 08:20:18 compute-1 sudo[47174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffrvpqunorhqkfoxftsjinztkpznbdfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415618.1667883-256-28110795921959/AnsiballZ_ini_file.py'
Jan 26 08:20:18 compute-1 sudo[47174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:20:18 compute-1 python3.9[47176]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:20:18 compute-1 sudo[47174]: pam_unix(sudo:session): session closed for user root
Jan 26 08:20:19 compute-1 sudo[47326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eafydtuycgcdotfkgvhuwnvuarswsoeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415619.060123-256-28165480288946/AnsiballZ_ini_file.py'
Jan 26 08:20:19 compute-1 sudo[47326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:20:19 compute-1 python3.9[47328]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:20:19 compute-1 sudo[47326]: pam_unix(sudo:session): session closed for user root
Jan 26 08:20:20 compute-1 sudo[47478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikpiefaoyppdvwblocxehnczmhadsuzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415619.844345-256-271566525637585/AnsiballZ_ini_file.py'
Jan 26 08:20:20 compute-1 sudo[47478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:20:20 compute-1 python3.9[47480]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:20:20 compute-1 sudo[47478]: pam_unix(sudo:session): session closed for user root
Jan 26 08:20:21 compute-1 sudo[47630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmvhdoahpzjpwrwzoxomagbsldnatbek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415620.7622795-256-106420912889784/AnsiballZ_ini_file.py'
Jan 26 08:20:21 compute-1 sudo[47630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:20:21 compute-1 python3.9[47632]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:20:21 compute-1 sudo[47630]: pam_unix(sudo:session): session closed for user root
Jan 26 08:20:22 compute-1 python3.9[47782]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 08:20:22 compute-1 sudo[47934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyhqkwlusobfskpzvgmcsuqkmxeexgwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415622.6382027-336-61913681518949/AnsiballZ_dnf.py'
Jan 26 08:20:22 compute-1 sudo[47934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:20:23 compute-1 python3.9[47936]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 08:20:24 compute-1 sudo[47934]: pam_unix(sudo:session): session closed for user root
Jan 26 08:20:25 compute-1 sudo[48087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svyempmogugpdqckzpofdbyfszqjvpwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415624.801285-352-227584902597221/AnsiballZ_dnf.py'
Jan 26 08:20:25 compute-1 sudo[48087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:20:25 compute-1 python3.9[48089]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 08:20:27 compute-1 sudo[48087]: pam_unix(sudo:session): session closed for user root
Jan 26 08:20:28 compute-1 sudo[48247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aifswivfpqitwdbbqcuhcilfkceszkbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415628.2944052-372-78233288901047/AnsiballZ_dnf.py'
Jan 26 08:20:28 compute-1 sudo[48247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:20:28 compute-1 python3.9[48249]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 08:20:30 compute-1 sudo[48247]: pam_unix(sudo:session): session closed for user root
Jan 26 08:20:31 compute-1 sudo[48400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vncjtmdkfkomfboangosoapscsqeyopz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415630.6995528-390-9542590405686/AnsiballZ_dnf.py'
Jan 26 08:20:31 compute-1 sudo[48400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:20:31 compute-1 python3.9[48402]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 08:20:32 compute-1 sudo[48400]: pam_unix(sudo:session): session closed for user root
Jan 26 08:20:33 compute-1 sudo[48554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjfgxizyuthojoeupeowvyrspzpsjelo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415632.8824663-412-175907531531424/AnsiballZ_dnf.py'
Jan 26 08:20:33 compute-1 sudo[48554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:20:33 compute-1 python3.9[48556]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 08:20:35 compute-1 sudo[48554]: pam_unix(sudo:session): session closed for user root
Jan 26 08:20:35 compute-1 sudo[48710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jegzbrzzgmahnqcfkkkrzabzekvuuzue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415635.52444-428-241850813551690/AnsiballZ_dnf.py'
Jan 26 08:20:35 compute-1 sudo[48710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:20:35 compute-1 python3.9[48712]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 08:20:38 compute-1 sudo[48710]: pam_unix(sudo:session): session closed for user root
Jan 26 08:20:39 compute-1 sudo[48880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vaolfsicqysaleqhmvccpccelcqcqusc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415639.2556639-446-273533561681424/AnsiballZ_dnf.py'
Jan 26 08:20:39 compute-1 sudo[48880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:20:39 compute-1 python3.9[48882]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 08:20:41 compute-1 sudo[48880]: pam_unix(sudo:session): session closed for user root
Jan 26 08:20:41 compute-1 sudo[49033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtjvmczyilgpuebwknmmfbowukxaccwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415641.509279-464-280722282548127/AnsiballZ_dnf.py'
Jan 26 08:20:41 compute-1 sudo[49033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:20:42 compute-1 python3.9[49035]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 08:20:55 compute-1 sudo[49033]: pam_unix(sudo:session): session closed for user root
Jan 26 08:20:56 compute-1 sudo[49368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-komidyxjljrrjpwhzyqdxxnkvxkvexzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415655.9252777-482-31476805726441/AnsiballZ_dnf.py'
Jan 26 08:20:56 compute-1 sudo[49368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:20:56 compute-1 python3.9[49370]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 08:20:57 compute-1 sudo[49368]: pam_unix(sudo:session): session closed for user root
Jan 26 08:20:58 compute-1 sudo[49524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phtzqrhxibmnjkzvyjoofkvdeykllafs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415658.247205-502-185880629406285/AnsiballZ_dnf.py'
Jan 26 08:20:58 compute-1 sudo[49524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:20:58 compute-1 python3.9[49526]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['device-mapper-multipath'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 08:21:00 compute-1 sudo[49524]: pam_unix(sudo:session): session closed for user root
Jan 26 08:21:01 compute-1 sudo[49681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjryxmwhnxtmftsdycefnoeebjxkxafr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415660.976116-524-224333753794908/AnsiballZ_file.py'
Jan 26 08:21:01 compute-1 sudo[49681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:21:01 compute-1 python3.9[49683]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:21:01 compute-1 sudo[49681]: pam_unix(sudo:session): session closed for user root
Jan 26 08:21:02 compute-1 sudo[49856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpgpyombbzkulgeconigshwvrndhmnpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415661.769927-540-123696906089985/AnsiballZ_stat.py'
Jan 26 08:21:02 compute-1 sudo[49856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:21:02 compute-1 python3.9[49858]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:21:02 compute-1 sudo[49856]: pam_unix(sudo:session): session closed for user root
Jan 26 08:21:02 compute-1 sudo[49979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmyuuspsxqagzhbgxzndvaynyqugwwhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415661.769927-540-123696906089985/AnsiballZ_copy.py'
Jan 26 08:21:02 compute-1 sudo[49979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:21:02 compute-1 python3.9[49981]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1769415661.769927-540-123696906089985/.source.json _original_basename=.qf00wspf follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:21:02 compute-1 sudo[49979]: pam_unix(sudo:session): session closed for user root
Jan 26 08:21:03 compute-1 sudo[50131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hthflhsoooikuksfgubasioitfrnulsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415663.308885-576-248516819479309/AnsiballZ_podman_image.py'
Jan 26 08:21:03 compute-1 sudo[50131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:21:04 compute-1 python3.9[50133]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 26 08:21:04 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 08:21:06 compute-1 systemd[1]: var-lib-containers-storage-overlay-compat3602731034-lower\x2dmapped.mount: Deactivated successfully.
Jan 26 08:21:11 compute-1 podman[50145]: 2026-01-26 08:21:11.154479699 +0000 UTC m=+6.928169392 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 26 08:21:11 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 08:21:11 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 08:21:11 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 08:21:11 compute-1 sudo[50131]: pam_unix(sudo:session): session closed for user root
Jan 26 08:21:12 compute-1 sudo[50443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfnufmbwywimayffdqpkrfkygdhutqze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415672.0607805-598-10814012831304/AnsiballZ_podman_image.py'
Jan 26 08:21:12 compute-1 sudo[50443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:21:12 compute-1 python3.9[50445]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 26 08:21:12 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 08:21:21 compute-1 podman[50458]: 2026-01-26 08:21:21.484432564 +0000 UTC m=+8.814094356 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 08:21:21 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 08:21:21 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 08:21:21 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 08:21:21 compute-1 sudo[50443]: pam_unix(sudo:session): session closed for user root
Jan 26 08:21:22 compute-1 sudo[50752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjhvcjdhcmsdwobdmpysjfrtxshytzfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415682.5062518-618-90916217201965/AnsiballZ_podman_image.py'
Jan 26 08:21:22 compute-1 sudo[50752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:21:23 compute-1 python3.9[50754]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 26 08:21:23 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 08:21:34 compute-1 podman[50767]: 2026-01-26 08:21:34.694554951 +0000 UTC m=+11.577602613 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 26 08:21:34 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 08:21:34 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 08:21:34 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 08:21:35 compute-1 sudo[50752]: pam_unix(sudo:session): session closed for user root
Jan 26 08:21:38 compute-1 sudo[51028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iucknljnopekrdkfxikweubnolctemyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415698.6695924-640-143987113856265/AnsiballZ_podman_image.py'
Jan 26 08:21:38 compute-1 sudo[51028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:21:39 compute-1 python3.9[51030]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 26 08:21:39 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 08:21:43 compute-1 podman[51042]: 2026-01-26 08:21:43.178620037 +0000 UTC m=+3.874768090 image pull 806262ad9f61127734555408f71447afe6ceede79cc666e6f523dacd5edec739 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Jan 26 08:21:43 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 08:21:43 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 08:21:43 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 08:21:43 compute-1 sudo[51028]: pam_unix(sudo:session): session closed for user root
Jan 26 08:21:44 compute-1 sudo[51295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnztkndgnhibzyongxscmcodjogjlxlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415703.723069-640-126541965831384/AnsiballZ_podman_image.py'
Jan 26 08:21:44 compute-1 sudo[51295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:21:44 compute-1 python3.9[51297]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 26 08:21:44 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 08:21:45 compute-1 podman[51311]: 2026-01-26 08:21:45.486757864 +0000 UTC m=+1.210895077 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Jan 26 08:21:45 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 08:21:45 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 08:21:45 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 08:21:45 compute-1 sudo[51295]: pam_unix(sudo:session): session closed for user root
Jan 26 08:21:46 compute-1 sshd-session[45136]: Connection closed by 192.168.122.30 port 57326
Jan 26 08:21:46 compute-1 sshd-session[45133]: pam_unix(sshd:session): session closed for user zuul
Jan 26 08:21:46 compute-1 systemd[1]: session-12.scope: Deactivated successfully.
Jan 26 08:21:46 compute-1 systemd[1]: session-12.scope: Consumed 2min 3.788s CPU time.
Jan 26 08:21:46 compute-1 systemd-logind[788]: Session 12 logged out. Waiting for processes to exit.
Jan 26 08:21:46 compute-1 systemd-logind[788]: Removed session 12.
Jan 26 08:21:51 compute-1 sshd-session[51458]: Accepted publickey for zuul from 192.168.122.30 port 50448 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 08:21:51 compute-1 systemd-logind[788]: New session 13 of user zuul.
Jan 26 08:21:51 compute-1 systemd[1]: Started Session 13 of User zuul.
Jan 26 08:21:51 compute-1 sshd-session[51458]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 08:21:52 compute-1 python3.9[51611]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 08:21:53 compute-1 sudo[51765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkfjbqullymgfstrvdcvjabuwjmuvnif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415713.3331666-48-91701006362155/AnsiballZ_getent.py'
Jan 26 08:21:53 compute-1 sudo[51765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:21:54 compute-1 python3.9[51767]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 26 08:21:54 compute-1 sudo[51765]: pam_unix(sudo:session): session closed for user root
Jan 26 08:21:54 compute-1 sudo[51918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-keuitkiunaqvwhfiqqamerqwgcsicakz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415714.289714-64-275631708192760/AnsiballZ_group.py'
Jan 26 08:21:54 compute-1 sudo[51918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:21:54 compute-1 python3.9[51920]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 08:21:54 compute-1 groupadd[51921]: group added to /etc/group: name=openvswitch, GID=42476
Jan 26 08:21:54 compute-1 groupadd[51921]: group added to /etc/gshadow: name=openvswitch
Jan 26 08:21:54 compute-1 groupadd[51921]: new group: name=openvswitch, GID=42476
Jan 26 08:21:55 compute-1 sudo[51918]: pam_unix(sudo:session): session closed for user root
Jan 26 08:21:55 compute-1 sudo[52076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adzintimujwopraupzeglxickyfyxkfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415715.2720313-80-64832738903396/AnsiballZ_user.py'
Jan 26 08:21:55 compute-1 sudo[52076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:21:56 compute-1 python3.9[52078]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 26 08:21:56 compute-1 useradd[52080]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Jan 26 08:21:56 compute-1 useradd[52080]: add 'openvswitch' to group 'hugetlbfs'
Jan 26 08:21:56 compute-1 useradd[52080]: add 'openvswitch' to shadow group 'hugetlbfs'
Jan 26 08:21:56 compute-1 sudo[52076]: pam_unix(sudo:session): session closed for user root
Jan 26 08:21:56 compute-1 sudo[52236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbfmohvjxbkvkoealipmbpsdmnantytq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415716.571627-100-86342414129789/AnsiballZ_setup.py'
Jan 26 08:21:56 compute-1 sudo[52236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:21:57 compute-1 python3.9[52238]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 08:21:57 compute-1 sudo[52236]: pam_unix(sudo:session): session closed for user root
Jan 26 08:21:57 compute-1 sudo[52320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjquihydotaxdpzdjluvxcdkeaxqevxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415716.571627-100-86342414129789/AnsiballZ_dnf.py'
Jan 26 08:21:57 compute-1 sudo[52320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:21:58 compute-1 python3.9[52322]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 08:22:00 compute-1 sudo[52320]: pam_unix(sudo:session): session closed for user root
Jan 26 08:22:01 compute-1 sudo[52482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfdepgddsgqdgjiiixmdvrkogytusgli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415720.614274-128-94019011340025/AnsiballZ_dnf.py'
Jan 26 08:22:01 compute-1 sudo[52482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:22:01 compute-1 python3.9[52484]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 08:22:13 compute-1 kernel: SELinux:  Converting 2737 SID table entries...
Jan 26 08:22:13 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 08:22:13 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 26 08:22:13 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 08:22:13 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 26 08:22:13 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 08:22:13 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 08:22:13 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 08:22:13 compute-1 groupadd[52507]: group added to /etc/group: name=unbound, GID=994
Jan 26 08:22:13 compute-1 groupadd[52507]: group added to /etc/gshadow: name=unbound
Jan 26 08:22:13 compute-1 groupadd[52507]: new group: name=unbound, GID=994
Jan 26 08:22:13 compute-1 useradd[52514]: new user: name=unbound, UID=993, GID=994, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Jan 26 08:22:13 compute-1 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Jan 26 08:22:13 compute-1 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Jan 26 08:22:15 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 08:22:15 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 26 08:22:16 compute-1 systemd[1]: Reloading.
Jan 26 08:22:16 compute-1 systemd-rc-local-generator[53009]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:22:16 compute-1 systemd-sysv-generator[53015]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:22:16 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 08:22:16 compute-1 sudo[52482]: pam_unix(sudo:session): session closed for user root
Jan 26 08:22:17 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 08:22:17 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 26 08:22:17 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1.111s CPU time.
Jan 26 08:22:17 compute-1 systemd[1]: run-re7730fc2fe5d48bfa99316f74b2b51b7.service: Deactivated successfully.
Jan 26 08:22:19 compute-1 sudo[53579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgqvizludfmecjvmjknlmtucmmnfzald ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415738.6627786-144-185684916816922/AnsiballZ_systemd.py'
Jan 26 08:22:19 compute-1 sudo[53579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:22:19 compute-1 python3.9[53581]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 08:22:19 compute-1 systemd[1]: Reloading.
Jan 26 08:22:19 compute-1 systemd-rc-local-generator[53612]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:22:19 compute-1 systemd-sysv-generator[53616]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:22:20 compute-1 systemd[1]: Starting Open vSwitch Database Unit...
Jan 26 08:22:20 compute-1 chown[53623]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Jan 26 08:22:20 compute-1 ovs-ctl[53628]: /etc/openvswitch/conf.db does not exist ... (warning).
Jan 26 08:22:20 compute-1 ovs-ctl[53628]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Jan 26 08:22:20 compute-1 ovs-ctl[53628]: Starting ovsdb-server [  OK  ]
Jan 26 08:22:20 compute-1 ovs-vsctl[53677]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Jan 26 08:22:20 compute-1 ovs-vsctl[53693]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"2f671d48-fb23-4421-893d-f2ec1411c819\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Jan 26 08:22:20 compute-1 ovs-ctl[53628]: Configuring Open vSwitch system IDs [  OK  ]
Jan 26 08:22:20 compute-1 ovs-vsctl[53702]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Jan 26 08:22:20 compute-1 ovs-ctl[53628]: Enabling remote OVSDB managers [  OK  ]
Jan 26 08:22:20 compute-1 systemd[1]: Started Open vSwitch Database Unit.
Jan 26 08:22:20 compute-1 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Jan 26 08:22:20 compute-1 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Jan 26 08:22:20 compute-1 systemd[1]: Starting Open vSwitch Forwarding Unit...
Jan 26 08:22:20 compute-1 kernel: openvswitch: Open vSwitch switching datapath
Jan 26 08:22:20 compute-1 ovs-ctl[53747]: Inserting openvswitch module [  OK  ]
Jan 26 08:22:20 compute-1 ovs-ctl[53716]: Starting ovs-vswitchd [  OK  ]
Jan 26 08:22:20 compute-1 ovs-vsctl[53764]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Jan 26 08:22:20 compute-1 ovs-ctl[53716]: Enabling remote OVSDB managers [  OK  ]
Jan 26 08:22:20 compute-1 systemd[1]: Started Open vSwitch Forwarding Unit.
Jan 26 08:22:20 compute-1 systemd[1]: Starting Open vSwitch...
Jan 26 08:22:20 compute-1 systemd[1]: Finished Open vSwitch.
Jan 26 08:22:21 compute-1 sudo[53579]: pam_unix(sudo:session): session closed for user root
Jan 26 08:22:22 compute-1 python3.9[53916]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 08:22:22 compute-1 sudo[54066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alqntkwnuifxojhpvjbsbxvqblpbevao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415742.3137534-180-179343930161833/AnsiballZ_sefcontext.py'
Jan 26 08:22:22 compute-1 sudo[54066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:22:23 compute-1 python3.9[54068]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 26 08:22:24 compute-1 kernel: SELinux:  Converting 2751 SID table entries...
Jan 26 08:22:24 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 08:22:24 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 26 08:22:24 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 08:22:24 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 26 08:22:24 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 08:22:24 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 08:22:24 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 08:22:24 compute-1 sudo[54066]: pam_unix(sudo:session): session closed for user root
Jan 26 08:22:25 compute-1 python3.9[54223]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 08:22:26 compute-1 sudo[54379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbnaeuddnkxmjtgydyvirlkqjqcdnpuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415746.2367892-216-175703288198549/AnsiballZ_dnf.py'
Jan 26 08:22:26 compute-1 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Jan 26 08:22:26 compute-1 sudo[54379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:22:26 compute-1 python3.9[54381]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 08:22:28 compute-1 sudo[54379]: pam_unix(sudo:session): session closed for user root
Jan 26 08:22:28 compute-1 sudo[54532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hatlubttciltsfkepzjydnuxyubnzzma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415748.333805-232-278492754831413/AnsiballZ_command.py'
Jan 26 08:22:28 compute-1 sudo[54532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:22:29 compute-1 python3.9[54534]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:22:29 compute-1 sudo[54532]: pam_unix(sudo:session): session closed for user root
Jan 26 08:22:30 compute-1 sudo[54819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yuhinkygnkfcwkwvuehlimdsilgkbdqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415750.0398982-248-150191030386249/AnsiballZ_file.py'
Jan 26 08:22:30 compute-1 sudo[54819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:22:30 compute-1 python3.9[54821]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 26 08:22:30 compute-1 sudo[54819]: pam_unix(sudo:session): session closed for user root
Jan 26 08:22:31 compute-1 python3.9[54971]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:22:32 compute-1 sudo[55123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zopfnasykpoaknddeflmrbvudjhfevgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415751.8565915-280-234291036662408/AnsiballZ_dnf.py'
Jan 26 08:22:32 compute-1 sudo[55123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:22:32 compute-1 python3.9[55125]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 08:22:34 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 08:22:34 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 26 08:22:34 compute-1 systemd[1]: Reloading.
Jan 26 08:22:34 compute-1 systemd-rc-local-generator[55165]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:22:34 compute-1 systemd-sysv-generator[55168]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:22:34 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 08:22:35 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 08:22:35 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 26 08:22:35 compute-1 systemd[1]: run-re087e91ae5e44553bcbb0e3549535c1c.service: Deactivated successfully.
Jan 26 08:22:35 compute-1 sudo[55123]: pam_unix(sudo:session): session closed for user root
Jan 26 08:22:35 compute-1 sudo[55440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxtapupboiqdgenstqrebvgcpozoizkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415755.4080381-296-7782475739841/AnsiballZ_systemd.py'
Jan 26 08:22:35 compute-1 sudo[55440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:22:36 compute-1 python3.9[55442]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 08:22:36 compute-1 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 26 08:22:36 compute-1 systemd[1]: Stopped Network Manager Wait Online.
Jan 26 08:22:36 compute-1 systemd[1]: Stopping Network Manager Wait Online...
Jan 26 08:22:36 compute-1 systemd[1]: Stopping Network Manager...
Jan 26 08:22:36 compute-1 NetworkManager[7201]: <info>  [1769415756.0718] caught SIGTERM, shutting down normally.
Jan 26 08:22:36 compute-1 NetworkManager[7201]: <info>  [1769415756.0739] dhcp4 (eth0): canceled DHCP transaction
Jan 26 08:22:36 compute-1 NetworkManager[7201]: <info>  [1769415756.0739] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 08:22:36 compute-1 NetworkManager[7201]: <info>  [1769415756.0739] dhcp4 (eth0): state changed no lease
Jan 26 08:22:36 compute-1 NetworkManager[7201]: <info>  [1769415756.0742] manager: NetworkManager state is now CONNECTED_SITE
Jan 26 08:22:36 compute-1 NetworkManager[7201]: <info>  [1769415756.0838] exiting (success)
Jan 26 08:22:36 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 08:22:36 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 08:22:36 compute-1 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 26 08:22:36 compute-1 systemd[1]: Stopped Network Manager.
Jan 26 08:22:36 compute-1 systemd[1]: NetworkManager.service: Consumed 12.012s CPU time, 4.3M memory peak, read 0B from disk, written 40.5K to disk.
Jan 26 08:22:36 compute-1 systemd[1]: Starting Network Manager...
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.1597] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:ac323890-fec5-4361-852a-4b7b8dc1d6fe)
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.1602] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.1672] manager[0x55e9c5eea000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 26 08:22:36 compute-1 systemd[1]: Starting Hostname Service...
Jan 26 08:22:36 compute-1 systemd[1]: Started Hostname Service.
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.2784] hostname: hostname: using hostnamed
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.2785] hostname: static hostname changed from (none) to "compute-1"
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.2793] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.2802] manager[0x55e9c5eea000]: rfkill: Wi-Fi hardware radio set enabled
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.2802] manager[0x55e9c5eea000]: rfkill: WWAN hardware radio set enabled
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.2841] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.2858] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.2859] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.2861] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.2862] manager: Networking is enabled by state file
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.2866] settings: Loaded settings plugin: keyfile (internal)
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.2871] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.2917] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.2933] dhcp: init: Using DHCP client 'internal'
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.2938] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.2949] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.2961] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.2975] device (lo): Activation: starting connection 'lo' (b8f0ada8-b516-476a-9a84-27149369f44b)
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.2986] device (eth0): carrier: link connected
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.2993] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3004] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3005] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3018] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3029] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3041] device (eth1): carrier: link connected
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3050] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3061] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (5afdbf09-b387-55a7-bf0b-677a9446200b) (indicated)
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3062] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3071] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3084] device (eth1): Activation: starting connection 'ci-private-network' (5afdbf09-b387-55a7-bf0b-677a9446200b)
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3092] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 26 08:22:36 compute-1 systemd[1]: Started Network Manager.
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3106] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3132] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3136] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3141] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3147] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3152] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3157] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3165] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3176] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3182] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3197] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3224] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 26 08:22:36 compute-1 systemd[1]: Starting Network Manager Wait Online...
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3238] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3243] dhcp4 (eth0): state changed new lease, address=38.102.83.74
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3248] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3256] device (lo): Activation: successful, device activated.
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3271] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3368] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3379] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3383] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3388] manager: NetworkManager state is now CONNECTED_LOCAL
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3393] device (eth1): Activation: successful, device activated.
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3447] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3450] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3457] manager: NetworkManager state is now CONNECTED_SITE
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3461] device (eth0): Activation: successful, device activated.
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3470] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 26 08:22:36 compute-1 NetworkManager[55451]: <info>  [1769415756.3475] manager: startup complete
Jan 26 08:22:36 compute-1 systemd[1]: Finished Network Manager Wait Online.
Jan 26 08:22:36 compute-1 sudo[55440]: pam_unix(sudo:session): session closed for user root
Jan 26 08:22:36 compute-1 sudo[55666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbmpiprvmeykrsxjmhyzhrluauuarqmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415756.5870955-312-280077362608578/AnsiballZ_dnf.py'
Jan 26 08:22:36 compute-1 sudo[55666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:22:37 compute-1 python3.9[55668]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 08:22:42 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 08:22:42 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 26 08:22:42 compute-1 systemd[1]: Reloading.
Jan 26 08:22:42 compute-1 systemd-sysv-generator[55726]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:22:42 compute-1 systemd-rc-local-generator[55719]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:22:42 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 08:22:43 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 08:22:43 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 26 08:22:43 compute-1 systemd[1]: run-r4f2297d0ee4b4abc9b23b7cc7b5f4862.service: Deactivated successfully.
Jan 26 08:22:44 compute-1 sudo[55666]: pam_unix(sudo:session): session closed for user root
Jan 26 08:22:45 compute-1 sudo[56126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptrfydhnnzvkcskbvfaodlqqiljawakv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415764.5530412-336-6354826138582/AnsiballZ_stat.py'
Jan 26 08:22:45 compute-1 sudo[56126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:22:45 compute-1 python3.9[56128]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:22:45 compute-1 sudo[56126]: pam_unix(sudo:session): session closed for user root
Jan 26 08:22:46 compute-1 sudo[56278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdhmxyqkbmmlpxogfejjlgryugirzsnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415765.585388-354-192692732075559/AnsiballZ_ini_file.py'
Jan 26 08:22:46 compute-1 sudo[56278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:22:46 compute-1 python3.9[56280]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:22:46 compute-1 sudo[56278]: pam_unix(sudo:session): session closed for user root
Jan 26 08:22:46 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 08:22:47 compute-1 sudo[56432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qailuxixwdkllbfbqcfojxdgdwybfbkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415766.7033696-374-12800129893869/AnsiballZ_ini_file.py'
Jan 26 08:22:47 compute-1 sudo[56432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:22:47 compute-1 python3.9[56434]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:22:47 compute-1 sudo[56432]: pam_unix(sudo:session): session closed for user root
Jan 26 08:22:47 compute-1 sudo[56584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsbyagtqxewiajrhhaxcssnpudvuiggo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415767.5148168-374-251813810141230/AnsiballZ_ini_file.py'
Jan 26 08:22:47 compute-1 sudo[56584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:22:48 compute-1 python3.9[56586]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:22:48 compute-1 sudo[56584]: pam_unix(sudo:session): session closed for user root
Jan 26 08:22:48 compute-1 sudo[56736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nreyyqstmoqtdfaxqlujnxcnxsuatcww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415768.2935174-404-237686892560609/AnsiballZ_ini_file.py'
Jan 26 08:22:48 compute-1 sudo[56736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:22:48 compute-1 python3.9[56738]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:22:48 compute-1 sudo[56736]: pam_unix(sudo:session): session closed for user root
Jan 26 08:22:49 compute-1 sudo[56888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbydvrdyqwfrdjrvbienlzrkpmiogtfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415769.3640158-404-184191445236304/AnsiballZ_ini_file.py'
Jan 26 08:22:49 compute-1 sudo[56888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:22:49 compute-1 python3.9[56890]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:22:49 compute-1 sudo[56888]: pam_unix(sudo:session): session closed for user root
Jan 26 08:22:50 compute-1 sudo[57040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhxukwrotzlqvqzyhlrmzxwdrekuqujy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415770.3644145-434-107614589080552/AnsiballZ_stat.py'
Jan 26 08:22:50 compute-1 sudo[57040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:22:50 compute-1 python3.9[57042]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:22:50 compute-1 sudo[57040]: pam_unix(sudo:session): session closed for user root
Jan 26 08:22:51 compute-1 sudo[57163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvsdvqbuxcqenomufezzmcsqdhuyfvnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415770.3644145-434-107614589080552/AnsiballZ_copy.py'
Jan 26 08:22:51 compute-1 sudo[57163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:22:51 compute-1 python3.9[57165]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769415770.3644145-434-107614589080552/.source _original_basename=.4_kvka98 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:22:51 compute-1 sudo[57163]: pam_unix(sudo:session): session closed for user root
Jan 26 08:22:52 compute-1 sudo[57315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yefzioobrkuuukqucevwsbopnqrtybsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415771.7733252-464-208815071842275/AnsiballZ_file.py'
Jan 26 08:22:52 compute-1 sudo[57315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:22:52 compute-1 python3.9[57317]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:22:52 compute-1 sudo[57315]: pam_unix(sudo:session): session closed for user root
Jan 26 08:22:52 compute-1 sudo[57467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fefzfsbvcdbgfdmgnhhwugvsulguhuze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415772.5138264-480-260399163339707/AnsiballZ_edpm_os_net_config_mappings.py'
Jan 26 08:22:52 compute-1 sudo[57467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:22:53 compute-1 python3.9[57469]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Jan 26 08:22:53 compute-1 sudo[57467]: pam_unix(sudo:session): session closed for user root
Jan 26 08:22:53 compute-1 sudo[57619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcmaflsyynrbbkkdpexrfdbjhelplpxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415773.4624352-498-81258197774727/AnsiballZ_file.py'
Jan 26 08:22:53 compute-1 sudo[57619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:22:54 compute-1 python3.9[57621]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:22:54 compute-1 sudo[57619]: pam_unix(sudo:session): session closed for user root
Jan 26 08:22:54 compute-1 sudo[57771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swbyzirekjajxmjgtpdojfivijptkwkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415774.5230637-518-235128409023919/AnsiballZ_stat.py'
Jan 26 08:22:54 compute-1 sudo[57771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:22:55 compute-1 sudo[57771]: pam_unix(sudo:session): session closed for user root
Jan 26 08:22:55 compute-1 sudo[57894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhqmzhgqoyxwvksxlneocymkishesztf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415774.5230637-518-235128409023919/AnsiballZ_copy.py'
Jan 26 08:22:55 compute-1 sudo[57894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:22:55 compute-1 sudo[57894]: pam_unix(sudo:session): session closed for user root
Jan 26 08:22:56 compute-1 sudo[58046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpdympjutrmqbrlwglxhizohkpfvamqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415776.0573354-548-129929059600737/AnsiballZ_slurp.py'
Jan 26 08:22:56 compute-1 sudo[58046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:22:56 compute-1 python3.9[58048]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Jan 26 08:22:56 compute-1 sudo[58046]: pam_unix(sudo:session): session closed for user root
Jan 26 08:22:57 compute-1 sudo[58221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmcnrvoeypopixfwrveeleevrvxjbfoc ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415777.0477002-566-18618359559957/async_wrapper.py j296804161410 300 /home/zuul/.ansible/tmp/ansible-tmp-1769415777.0477002-566-18618359559957/AnsiballZ_edpm_os_net_config.py _'
Jan 26 08:22:57 compute-1 sudo[58221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:22:57 compute-1 ansible-async_wrapper.py[58223]: Invoked with j296804161410 300 /home/zuul/.ansible/tmp/ansible-tmp-1769415777.0477002-566-18618359559957/AnsiballZ_edpm_os_net_config.py _
Jan 26 08:22:57 compute-1 ansible-async_wrapper.py[58226]: Starting module and watcher
Jan 26 08:22:57 compute-1 ansible-async_wrapper.py[58226]: Start watching 58227 (300)
Jan 26 08:22:57 compute-1 ansible-async_wrapper.py[58227]: Start module (58227)
Jan 26 08:22:57 compute-1 ansible-async_wrapper.py[58223]: Return async_wrapper task started.
Jan 26 08:22:57 compute-1 sudo[58221]: pam_unix(sudo:session): session closed for user root
Jan 26 08:22:58 compute-1 python3.9[58228]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Jan 26 08:22:58 compute-1 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Jan 26 08:22:58 compute-1 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Jan 26 08:22:58 compute-1 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Jan 26 08:22:58 compute-1 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Jan 26 08:22:58 compute-1 kernel: cfg80211: failed to load regulatory.db
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.4166] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58229 uid=0 result="success"
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.4192] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58229 uid=0 result="success"
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5038] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5040] audit: op="connection-add" uuid="d1323d6b-5492-4b02-838e-135d31768453" name="br-ex-br" pid=58229 uid=0 result="success"
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5066] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5068] audit: op="connection-add" uuid="3f955424-47c1-47a6-86c5-83400255d201" name="br-ex-port" pid=58229 uid=0 result="success"
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5087] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5090] audit: op="connection-add" uuid="9acbf2ed-0636-4478-b6d4-2775f9c2a5e4" name="eth1-port" pid=58229 uid=0 result="success"
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5109] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5113] audit: op="connection-add" uuid="ef41c93a-73e6-4710-bcd3-1359ed30d03a" name="vlan20-port" pid=58229 uid=0 result="success"
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5132] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5135] audit: op="connection-add" uuid="d5646939-5037-469b-b1c2-55139f92909a" name="vlan21-port" pid=58229 uid=0 result="success"
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5154] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5157] audit: op="connection-add" uuid="218c463a-8788-405c-967a-c29a8c41c258" name="vlan22-port" pid=58229 uid=0 result="success"
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5194] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="802-3-ethernet.mtu,ipv4.dhcp-client-id,ipv4.dhcp-timeout,connection.timestamp,connection.autoconnect-priority,ipv6.method,ipv6.addr-gen-mode,ipv6.dhcp-timeout" pid=58229 uid=0 result="success"
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5225] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5228] audit: op="connection-add" uuid="c12d5265-bc8d-43c8-941c-4bc870de8d11" name="br-ex-if" pid=58229 uid=0 result="success"
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5538] audit: op="connection-update" uuid="5afdbf09-b387-55a7-bf0b-677a9446200b" name="ci-private-network" args="ipv4.method,ipv4.never-default,ipv4.addresses,ipv4.dns,ipv4.routes,ipv4.routing-rules,ovs-interface.type,ovs-external-ids.data,connection.slave-type,connection.timestamp,connection.master,connection.port-type,connection.controller,ipv6.method,ipv6.routes,ipv6.addresses,ipv6.addr-gen-mode,ipv6.dns,ipv6.routing-rules" pid=58229 uid=0 result="success"
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5571] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5576] audit: op="connection-add" uuid="13d059e4-fa5e-421b-91fa-f37b655ab9dd" name="vlan20-if" pid=58229 uid=0 result="success"
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5605] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5609] audit: op="connection-add" uuid="aed04422-bbb9-4bf2-ba9c-5878f69b613d" name="vlan21-if" pid=58229 uid=0 result="success"
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5641] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5646] audit: op="connection-add" uuid="ad3767b3-2d96-4be3-897f-2c1e37a38e46" name="vlan22-if" pid=58229 uid=0 result="success"
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5670] audit: op="connection-delete" uuid="b992380d-d503-34bf-95a3-75c144beb661" name="Wired connection 1" pid=58229 uid=0 result="success"
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5697] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <warn>  [1769415780.5701] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Success
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5712] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5718] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (d1323d6b-5492-4b02-838e-135d31768453)
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5719] audit: op="connection-activate" uuid="d1323d6b-5492-4b02-838e-135d31768453" name="br-ex-br" pid=58229 uid=0 result="success"
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5723] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <warn>  [1769415780.5725] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5731] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5736] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (3f955424-47c1-47a6-86c5-83400255d201)
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5741] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <warn>  [1769415780.5743] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5748] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5754] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (9acbf2ed-0636-4478-b6d4-2775f9c2a5e4)
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5757] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <warn>  [1769415780.5758] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5765] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5772] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (ef41c93a-73e6-4710-bcd3-1359ed30d03a)
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5776] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <warn>  [1769415780.5778] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5784] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5790] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (d5646939-5037-469b-b1c2-55139f92909a)
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5797] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <warn>  [1769415780.5799] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5807] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5812] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (218c463a-8788-405c-967a-c29a8c41c258)
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5812] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5816] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5817] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5825] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <warn>  [1769415780.5826] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5829] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5833] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (c12d5265-bc8d-43c8-941c-4bc870de8d11)
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5834] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5838] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5840] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5841] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5843] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5856] device (eth1): disconnecting for new activation request.
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5857] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5860] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5862] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5862] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5866] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <warn>  [1769415780.5867] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5870] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5874] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (13d059e4-fa5e-421b-91fa-f37b655ab9dd)
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5875] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5878] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5879] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5881] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5884] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <warn>  [1769415780.5885] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5888] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5893] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (aed04422-bbb9-4bf2-ba9c-5878f69b613d)
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5894] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5897] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5898] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5900] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5903] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <warn>  [1769415780.5904] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5907] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5911] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (ad3767b3-2d96-4be3-897f-2c1e37a38e46)
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5912] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5914] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5916] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5919] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5920] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5937] audit: op="device-reapply" interface="eth0" ifindex=2 args="802-3-ethernet.mtu,ipv4.dhcp-client-id,ipv4.dhcp-timeout,connection.autoconnect-priority,ipv6.method,ipv6.addr-gen-mode" pid=58229 uid=0 result="success"
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5940] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5943] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5945] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5953] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5957] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5960] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5973] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5977] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 kernel: ovs-system: entered promiscuous mode
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5990] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.5997] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6001] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6003] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6010] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6015] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6019] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 kernel: Timeout policy base is empty
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6022] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 systemd-udevd[58232]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6028] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6033] dhcp4 (eth0): canceled DHCP transaction
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6033] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6033] dhcp4 (eth0): state changed no lease
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6035] dhcp4 (eth0): activation: beginning transaction (no timeout)
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6048] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6053] audit: op="device-reapply" interface="eth1" ifindex=3 pid=58229 uid=0 result="fail" reason="Device is not activated"
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6062] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6072] device (eth1): disconnecting for new activation request.
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6073] audit: op="connection-activate" uuid="5afdbf09-b387-55a7-bf0b-677a9446200b" name="ci-private-network" pid=58229 uid=0 result="success"
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6075] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6117] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6123] dhcp4 (eth0): state changed new lease, address=38.102.83.74
Jan 26 08:23:00 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6178] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58229 uid=0 result="success"
Jan 26 08:23:00 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6237] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6341] device (eth1): Activation: starting connection 'ci-private-network' (5afdbf09-b387-55a7-bf0b-677a9446200b)
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6348] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6358] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6362] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6370] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6375] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6380] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6382] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6383] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6385] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6386] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6408] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6417] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6422] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6426] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6431] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6435] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6439] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6444] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6450] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6456] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6461] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6467] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6475] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6517] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6519] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 kernel: br-ex: entered promiscuous mode
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6528] device (eth1): Activation: successful, device activated.
Jan 26 08:23:00 compute-1 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6739] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6757] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 kernel: vlan22: entered promiscuous mode
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6842] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6844] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.6849] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 26 08:23:00 compute-1 kernel: vlan21: entered promiscuous mode
Jan 26 08:23:00 compute-1 systemd-udevd[58234]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.7023] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Jan 26 08:23:00 compute-1 kernel: vlan20: entered promiscuous mode
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.7042] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.7085] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.7088] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.7094] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.7154] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.7157] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.7183] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.7194] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.7211] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.7213] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.7220] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.7233] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.7235] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 08:23:00 compute-1 NetworkManager[55451]: <info>  [1769415780.7242] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 26 08:23:01 compute-1 sudo[58559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjigjkgalhywctddxlowulejzimhrowt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415781.1207879-566-120985489787458/AnsiballZ_async_status.py'
Jan 26 08:23:01 compute-1 sudo[58559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:23:01 compute-1 python3.9[58561]: ansible-ansible.legacy.async_status Invoked with jid=j296804161410.58223 mode=status _async_dir=/root/.ansible_async
Jan 26 08:23:01 compute-1 sudo[58559]: pam_unix(sudo:session): session closed for user root
Jan 26 08:23:01 compute-1 NetworkManager[55451]: <info>  [1769415781.8563] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58229 uid=0 result="success"
Jan 26 08:23:02 compute-1 NetworkManager[55451]: <info>  [1769415782.0696] checkpoint[0x55e9c5ec0950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Jan 26 08:23:02 compute-1 NetworkManager[55451]: <info>  [1769415782.0698] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58229 uid=0 result="success"
Jan 26 08:23:02 compute-1 NetworkManager[55451]: <info>  [1769415782.5841] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58229 uid=0 result="success"
Jan 26 08:23:02 compute-1 NetworkManager[55451]: <info>  [1769415782.5864] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58229 uid=0 result="success"
Jan 26 08:23:02 compute-1 ansible-async_wrapper.py[58226]: 58227 still running (300)
Jan 26 08:23:02 compute-1 NetworkManager[55451]: <info>  [1769415782.9724] audit: op="networking-control" arg="global-dns-configuration" pid=58229 uid=0 result="success"
Jan 26 08:23:02 compute-1 NetworkManager[55451]: <info>  [1769415782.9766] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Jan 26 08:23:02 compute-1 NetworkManager[55451]: <info>  [1769415782.9802] audit: op="networking-control" arg="global-dns-configuration" pid=58229 uid=0 result="success"
Jan 26 08:23:02 compute-1 NetworkManager[55451]: <info>  [1769415782.9846] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58229 uid=0 result="success"
Jan 26 08:23:03 compute-1 NetworkManager[55451]: <info>  [1769415783.2225] checkpoint[0x55e9c5ec0a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Jan 26 08:23:03 compute-1 NetworkManager[55451]: <info>  [1769415783.2230] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58229 uid=0 result="success"
Jan 26 08:23:03 compute-1 ansible-async_wrapper.py[58227]: Module complete (58227)
Jan 26 08:23:05 compute-1 sudo[58665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwotapjteyimjnokjmsrpklyacovsffb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415781.1207879-566-120985489787458/AnsiballZ_async_status.py'
Jan 26 08:23:05 compute-1 sudo[58665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:23:05 compute-1 python3.9[58667]: ansible-ansible.legacy.async_status Invoked with jid=j296804161410.58223 mode=status _async_dir=/root/.ansible_async
Jan 26 08:23:05 compute-1 sudo[58665]: pam_unix(sudo:session): session closed for user root
Jan 26 08:23:05 compute-1 sudo[58765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kccfhebrsuxoxozsykatiyvpjagkgonl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415781.1207879-566-120985489787458/AnsiballZ_async_status.py'
Jan 26 08:23:05 compute-1 sudo[58765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:23:05 compute-1 python3.9[58767]: ansible-ansible.legacy.async_status Invoked with jid=j296804161410.58223 mode=cleanup _async_dir=/root/.ansible_async
Jan 26 08:23:05 compute-1 sudo[58765]: pam_unix(sudo:session): session closed for user root
Jan 26 08:23:06 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 26 08:23:06 compute-1 sudo[58919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frldxiusbcsjyswbtbcttcrvwklqlgqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415786.220274-620-219049659716795/AnsiballZ_stat.py'
Jan 26 08:23:06 compute-1 sudo[58919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:23:06 compute-1 python3.9[58921]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:23:06 compute-1 sudo[58919]: pam_unix(sudo:session): session closed for user root
Jan 26 08:23:07 compute-1 sudo[59042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oeuyaaiiyeekbqzazozrwghotptyibcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415786.220274-620-219049659716795/AnsiballZ_copy.py'
Jan 26 08:23:07 compute-1 sudo[59042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:23:07 compute-1 python3.9[59044]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769415786.220274-620-219049659716795/.source.returncode _original_basename=.guew2uch follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:23:07 compute-1 sudo[59042]: pam_unix(sudo:session): session closed for user root
Jan 26 08:23:07 compute-1 ansible-async_wrapper.py[58226]: Done in kid B.
Jan 26 08:23:08 compute-1 sudo[59194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttrzoedzahlmjmayldnksuncladclzwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415787.966176-652-250759803553373/AnsiballZ_stat.py'
Jan 26 08:23:08 compute-1 sudo[59194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:23:08 compute-1 python3.9[59196]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:23:08 compute-1 sudo[59194]: pam_unix(sudo:session): session closed for user root
Jan 26 08:23:09 compute-1 sudo[59318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkrrvplhbvndshqwybeevwulhbgdhhay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415787.966176-652-250759803553373/AnsiballZ_copy.py'
Jan 26 08:23:09 compute-1 sudo[59318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:23:09 compute-1 python3.9[59320]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769415787.966176-652-250759803553373/.source.cfg _original_basename=.eh0p88t0 follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:23:09 compute-1 sudo[59318]: pam_unix(sudo:session): session closed for user root
Jan 26 08:23:09 compute-1 sudo[59470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yuqiuxnnvkqhyqcdzcgrvrdczgtcowfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415789.467608-682-205761863427247/AnsiballZ_systemd.py'
Jan 26 08:23:09 compute-1 sudo[59470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:23:10 compute-1 python3.9[59472]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 08:23:10 compute-1 systemd[1]: Reloading Network Manager...
Jan 26 08:23:10 compute-1 NetworkManager[55451]: <info>  [1769415790.2892] audit: op="reload" arg="0" pid=59476 uid=0 result="success"
Jan 26 08:23:10 compute-1 NetworkManager[55451]: <info>  [1769415790.2904] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Jan 26 08:23:10 compute-1 systemd[1]: Reloaded Network Manager.
Jan 26 08:23:10 compute-1 sudo[59470]: pam_unix(sudo:session): session closed for user root
Jan 26 08:23:10 compute-1 sshd-session[51461]: Connection closed by 192.168.122.30 port 50448
Jan 26 08:23:10 compute-1 sshd-session[51458]: pam_unix(sshd:session): session closed for user zuul
Jan 26 08:23:10 compute-1 systemd[1]: session-13.scope: Deactivated successfully.
Jan 26 08:23:10 compute-1 systemd[1]: session-13.scope: Consumed 56.853s CPU time.
Jan 26 08:23:10 compute-1 systemd-logind[788]: Session 13 logged out. Waiting for processes to exit.
Jan 26 08:23:10 compute-1 systemd-logind[788]: Removed session 13.
Jan 26 08:23:16 compute-1 sshd-session[59507]: Accepted publickey for zuul from 192.168.122.30 port 48824 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 08:23:16 compute-1 systemd-logind[788]: New session 14 of user zuul.
Jan 26 08:23:16 compute-1 systemd[1]: Started Session 14 of User zuul.
Jan 26 08:23:16 compute-1 sshd-session[59507]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 08:23:17 compute-1 python3.9[59660]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 08:23:18 compute-1 python3.9[59815]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 08:23:20 compute-1 python3.9[60004]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:23:20 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 08:23:20 compute-1 sshd-session[59510]: Connection closed by 192.168.122.30 port 48824
Jan 26 08:23:20 compute-1 sshd-session[59507]: pam_unix(sshd:session): session closed for user zuul
Jan 26 08:23:20 compute-1 systemd[1]: session-14.scope: Deactivated successfully.
Jan 26 08:23:20 compute-1 systemd[1]: session-14.scope: Consumed 2.816s CPU time.
Jan 26 08:23:20 compute-1 systemd-logind[788]: Session 14 logged out. Waiting for processes to exit.
Jan 26 08:23:20 compute-1 systemd-logind[788]: Removed session 14.
Jan 26 08:23:26 compute-1 sshd-session[60034]: Accepted publickey for zuul from 192.168.122.30 port 51274 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 08:23:26 compute-1 systemd-logind[788]: New session 15 of user zuul.
Jan 26 08:23:26 compute-1 systemd[1]: Started Session 15 of User zuul.
Jan 26 08:23:26 compute-1 sshd-session[60034]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 08:23:27 compute-1 python3.9[60187]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 08:23:28 compute-1 python3.9[60342]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 08:23:29 compute-1 sudo[60496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzpmdxuhhdooyvfsfkgitvdetlsrdwnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415809.3016994-56-118395391336310/AnsiballZ_setup.py'
Jan 26 08:23:29 compute-1 sudo[60496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:23:29 compute-1 python3.9[60498]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 08:23:30 compute-1 sudo[60496]: pam_unix(sudo:session): session closed for user root
Jan 26 08:23:30 compute-1 sudo[60580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgxxgpaxjbqimmeiacuwbvtdqmrwajjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415809.3016994-56-118395391336310/AnsiballZ_dnf.py'
Jan 26 08:23:30 compute-1 sudo[60580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:23:30 compute-1 python3.9[60582]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 08:23:32 compute-1 sudo[60580]: pam_unix(sudo:session): session closed for user root
Jan 26 08:23:32 compute-1 sudo[60734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjtaptqjcqazirntxqkcneeiyowxylzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415812.3323424-80-141070515551071/AnsiballZ_setup.py'
Jan 26 08:23:32 compute-1 sudo[60734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:23:33 compute-1 python3.9[60736]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 08:23:33 compute-1 sudo[60734]: pam_unix(sudo:session): session closed for user root
Jan 26 08:23:34 compute-1 sudo[60925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztlgrtbzylyspmakrhuhrmcxmtrmthmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415813.6739287-102-128806686205976/AnsiballZ_file.py'
Jan 26 08:23:34 compute-1 sudo[60925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:23:34 compute-1 python3.9[60927]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:23:34 compute-1 sudo[60925]: pam_unix(sudo:session): session closed for user root
Jan 26 08:23:35 compute-1 sudo[61077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csuliuvqzkycmcdoykgilqusbqviuvoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415814.6602142-118-76594529860158/AnsiballZ_command.py'
Jan 26 08:23:35 compute-1 sudo[61077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:23:35 compute-1 python3.9[61079]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:23:35 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 08:23:35 compute-1 sudo[61077]: pam_unix(sudo:session): session closed for user root
Jan 26 08:23:36 compute-1 sudo[61240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rytxsfpghkpzcabmybtjwnuawezswdjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415815.6658573-134-40949963529137/AnsiballZ_stat.py'
Jan 26 08:23:36 compute-1 sudo[61240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:23:36 compute-1 python3.9[61242]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:23:36 compute-1 sudo[61240]: pam_unix(sudo:session): session closed for user root
Jan 26 08:23:36 compute-1 sudo[61318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iajzcxjpvwuarzglbcobvyszwidugbvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415815.6658573-134-40949963529137/AnsiballZ_file.py'
Jan 26 08:23:36 compute-1 sudo[61318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:23:36 compute-1 python3.9[61320]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:23:36 compute-1 sudo[61318]: pam_unix(sudo:session): session closed for user root
Jan 26 08:23:37 compute-1 sudo[61470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjzbbtvubcqdmtudrxdaztyiaunfoyph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415817.094078-158-36277589534277/AnsiballZ_stat.py'
Jan 26 08:23:37 compute-1 sudo[61470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:23:37 compute-1 python3.9[61472]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:23:37 compute-1 sudo[61470]: pam_unix(sudo:session): session closed for user root
Jan 26 08:23:38 compute-1 sudo[61548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxblniwyjlrznqptygqwgezezpvxpuab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415817.094078-158-36277589534277/AnsiballZ_file.py'
Jan 26 08:23:38 compute-1 sudo[61548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:23:38 compute-1 python3.9[61550]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:23:38 compute-1 sudo[61548]: pam_unix(sudo:session): session closed for user root
Jan 26 08:23:39 compute-1 sudo[61700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfpwqzooprkrdcuhnripcldpehjlsjao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415818.5309412-184-247042440024168/AnsiballZ_ini_file.py'
Jan 26 08:23:39 compute-1 sudo[61700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:23:39 compute-1 python3.9[61702]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:23:39 compute-1 sudo[61700]: pam_unix(sudo:session): session closed for user root
Jan 26 08:23:39 compute-1 sudo[61852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfueplgdmszsyenvqzxoiyurovhttclv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415819.483329-184-204684469301702/AnsiballZ_ini_file.py'
Jan 26 08:23:39 compute-1 sudo[61852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:23:40 compute-1 python3.9[61854]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:23:40 compute-1 sudo[61852]: pam_unix(sudo:session): session closed for user root
Jan 26 08:23:40 compute-1 sudo[62004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvkgnfleejbzubossqzmjtqgelwdlmgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415820.2138302-184-152199341639280/AnsiballZ_ini_file.py'
Jan 26 08:23:40 compute-1 sudo[62004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:23:40 compute-1 python3.9[62006]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:23:40 compute-1 sudo[62004]: pam_unix(sudo:session): session closed for user root
Jan 26 08:23:41 compute-1 sudo[62156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qarcjwygrabycagoxzacxfahoirpiivq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415820.903655-184-8169258748912/AnsiballZ_ini_file.py'
Jan 26 08:23:41 compute-1 sudo[62156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:23:41 compute-1 python3.9[62158]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:23:41 compute-1 sudo[62156]: pam_unix(sudo:session): session closed for user root
Jan 26 08:23:42 compute-1 sudo[62308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmxqoslrzfoxbyoedfmoptgvuwgkguzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415821.7672696-246-177394094471787/AnsiballZ_dnf.py'
Jan 26 08:23:42 compute-1 sudo[62308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:23:42 compute-1 python3.9[62310]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 08:23:43 compute-1 sudo[62308]: pam_unix(sudo:session): session closed for user root
Jan 26 08:23:45 compute-1 sudo[62461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccateqijebisvohcmapybqlsiwapbrfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415824.6800425-268-152428721984262/AnsiballZ_setup.py'
Jan 26 08:23:45 compute-1 sudo[62461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:23:45 compute-1 python3.9[62463]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 08:23:45 compute-1 sudo[62461]: pam_unix(sudo:session): session closed for user root
Jan 26 08:23:46 compute-1 sudo[62615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zukhezlyysboghnalfxabcuchdtwjjik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415825.7812643-284-17039350364623/AnsiballZ_stat.py'
Jan 26 08:23:46 compute-1 sudo[62615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:23:46 compute-1 python3.9[62617]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:23:46 compute-1 sudo[62615]: pam_unix(sudo:session): session closed for user root
Jan 26 08:23:47 compute-1 sudo[62767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efeammqqmghrnhcnznydrpzqkicxvjxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415826.6786191-302-132605186477760/AnsiballZ_stat.py'
Jan 26 08:23:47 compute-1 sudo[62767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:23:47 compute-1 python3.9[62769]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:23:47 compute-1 sudo[62767]: pam_unix(sudo:session): session closed for user root
Jan 26 08:23:47 compute-1 sudo[62919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utlivpnccqfduqyrdyatcfwyctplqwqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415827.6166515-322-238454153602452/AnsiballZ_command.py'
Jan 26 08:23:47 compute-1 sudo[62919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:23:48 compute-1 python3.9[62921]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:23:48 compute-1 sudo[62919]: pam_unix(sudo:session): session closed for user root
Jan 26 08:23:49 compute-1 sudo[63072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sregevpijnnhqlieldmhcwogitpikvnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415828.5552945-342-96498618746561/AnsiballZ_service_facts.py'
Jan 26 08:23:49 compute-1 sudo[63072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:23:49 compute-1 python3.9[63074]: ansible-service_facts Invoked
Jan 26 08:23:50 compute-1 network[63091]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 08:23:50 compute-1 network[63092]: 'network-scripts' will be removed from distribution in near future.
Jan 26 08:23:50 compute-1 network[63093]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 08:23:53 compute-1 sudo[63072]: pam_unix(sudo:session): session closed for user root
Jan 26 08:23:55 compute-1 sudo[63376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agtppxphcmjzztxfiyaqjmdwprkuzruh ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1769415834.705881-372-231371798544790/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1769415834.705881-372-231371798544790/args'
Jan 26 08:23:55 compute-1 sudo[63376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:23:55 compute-1 sudo[63376]: pam_unix(sudo:session): session closed for user root
Jan 26 08:23:56 compute-1 sudo[63543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dskpkiitfekdmaqegcwxapojvtpiyauf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415835.758916-394-201440195234129/AnsiballZ_dnf.py'
Jan 26 08:23:56 compute-1 sudo[63543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:23:56 compute-1 python3.9[63545]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 08:23:57 compute-1 sudo[63543]: pam_unix(sudo:session): session closed for user root
Jan 26 08:23:58 compute-1 sudo[63696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjhuhkkkzedwejyowcdkyrxspbuizlks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415838.0079813-420-59181638241538/AnsiballZ_package_facts.py'
Jan 26 08:23:58 compute-1 sudo[63696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:23:58 compute-1 python3.9[63698]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 26 08:23:59 compute-1 sudo[63696]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:00 compute-1 sudo[63848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxyagmqoobmabkdjhsswijlakvqdhbxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415839.7597718-440-139935396171360/AnsiballZ_stat.py'
Jan 26 08:24:00 compute-1 sudo[63848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:00 compute-1 python3.9[63850]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:24:00 compute-1 sudo[63848]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:01 compute-1 sudo[63973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahugdmvxuzulnuvwstmubfavyfslpxem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415839.7597718-440-139935396171360/AnsiballZ_copy.py'
Jan 26 08:24:01 compute-1 sudo[63973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:01 compute-1 python3.9[63975]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769415839.7597718-440-139935396171360/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:24:01 compute-1 sudo[63973]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:01 compute-1 sudo[64127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmzoczgvynputwfyewbwajsuhtspavfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415841.504246-470-143634265302710/AnsiballZ_stat.py'
Jan 26 08:24:01 compute-1 sudo[64127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:02 compute-1 python3.9[64129]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:24:02 compute-1 sudo[64127]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:02 compute-1 sudo[64252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmcdbsvwyqtzqjnnhdsdxijfuicykvsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415841.504246-470-143634265302710/AnsiballZ_copy.py'
Jan 26 08:24:02 compute-1 sudo[64252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:02 compute-1 python3.9[64254]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769415841.504246-470-143634265302710/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:24:02 compute-1 sudo[64252]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:03 compute-1 sudo[64406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsqoooteshrtyqtmoxjdogoaxohgyech ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415843.409962-512-236770826884545/AnsiballZ_lineinfile.py'
Jan 26 08:24:03 compute-1 sudo[64406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:04 compute-1 python3.9[64408]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:24:04 compute-1 sudo[64406]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:05 compute-1 sudo[64560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icbxbgzfwxvalicjjpxdnpwdfibiruoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415844.8586807-542-203270225209389/AnsiballZ_setup.py'
Jan 26 08:24:05 compute-1 sudo[64560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:05 compute-1 python3.9[64562]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 08:24:05 compute-1 sudo[64560]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:06 compute-1 sudo[64644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srawheoiwcuafqfilkqiznjmvnfykfjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415844.8586807-542-203270225209389/AnsiballZ_systemd.py'
Jan 26 08:24:06 compute-1 sudo[64644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:06 compute-1 python3.9[64646]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:24:06 compute-1 sudo[64644]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:07 compute-1 sudo[64798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exqtswwqbshkndjporzqwmoejxirarku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415847.321271-574-80664089861500/AnsiballZ_setup.py'
Jan 26 08:24:07 compute-1 sudo[64798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:08 compute-1 python3.9[64800]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 08:24:08 compute-1 sudo[64798]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:08 compute-1 sudo[64882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpthfhpevvwgaentquvicurxixtnmnjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415847.321271-574-80664089861500/AnsiballZ_systemd.py'
Jan 26 08:24:08 compute-1 sudo[64882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:08 compute-1 python3.9[64884]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 08:24:08 compute-1 chronyd[790]: chronyd exiting
Jan 26 08:24:08 compute-1 systemd[1]: Stopping NTP client/server...
Jan 26 08:24:08 compute-1 systemd[1]: chronyd.service: Deactivated successfully.
Jan 26 08:24:08 compute-1 systemd[1]: Stopped NTP client/server.
Jan 26 08:24:08 compute-1 systemd[1]: Starting NTP client/server...
Jan 26 08:24:09 compute-1 chronyd[64892]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 26 08:24:09 compute-1 chronyd[64892]: Frequency -27.284 +/- 18.389 ppm read from /var/lib/chrony/drift
Jan 26 08:24:09 compute-1 chronyd[64892]: Loaded seccomp filter (level 2)
Jan 26 08:24:09 compute-1 systemd[1]: Started NTP client/server.
Jan 26 08:24:09 compute-1 sudo[64882]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:10 compute-1 sshd-session[60037]: Connection closed by 192.168.122.30 port 51274
Jan 26 08:24:10 compute-1 sshd-session[60034]: pam_unix(sshd:session): session closed for user zuul
Jan 26 08:24:10 compute-1 systemd-logind[788]: Session 15 logged out. Waiting for processes to exit.
Jan 26 08:24:10 compute-1 systemd[1]: session-15.scope: Deactivated successfully.
Jan 26 08:24:10 compute-1 systemd[1]: session-15.scope: Consumed 30.860s CPU time.
Jan 26 08:24:10 compute-1 systemd-logind[788]: Removed session 15.
Jan 26 08:24:16 compute-1 sshd-session[64918]: Accepted publickey for zuul from 192.168.122.30 port 57836 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 08:24:16 compute-1 systemd-logind[788]: New session 16 of user zuul.
Jan 26 08:24:16 compute-1 systemd[1]: Started Session 16 of User zuul.
Jan 26 08:24:16 compute-1 sshd-session[64918]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 08:24:17 compute-1 python3.9[65071]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 08:24:18 compute-1 sudo[65225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mahvrgohivyqgsjjfzjhlijtftupoyrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415857.705421-42-248208673096520/AnsiballZ_file.py'
Jan 26 08:24:18 compute-1 sudo[65225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:18 compute-1 python3.9[65227]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:24:18 compute-1 sudo[65225]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:19 compute-1 sudo[65400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhfjhbrqtvmhjeidijslagyuzptpwczq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415858.630023-58-1072500854984/AnsiballZ_stat.py'
Jan 26 08:24:19 compute-1 sudo[65400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:19 compute-1 python3.9[65402]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:24:19 compute-1 sudo[65400]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:19 compute-1 sudo[65478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nenqfpuaqxwuhxoifybnvygqcqrnohie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415858.630023-58-1072500854984/AnsiballZ_file.py'
Jan 26 08:24:19 compute-1 sudo[65478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:19 compute-1 python3.9[65480]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.7ef_f1jj recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:24:19 compute-1 sudo[65478]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:20 compute-1 sudo[65630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlwzaujsorzjsqpwhhgstmylirmmqcml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415860.3830261-98-35889339468824/AnsiballZ_stat.py'
Jan 26 08:24:20 compute-1 sudo[65630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:20 compute-1 python3.9[65632]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:24:20 compute-1 sudo[65630]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:21 compute-1 sudo[65753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-joqemhzgnsqvsblypfkezhmqccmtimgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415860.3830261-98-35889339468824/AnsiballZ_copy.py'
Jan 26 08:24:21 compute-1 sudo[65753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:21 compute-1 python3.9[65755]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769415860.3830261-98-35889339468824/.source _original_basename=.h3icatww follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:24:21 compute-1 sudo[65753]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:22 compute-1 sudo[65905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnovarhtndorzirtnjfvedfxiddwvoma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415861.9076338-130-213143420629559/AnsiballZ_file.py'
Jan 26 08:24:22 compute-1 sudo[65905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:22 compute-1 python3.9[65907]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:24:22 compute-1 sudo[65905]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:22 compute-1 sudo[66057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qufgrcggylahcbcwaebrinbvnxnahafq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415862.6663244-146-193570744632829/AnsiballZ_stat.py'
Jan 26 08:24:22 compute-1 sudo[66057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:23 compute-1 python3.9[66059]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:24:23 compute-1 sudo[66057]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:23 compute-1 sudo[66180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gapsbdytzrebhyiehvzmuolvjbhioypo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415862.6663244-146-193570744632829/AnsiballZ_copy.py'
Jan 26 08:24:23 compute-1 sudo[66180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:23 compute-1 python3.9[66182]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769415862.6663244-146-193570744632829/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:24:23 compute-1 sudo[66180]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:24 compute-1 sudo[66332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcjdmhfmqjpystamkckvvyghqqkcnxwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415864.012771-146-182576689872240/AnsiballZ_stat.py'
Jan 26 08:24:24 compute-1 sudo[66332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:24 compute-1 python3.9[66334]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:24:24 compute-1 sudo[66332]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:25 compute-1 sudo[66455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsfaebeabqxmmvtvzopugvdgvchrrlnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415864.012771-146-182576689872240/AnsiballZ_copy.py'
Jan 26 08:24:25 compute-1 sudo[66455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:25 compute-1 python3.9[66457]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769415864.012771-146-182576689872240/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:24:25 compute-1 sudo[66455]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:25 compute-1 sudo[66607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qanxfztpsovbkumqgqujqitsfabhjixs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415865.565199-204-221260584613603/AnsiballZ_file.py'
Jan 26 08:24:25 compute-1 sudo[66607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:26 compute-1 python3.9[66609]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:24:26 compute-1 sudo[66607]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:26 compute-1 sudo[66759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axusnrhfcydqucsuzftvqkdsmnxpwwcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415866.3656526-220-102507672189724/AnsiballZ_stat.py'
Jan 26 08:24:26 compute-1 sudo[66759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:26 compute-1 python3.9[66761]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:24:26 compute-1 sudo[66759]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:27 compute-1 sudo[66882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxubdoilsdxpsixqtmuicqokpcviksec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415866.3656526-220-102507672189724/AnsiballZ_copy.py'
Jan 26 08:24:27 compute-1 sudo[66882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:27 compute-1 python3.9[66884]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769415866.3656526-220-102507672189724/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:24:27 compute-1 sudo[66882]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:28 compute-1 sudo[67034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfwfoufilmwdmxehnbyqiqizctlhpplm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415867.8482227-250-196331475481562/AnsiballZ_stat.py'
Jan 26 08:24:28 compute-1 sudo[67034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:28 compute-1 python3.9[67036]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:24:28 compute-1 sudo[67034]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:28 compute-1 sudo[67157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eafpanusrpxnvnlfzdfgrqpbjippbnpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415867.8482227-250-196331475481562/AnsiballZ_copy.py'
Jan 26 08:24:28 compute-1 sudo[67157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:29 compute-1 python3.9[67159]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769415867.8482227-250-196331475481562/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:24:29 compute-1 sudo[67157]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:29 compute-1 sudo[67309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhrfyyojjnwswljrazggpbfdvtaqdxcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415869.3177853-280-251100556637078/AnsiballZ_systemd.py'
Jan 26 08:24:29 compute-1 sudo[67309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:30 compute-1 python3.9[67311]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:24:30 compute-1 systemd[1]: Reloading.
Jan 26 08:24:30 compute-1 systemd-rc-local-generator[67338]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:24:30 compute-1 systemd-sysv-generator[67342]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:24:30 compute-1 systemd[1]: Reloading.
Jan 26 08:24:30 compute-1 systemd-rc-local-generator[67376]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:24:30 compute-1 systemd-sysv-generator[67381]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:24:30 compute-1 systemd[1]: Starting EDPM Container Shutdown...
Jan 26 08:24:30 compute-1 systemd[1]: Finished EDPM Container Shutdown.
Jan 26 08:24:30 compute-1 sudo[67309]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:31 compute-1 sudo[67537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifvhwvzgnxpjuorpdddboarttcmfirmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415870.9612203-296-180874811905310/AnsiballZ_stat.py'
Jan 26 08:24:31 compute-1 sudo[67537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:31 compute-1 python3.9[67539]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:24:31 compute-1 sudo[67537]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:31 compute-1 sudo[67660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iiqbvmellidzpbmvhqoctoddrenhcorj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415870.9612203-296-180874811905310/AnsiballZ_copy.py'
Jan 26 08:24:31 compute-1 sudo[67660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:32 compute-1 python3.9[67662]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769415870.9612203-296-180874811905310/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:24:32 compute-1 sudo[67660]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:32 compute-1 sudo[67812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvmknptsdwukgvjokmizdajrvixpxobs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415872.3716443-326-234791803509721/AnsiballZ_stat.py'
Jan 26 08:24:32 compute-1 sudo[67812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:32 compute-1 python3.9[67814]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:24:32 compute-1 sudo[67812]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:33 compute-1 sudo[67935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipsfvqzsnodrbygyzzasgcdsuqyxkstt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415872.3716443-326-234791803509721/AnsiballZ_copy.py'
Jan 26 08:24:33 compute-1 sudo[67935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:33 compute-1 python3.9[67937]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769415872.3716443-326-234791803509721/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:24:33 compute-1 sudo[67935]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:34 compute-1 sudo[68087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjtowplxlmfovlwkxspcakroadhohxhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415873.832458-356-257459001525755/AnsiballZ_systemd.py'
Jan 26 08:24:34 compute-1 sudo[68087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:34 compute-1 python3.9[68089]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:24:34 compute-1 systemd[1]: Reloading.
Jan 26 08:24:34 compute-1 systemd-rc-local-generator[68118]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:24:34 compute-1 systemd-sysv-generator[68122]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:24:34 compute-1 systemd[1]: Reloading.
Jan 26 08:24:34 compute-1 systemd-rc-local-generator[68155]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:24:34 compute-1 systemd-sysv-generator[68159]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:24:35 compute-1 systemd[1]: Starting Create netns directory...
Jan 26 08:24:35 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 26 08:24:35 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 26 08:24:35 compute-1 systemd[1]: Finished Create netns directory.
Jan 26 08:24:35 compute-1 sudo[68087]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:36 compute-1 python3.9[68318]: ansible-ansible.builtin.service_facts Invoked
Jan 26 08:24:36 compute-1 network[68335]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 08:24:36 compute-1 network[68336]: 'network-scripts' will be removed from distribution in near future.
Jan 26 08:24:36 compute-1 network[68337]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 08:24:42 compute-1 sudo[68597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmhtypitmbulurjagulnoufyyeakphma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415881.944589-388-68126099051850/AnsiballZ_systemd.py'
Jan 26 08:24:42 compute-1 sudo[68597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:42 compute-1 python3.9[68599]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:24:42 compute-1 systemd[1]: Reloading.
Jan 26 08:24:42 compute-1 systemd-rc-local-generator[68630]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:24:42 compute-1 systemd-sysv-generator[68633]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:24:42 compute-1 systemd[1]: Stopping IPv4 firewall with iptables...
Jan 26 08:24:43 compute-1 iptables.init[68640]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Jan 26 08:24:43 compute-1 iptables.init[68640]: iptables: Flushing firewall rules: [  OK  ]
Jan 26 08:24:43 compute-1 systemd[1]: iptables.service: Deactivated successfully.
Jan 26 08:24:43 compute-1 systemd[1]: Stopped IPv4 firewall with iptables.
Jan 26 08:24:43 compute-1 sudo[68597]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:43 compute-1 sudo[68835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgqivxrbqeirxznpqcjrvzhbglnsfxqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415883.5542357-388-224953839326702/AnsiballZ_systemd.py'
Jan 26 08:24:43 compute-1 sudo[68835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:44 compute-1 python3.9[68837]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:24:44 compute-1 sudo[68835]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:45 compute-1 sudo[68989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrcysdxqjmyjifslkvdsgepbvwkptsru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415885.053118-420-85496041307925/AnsiballZ_systemd.py'
Jan 26 08:24:45 compute-1 sudo[68989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:45 compute-1 python3.9[68991]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:24:45 compute-1 systemd[1]: Reloading.
Jan 26 08:24:45 compute-1 systemd-rc-local-generator[69022]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:24:45 compute-1 systemd-sysv-generator[69025]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:24:46 compute-1 systemd[1]: Starting Netfilter Tables...
Jan 26 08:24:46 compute-1 systemd[1]: Finished Netfilter Tables.
Jan 26 08:24:46 compute-1 sudo[68989]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:46 compute-1 sudo[69182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcgfdyalrwhudyyoakkspimbemtkuuis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415886.4283037-436-65722626611831/AnsiballZ_command.py'
Jan 26 08:24:46 compute-1 sudo[69182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:47 compute-1 python3.9[69184]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:24:47 compute-1 sudo[69182]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:48 compute-1 sudo[69335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzbklyvrxzfdjqwuqiicxgkkeplzajnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415887.596056-464-173625200334613/AnsiballZ_stat.py'
Jan 26 08:24:48 compute-1 sudo[69335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:48 compute-1 python3.9[69337]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:24:48 compute-1 sudo[69335]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:48 compute-1 sudo[69460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frybimvjwdaefktgxzcppfqiblencbgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415887.596056-464-173625200334613/AnsiballZ_copy.py'
Jan 26 08:24:48 compute-1 sudo[69460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:48 compute-1 python3.9[69462]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769415887.596056-464-173625200334613/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:24:49 compute-1 sudo[69460]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:49 compute-1 sudo[69613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wykmbnqgfkcovusqxcjvlpmgkvdwvjhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415889.2133486-494-242566823857101/AnsiballZ_systemd.py'
Jan 26 08:24:49 compute-1 sudo[69613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:49 compute-1 python3.9[69615]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 08:24:49 compute-1 systemd[1]: Reloading OpenSSH server daemon...
Jan 26 08:24:49 compute-1 sshd[1007]: Received SIGHUP; restarting.
Jan 26 08:24:50 compute-1 sshd[1007]: Server listening on 0.0.0.0 port 22.
Jan 26 08:24:50 compute-1 sshd[1007]: Server listening on :: port 22.
Jan 26 08:24:50 compute-1 systemd[1]: Reloaded OpenSSH server daemon.
Jan 26 08:24:50 compute-1 sudo[69613]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:50 compute-1 sudo[69770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjjslapnfaiphfxactpsvjykxcfypikf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415890.2619703-510-231558559733904/AnsiballZ_file.py'
Jan 26 08:24:50 compute-1 sudo[69770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:50 compute-1 python3.9[69772]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:24:50 compute-1 sudo[69770]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:51 compute-1 sudo[69922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxpzgzachtfajepusizcboebjwlnmpvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415891.0666046-526-167602313764737/AnsiballZ_stat.py'
Jan 26 08:24:51 compute-1 sudo[69922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:51 compute-1 python3.9[69924]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:24:51 compute-1 sudo[69922]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:52 compute-1 sudo[70045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkreiyzkwkhbrmdliaejpeucryurkddf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415891.0666046-526-167602313764737/AnsiballZ_copy.py'
Jan 26 08:24:52 compute-1 sudo[70045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:52 compute-1 python3.9[70047]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769415891.0666046-526-167602313764737/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:24:52 compute-1 sudo[70045]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:53 compute-1 sudo[70197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwekxdvxnkyewofglngnjbsddodqcmnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415892.7716148-562-49384840914030/AnsiballZ_timezone.py'
Jan 26 08:24:53 compute-1 sudo[70197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:53 compute-1 python3.9[70199]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 26 08:24:53 compute-1 systemd[1]: Starting Time & Date Service...
Jan 26 08:24:53 compute-1 systemd[1]: Started Time & Date Service.
Jan 26 08:24:53 compute-1 sudo[70197]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:54 compute-1 sudo[70353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iebfqjnouquzmjloexoengozfsrjtujk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415893.9786978-580-8746578127363/AnsiballZ_file.py'
Jan 26 08:24:54 compute-1 sudo[70353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:54 compute-1 python3.9[70355]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:24:54 compute-1 sudo[70353]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:55 compute-1 sudo[70505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skydbbvemmllhduwrlfjpjhcpfhbqerj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415894.7769327-596-240123660918543/AnsiballZ_stat.py'
Jan 26 08:24:55 compute-1 sudo[70505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:55 compute-1 python3.9[70507]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:24:55 compute-1 sudo[70505]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:55 compute-1 sudo[70628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arbgovxxzcfkeifebiodqikmmzfavyan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415894.7769327-596-240123660918543/AnsiballZ_copy.py'
Jan 26 08:24:55 compute-1 sudo[70628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:56 compute-1 python3.9[70630]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769415894.7769327-596-240123660918543/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:24:56 compute-1 sudo[70628]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:56 compute-1 sudo[70780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbqmxxhzuplmgfrnxfqqrcjmyfxjkvay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415896.2841198-626-211961541213942/AnsiballZ_stat.py'
Jan 26 08:24:56 compute-1 sudo[70780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:56 compute-1 python3.9[70782]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:24:56 compute-1 sudo[70780]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:57 compute-1 sudo[70903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkntbbtxblnvrgtnmfflvifrvzrgxxnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415896.2841198-626-211961541213942/AnsiballZ_copy.py'
Jan 26 08:24:57 compute-1 sudo[70903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:57 compute-1 python3.9[70905]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769415896.2841198-626-211961541213942/.source.yaml _original_basename=.0zijiw3g follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:24:57 compute-1 sudo[70903]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:58 compute-1 sudo[71055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdxstjztucmafoyongvqyhtwlcwavobn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415897.7614095-656-163106234149146/AnsiballZ_stat.py'
Jan 26 08:24:58 compute-1 sudo[71055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:58 compute-1 python3.9[71057]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:24:58 compute-1 sudo[71055]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:58 compute-1 sudo[71178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdnwcrkurhmvsunxxmjqnxjgwjvgbzgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415897.7614095-656-163106234149146/AnsiballZ_copy.py'
Jan 26 08:24:58 compute-1 sudo[71178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:58 compute-1 python3.9[71180]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769415897.7614095-656-163106234149146/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:24:58 compute-1 sudo[71178]: pam_unix(sudo:session): session closed for user root
Jan 26 08:24:59 compute-1 sudo[71330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqlublhgcjcmbjomacwvvmpdpauzxfwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415899.1943815-686-191832098175186/AnsiballZ_command.py'
Jan 26 08:24:59 compute-1 sudo[71330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:24:59 compute-1 python3.9[71332]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:24:59 compute-1 sudo[71330]: pam_unix(sudo:session): session closed for user root
Jan 26 08:25:00 compute-1 sudo[71483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmytfnajgegvohhvernaqynxgtkgzidg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415900.037804-702-143572612364757/AnsiballZ_command.py'
Jan 26 08:25:00 compute-1 sudo[71483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:25:00 compute-1 python3.9[71485]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:25:00 compute-1 sudo[71483]: pam_unix(sudo:session): session closed for user root
Jan 26 08:25:01 compute-1 sudo[71636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rotyadatcznhbjuxdjavnybwodcaislc ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769415900.8966167-718-217805613056362/AnsiballZ_edpm_nftables_from_files.py'
Jan 26 08:25:01 compute-1 sudo[71636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:25:01 compute-1 python3[71638]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 26 08:25:01 compute-1 sudo[71636]: pam_unix(sudo:session): session closed for user root
Jan 26 08:25:02 compute-1 sudo[71788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqskjbnzfazrvdxcazsjbdyxrcmywpbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415901.8367019-735-165336514394950/AnsiballZ_stat.py'
Jan 26 08:25:02 compute-1 sudo[71788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:25:02 compute-1 python3.9[71790]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:25:02 compute-1 sudo[71788]: pam_unix(sudo:session): session closed for user root
Jan 26 08:25:02 compute-1 sudo[71911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfavxhxvfdbqmkxzsftzadwnpmulzpol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415901.8367019-735-165336514394950/AnsiballZ_copy.py'
Jan 26 08:25:02 compute-1 sudo[71911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:25:03 compute-1 python3.9[71913]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769415901.8367019-735-165336514394950/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:25:03 compute-1 sudo[71911]: pam_unix(sudo:session): session closed for user root
Jan 26 08:25:03 compute-1 sudo[72063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jadyajdmqxlzezbutukfaplnhiloheay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415903.3407135-764-199851947324028/AnsiballZ_stat.py'
Jan 26 08:25:03 compute-1 sudo[72063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:25:03 compute-1 python3.9[72065]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:25:03 compute-1 sudo[72063]: pam_unix(sudo:session): session closed for user root
Jan 26 08:25:04 compute-1 sudo[72186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcahfgoigodvrqanvvdjeapudvlllqjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415903.3407135-764-199851947324028/AnsiballZ_copy.py'
Jan 26 08:25:04 compute-1 sudo[72186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:25:04 compute-1 python3.9[72188]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769415903.3407135-764-199851947324028/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:25:04 compute-1 sudo[72186]: pam_unix(sudo:session): session closed for user root
Jan 26 08:25:05 compute-1 sudo[72338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oaaagzzaysbuhpqpbrsmyugifsfrcfzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415904.709919-794-85596090311890/AnsiballZ_stat.py'
Jan 26 08:25:05 compute-1 sudo[72338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:25:05 compute-1 python3.9[72340]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:25:05 compute-1 sudo[72338]: pam_unix(sudo:session): session closed for user root
Jan 26 08:25:05 compute-1 sudo[72461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvkfqwtaspcyyjowscucqjfgdqqptqpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415904.709919-794-85596090311890/AnsiballZ_copy.py'
Jan 26 08:25:05 compute-1 sudo[72461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:25:05 compute-1 python3.9[72463]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769415904.709919-794-85596090311890/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:25:05 compute-1 sudo[72461]: pam_unix(sudo:session): session closed for user root
Jan 26 08:25:06 compute-1 sudo[72613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-paypiogzrnnglwoohnwflwqlzztjgblg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415906.183046-824-80270662799325/AnsiballZ_stat.py'
Jan 26 08:25:06 compute-1 sudo[72613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:25:06 compute-1 python3.9[72615]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:25:06 compute-1 sudo[72613]: pam_unix(sudo:session): session closed for user root
Jan 26 08:25:07 compute-1 sudo[72736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqbfxegrcushztbudfpwbxufydxzzlwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415906.183046-824-80270662799325/AnsiballZ_copy.py'
Jan 26 08:25:07 compute-1 sudo[72736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:25:07 compute-1 python3.9[72738]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769415906.183046-824-80270662799325/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:25:07 compute-1 sudo[72736]: pam_unix(sudo:session): session closed for user root
Jan 26 08:25:08 compute-1 sudo[72888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtelbzynxvuhmwyrbhfeeurulvnzdpgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415907.691423-854-65931912819356/AnsiballZ_stat.py'
Jan 26 08:25:08 compute-1 sudo[72888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:25:08 compute-1 python3.9[72890]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:25:08 compute-1 sudo[72888]: pam_unix(sudo:session): session closed for user root
Jan 26 08:25:08 compute-1 sudo[73011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsolyozguhjrvfubjeqdjaixqnmxxrxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415907.691423-854-65931912819356/AnsiballZ_copy.py'
Jan 26 08:25:08 compute-1 sudo[73011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:25:08 compute-1 python3.9[73013]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769415907.691423-854-65931912819356/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:25:08 compute-1 sudo[73011]: pam_unix(sudo:session): session closed for user root
Jan 26 08:25:09 compute-1 sudo[73163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlkpseplsacpbdtxkesnrfdssxffynxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415909.1620367-884-3700199060233/AnsiballZ_file.py'
Jan 26 08:25:09 compute-1 sudo[73163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:25:09 compute-1 python3.9[73165]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:25:09 compute-1 sudo[73163]: pam_unix(sudo:session): session closed for user root
Jan 26 08:25:10 compute-1 sudo[73315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pckdadeyikhyehyujjbrwqvxfdgduvko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415909.995899-900-22051243808122/AnsiballZ_command.py'
Jan 26 08:25:10 compute-1 sudo[73315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:25:10 compute-1 python3.9[73317]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:25:10 compute-1 sudo[73315]: pam_unix(sudo:session): session closed for user root
Jan 26 08:25:11 compute-1 sudo[73474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xalhgazsyymcsopqmnugbwzuxvzafzuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415910.873843-916-3431404808835/AnsiballZ_blockinfile.py'
Jan 26 08:25:11 compute-1 sudo[73474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:25:11 compute-1 python3.9[73476]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:25:11 compute-1 sudo[73474]: pam_unix(sudo:session): session closed for user root
Jan 26 08:25:12 compute-1 sudo[73627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsmipjiaziocwaxngpeqdowpzxvktupk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415911.8914225-934-119195214697852/AnsiballZ_file.py'
Jan 26 08:25:12 compute-1 sudo[73627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:25:12 compute-1 python3.9[73629]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:25:12 compute-1 sudo[73627]: pam_unix(sudo:session): session closed for user root
Jan 26 08:25:13 compute-1 sudo[73779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmqwarjuifgofrsuijiwiatdumlzfifu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415912.6891754-934-256142353889490/AnsiballZ_file.py'
Jan 26 08:25:13 compute-1 sudo[73779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:25:13 compute-1 python3.9[73781]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:25:13 compute-1 sudo[73779]: pam_unix(sudo:session): session closed for user root
Jan 26 08:25:13 compute-1 sudo[73931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uznyfczsplmxpxmhfwokditvwsqichzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415913.441345-965-237350895128676/AnsiballZ_mount.py'
Jan 26 08:25:13 compute-1 sudo[73931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:25:14 compute-1 python3.9[73933]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 26 08:25:14 compute-1 sudo[73931]: pam_unix(sudo:session): session closed for user root
Jan 26 08:25:14 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 08:25:14 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 08:25:14 compute-1 sudo[74085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjwjnutkicjwyuvbjuwdiqzjzemixlcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415914.330999-965-192310026643995/AnsiballZ_mount.py'
Jan 26 08:25:14 compute-1 sudo[74085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:25:14 compute-1 python3.9[74087]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 26 08:25:14 compute-1 sudo[74085]: pam_unix(sudo:session): session closed for user root
Jan 26 08:25:15 compute-1 sshd-session[64921]: Connection closed by 192.168.122.30 port 57836
Jan 26 08:25:15 compute-1 sshd-session[64918]: pam_unix(sshd:session): session closed for user zuul
Jan 26 08:25:15 compute-1 systemd[1]: session-16.scope: Deactivated successfully.
Jan 26 08:25:15 compute-1 systemd[1]: session-16.scope: Consumed 45.212s CPU time.
Jan 26 08:25:15 compute-1 systemd-logind[788]: Session 16 logged out. Waiting for processes to exit.
Jan 26 08:25:15 compute-1 systemd-logind[788]: Removed session 16.
Jan 26 08:25:20 compute-1 sshd-session[74113]: Accepted publickey for zuul from 192.168.122.30 port 33776 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 08:25:20 compute-1 systemd-logind[788]: New session 17 of user zuul.
Jan 26 08:25:20 compute-1 systemd[1]: Started Session 17 of User zuul.
Jan 26 08:25:20 compute-1 sshd-session[74113]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 08:25:21 compute-1 sudo[74266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibmdlfpbviyulgqtxxzplmfavgsminmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415920.6119964-18-264819014045512/AnsiballZ_tempfile.py'
Jan 26 08:25:21 compute-1 sudo[74266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:25:21 compute-1 python3.9[74268]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 26 08:25:21 compute-1 sudo[74266]: pam_unix(sudo:session): session closed for user root
Jan 26 08:25:22 compute-1 sudo[74418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqdzwlrqlbpyupxhvxtobsnwvnmgimpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415921.663752-42-177943431942549/AnsiballZ_stat.py'
Jan 26 08:25:22 compute-1 sudo[74418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:25:22 compute-1 python3.9[74420]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:25:22 compute-1 sudo[74418]: pam_unix(sudo:session): session closed for user root
Jan 26 08:25:23 compute-1 sudo[74570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkltnzujyqzbraxenacmyeccgigmjklm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415922.7578716-62-262158916223985/AnsiballZ_setup.py'
Jan 26 08:25:23 compute-1 sudo[74570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:25:23 compute-1 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 26 08:25:23 compute-1 python3.9[74572]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 08:25:23 compute-1 sudo[74570]: pam_unix(sudo:session): session closed for user root
Jan 26 08:25:24 compute-1 sudo[74724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfeoggewtbkorzvpvcogjescnqrfxurd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415924.1942408-79-46479795050798/AnsiballZ_blockinfile.py'
Jan 26 08:25:24 compute-1 sudo[74724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:25:24 compute-1 python3.9[74726]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDk8BMnBs0we57Q33K53E/UoNpzrishEfxqhA3CP62aUrO7JagHxTaJtFdXy+aPeWS0c9WbKWm8fmfp/TyydzIYJr04BRI+Yn0onP4jknh+f7jztRluXfDRCta7Qk8guUJZ+2pkc/0Hnatq9EkPQf9u23SCvJLCkfgx+fsWY64AoWVH2Kt4yB07jAjMJDH9zCvBbI6S6Nf1mfWaX0+/E/1MBDWWBSVOlma2cgVubQQYwrBY5JG+lf/ewli6+l9pE0oQWBusN57nldSRaIam7QpIefZ/i6GjTMphhtSr2TsB+MUdjJU113y5+/Ss2eDxL2LeXxY2b2DRrklHKdZLpnyi9eV6GTWw6pvUA+H6XPot1MyTobyi4WiWATiwf3wgEDdYFRwK0J2oLtRFWLNmmaPC73ZlbPrLK9G79+JO5pylwMzbhFLFn8GA0A44iY/3u+m5FrjJUmONXU4WMRrGdRwJl+ZH++KE9PCLdZDp+wEWMvsVcy/e5aebFjqTPUOISu0=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAINfnydN6QTcKeiz6klNhdRzoul4zARP9IODK/s1XQGtt
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBChM2jALrp6emE8VLVenj6vHziW5loM8ogeaX6uw0REesxJu++KugUYHHInGdiys8xDo4zeCnWVv4lpUl6ts1tw=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCuhvBZbVwdHlUJIXZK2ydY8cWPQ3rdTSo6pE5LMs252an/wlJaYLePB84fRq7VKER48wdg5/NZLb9Knc3R3yxO412ANyKBIfE0AvESUfH9exPONyV+j06prPMDWzFMCnCggKd+9aaZCqg/ZFaabyqE9P0Ti77imQCUwXWqJv0JIj7R/PeVNrfFDhwS7lQ+e+yBv4L8cFOKsKI/EMtg00hj1pxJO6gqbxgUz+9xjjt7xrTqGcd/13EkPnz/sg6zmgfMimFyDfhKuh7yBLeirew9K+RDyY3jwNykpclPBVV9qJXZw3tQMxMeZ0Yb1Lxw3spzXwdbFVCdccix+CRp+8nvbbhYlRS4R4YwvokVw7u4Hh+LhBbOPAIwltQEahv/7QXfC+SXDwH2FBRlDZhUVq93qeWfbMV3GbVavFUE07LMFsBaSkT06TXAst18Ajyc+n7RGrWWgtfQDeDX/a8NQy1GrneLd6BlpsJf1Vq+76luLUwbx2O8PXxCtRxwjDqsX1E=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKBEa0hbG/gfBdxpxnd3mdjgZNbE5GswrJ5sPCK9H18x
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBON0GMKOQvKQm6aNDjXqYQ6W/fw60dR8T1JKqDwlcWeZziGHvaBgy0zWynyJVM1d3PQI0g7w0qV1hQLhXQZ/k9I=
                                             create=True mode=0644 path=/tmp/ansible.00oeylbt state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:25:24 compute-1 sudo[74724]: pam_unix(sudo:session): session closed for user root
Jan 26 08:25:25 compute-1 sudo[74876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzzgbazoabmnxpmbtolrsobtjluusadd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415925.0681067-95-164466283450266/AnsiballZ_command.py'
Jan 26 08:25:25 compute-1 sudo[74876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:25:25 compute-1 python3.9[74878]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.00oeylbt' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:25:25 compute-1 sudo[74876]: pam_unix(sudo:session): session closed for user root
Jan 26 08:25:26 compute-1 sudo[75030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cepobibsqodycatwnuhcsfjtghtflwwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415926.0270798-111-147036912492504/AnsiballZ_file.py'
Jan 26 08:25:26 compute-1 sudo[75030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:25:26 compute-1 python3.9[75032]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.00oeylbt state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:25:26 compute-1 sudo[75030]: pam_unix(sudo:session): session closed for user root
Jan 26 08:25:27 compute-1 sshd-session[74116]: Connection closed by 192.168.122.30 port 33776
Jan 26 08:25:27 compute-1 sshd-session[74113]: pam_unix(sshd:session): session closed for user zuul
Jan 26 08:25:27 compute-1 systemd[1]: session-17.scope: Deactivated successfully.
Jan 26 08:25:27 compute-1 systemd[1]: session-17.scope: Consumed 4.466s CPU time.
Jan 26 08:25:27 compute-1 systemd-logind[788]: Session 17 logged out. Waiting for processes to exit.
Jan 26 08:25:27 compute-1 systemd-logind[788]: Removed session 17.
Jan 26 08:25:32 compute-1 sshd-session[75057]: Accepted publickey for zuul from 192.168.122.30 port 54428 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 08:25:32 compute-1 systemd-logind[788]: New session 18 of user zuul.
Jan 26 08:25:32 compute-1 systemd[1]: Started Session 18 of User zuul.
Jan 26 08:25:32 compute-1 sshd-session[75057]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 08:25:33 compute-1 python3.9[75210]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 08:25:34 compute-1 sudo[75364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrigpzqhyywvczfucojlyiprwlugizxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415934.188489-40-280577063959215/AnsiballZ_systemd.py'
Jan 26 08:25:34 compute-1 sudo[75364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:25:35 compute-1 python3.9[75366]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 26 08:25:35 compute-1 sudo[75364]: pam_unix(sudo:session): session closed for user root
Jan 26 08:25:35 compute-1 sudo[75518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-entixaxhksukzpuadoaekvqghvfzzzcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415935.4290452-56-177914793105423/AnsiballZ_systemd.py'
Jan 26 08:25:35 compute-1 sudo[75518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:25:36 compute-1 python3.9[75520]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 08:25:36 compute-1 sudo[75518]: pam_unix(sudo:session): session closed for user root
Jan 26 08:25:36 compute-1 sudo[75671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyactoljkugpetpybocmlrfmphbosszp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415936.4550073-74-32758459241655/AnsiballZ_command.py'
Jan 26 08:25:36 compute-1 sudo[75671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:25:37 compute-1 python3.9[75673]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:25:37 compute-1 sudo[75671]: pam_unix(sudo:session): session closed for user root
Jan 26 08:25:38 compute-1 sudo[75824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eaohuwueyqitzburljuhamuwvfzqypef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415937.4219415-90-166913256829745/AnsiballZ_stat.py'
Jan 26 08:25:38 compute-1 sudo[75824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:25:38 compute-1 python3.9[75826]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:25:38 compute-1 sudo[75824]: pam_unix(sudo:session): session closed for user root
Jan 26 08:25:38 compute-1 sudo[75978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzzuejhscomvnvjikduoexfrsgfmzzby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415938.5750158-106-164337678909480/AnsiballZ_command.py'
Jan 26 08:25:38 compute-1 sudo[75978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:25:39 compute-1 python3.9[75980]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:25:39 compute-1 sudo[75978]: pam_unix(sudo:session): session closed for user root
Jan 26 08:25:39 compute-1 sudo[76133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtyhhzxtflayslomfxzcgznamvpiolqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415939.4286292-122-275717829231904/AnsiballZ_file.py'
Jan 26 08:25:39 compute-1 sudo[76133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:25:40 compute-1 python3.9[76135]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:25:40 compute-1 sudo[76133]: pam_unix(sudo:session): session closed for user root
Jan 26 08:25:40 compute-1 sshd-session[75060]: Connection closed by 192.168.122.30 port 54428
Jan 26 08:25:40 compute-1 sshd-session[75057]: pam_unix(sshd:session): session closed for user zuul
Jan 26 08:25:40 compute-1 systemd[1]: session-18.scope: Deactivated successfully.
Jan 26 08:25:40 compute-1 systemd[1]: session-18.scope: Consumed 5.540s CPU time.
Jan 26 08:25:40 compute-1 systemd-logind[788]: Session 18 logged out. Waiting for processes to exit.
Jan 26 08:25:40 compute-1 systemd-logind[788]: Removed session 18.
Jan 26 08:25:46 compute-1 sshd-session[76161]: Accepted publickey for zuul from 192.168.122.30 port 45838 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 08:25:46 compute-1 systemd-logind[788]: New session 19 of user zuul.
Jan 26 08:25:46 compute-1 systemd[1]: Started Session 19 of User zuul.
Jan 26 08:25:46 compute-1 sshd-session[76161]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 08:25:47 compute-1 python3.9[76314]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 08:25:48 compute-1 sudo[76468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cirezpbpqdblleyavaptoyjxcrimgpeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415948.0202003-44-101728545148868/AnsiballZ_setup.py'
Jan 26 08:25:48 compute-1 sudo[76468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:25:48 compute-1 python3.9[76470]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 08:25:48 compute-1 sudo[76468]: pam_unix(sudo:session): session closed for user root
Jan 26 08:25:49 compute-1 sudo[76552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcezpfjxasghnhbswcyzsopfliavtofl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415948.0202003-44-101728545148868/AnsiballZ_dnf.py'
Jan 26 08:25:49 compute-1 sudo[76552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:25:49 compute-1 python3.9[76554]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 08:25:50 compute-1 sudo[76552]: pam_unix(sudo:session): session closed for user root
Jan 26 08:25:51 compute-1 python3.9[76705]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:25:53 compute-1 python3.9[76856]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 08:25:54 compute-1 python3.9[77006]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:25:55 compute-1 python3.9[77156]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:25:55 compute-1 sshd-session[76164]: Connection closed by 192.168.122.30 port 45838
Jan 26 08:25:55 compute-1 sshd-session[76161]: pam_unix(sshd:session): session closed for user zuul
Jan 26 08:25:55 compute-1 systemd[1]: session-19.scope: Deactivated successfully.
Jan 26 08:25:55 compute-1 systemd[1]: session-19.scope: Consumed 6.814s CPU time.
Jan 26 08:25:55 compute-1 systemd-logind[788]: Session 19 logged out. Waiting for processes to exit.
Jan 26 08:25:55 compute-1 systemd-logind[788]: Removed session 19.
Jan 26 08:26:00 compute-1 sshd-session[77181]: Accepted publickey for zuul from 192.168.122.30 port 51808 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 08:26:00 compute-1 systemd-logind[788]: New session 20 of user zuul.
Jan 26 08:26:00 compute-1 systemd[1]: Started Session 20 of User zuul.
Jan 26 08:26:00 compute-1 sshd-session[77181]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 08:26:02 compute-1 python3.9[77334]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 08:26:03 compute-1 sudo[77488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osjbahokzbixdxbvwrzgswsdstoysftk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415963.340697-75-218547167400781/AnsiballZ_file.py'
Jan 26 08:26:03 compute-1 sudo[77488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:04 compute-1 python3.9[77490]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:26:04 compute-1 sudo[77488]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:04 compute-1 sudo[77640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aykfquldadqsnjjefqtudzzpgwhldoui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415964.2743917-75-184321882868140/AnsiballZ_file.py'
Jan 26 08:26:04 compute-1 sudo[77640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:04 compute-1 python3.9[77642]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:26:04 compute-1 sudo[77640]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:05 compute-1 sudo[77792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujubdbgqetcvldlzorseaspqxsxcyovl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415965.085794-107-184579994446509/AnsiballZ_stat.py'
Jan 26 08:26:05 compute-1 sudo[77792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:05 compute-1 python3.9[77794]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:26:05 compute-1 sudo[77792]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:06 compute-1 sudo[77915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvmeqfjrnezhtqwieprnkzopajjklbln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415965.085794-107-184579994446509/AnsiballZ_copy.py'
Jan 26 08:26:06 compute-1 sudo[77915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:06 compute-1 python3.9[77917]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769415965.085794-107-184579994446509/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=ecbc2fe6fa721ac2f3540110eab0937d9c0dd252 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:26:06 compute-1 sudo[77915]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:07 compute-1 sudo[78067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbftbxvnjzfbhruiruvttvjgkzrdrpgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415966.6390734-107-233993934002006/AnsiballZ_stat.py'
Jan 26 08:26:07 compute-1 sudo[78067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:07 compute-1 python3.9[78069]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:26:07 compute-1 sudo[78067]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:07 compute-1 sudo[78190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuvfrmofmnmhtlwjlrszzagbsgiamwwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415966.6390734-107-233993934002006/AnsiballZ_copy.py'
Jan 26 08:26:07 compute-1 sudo[78190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:07 compute-1 python3.9[78192]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769415966.6390734-107-233993934002006/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=2cea1638c63da2a23393f014bc3f6aaa05f50dec backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:26:07 compute-1 sudo[78190]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:08 compute-1 sudo[78342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghomjxopivzjrgfbyspmoukcchsluzgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415968.0545735-107-281467296199287/AnsiballZ_stat.py'
Jan 26 08:26:08 compute-1 sudo[78342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:08 compute-1 python3.9[78344]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:26:08 compute-1 sudo[78342]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:09 compute-1 sudo[78465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdfkuaslftgkzljgqzbrlpzmhhxbjoac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415968.0545735-107-281467296199287/AnsiballZ_copy.py'
Jan 26 08:26:09 compute-1 sudo[78465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:09 compute-1 python3.9[78467]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769415968.0545735-107-281467296199287/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=c86f3598f51f38e66cb1e725145f0f9a437e39c2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:26:09 compute-1 sudo[78465]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:09 compute-1 sudo[78617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqrprkubssachelvghcgqyvnmzouavhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415969.5606136-198-20880933048185/AnsiballZ_file.py'
Jan 26 08:26:09 compute-1 sudo[78617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:10 compute-1 python3.9[78619]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:26:10 compute-1 sudo[78617]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:10 compute-1 sudo[78769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssbmakejgdedonaglogafbmthndwuhfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415970.3636403-198-147238441638525/AnsiballZ_file.py'
Jan 26 08:26:10 compute-1 sudo[78769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:10 compute-1 python3.9[78771]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:26:10 compute-1 sudo[78769]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:11 compute-1 sudo[78921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpecmcnauqbkveugbmucephxysegnrye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415971.1350179-229-88426117980851/AnsiballZ_stat.py'
Jan 26 08:26:11 compute-1 sudo[78921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:11 compute-1 python3.9[78923]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:26:11 compute-1 sudo[78921]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:12 compute-1 sudo[79044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rndevkypbuvgsytpyzmlydewxeymlhmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415971.1350179-229-88426117980851/AnsiballZ_copy.py'
Jan 26 08:26:12 compute-1 sudo[79044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:12 compute-1 python3.9[79046]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769415971.1350179-229-88426117980851/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=bad12d7264b104206348aabacd3b201f9c3da246 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:26:12 compute-1 sudo[79044]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:12 compute-1 sudo[79196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wecjyhetpifgfbvmqttyiwrdgmuagqsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415972.4073882-229-179843255124291/AnsiballZ_stat.py'
Jan 26 08:26:12 compute-1 sudo[79196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:12 compute-1 python3.9[79198]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:26:12 compute-1 sudo[79196]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:13 compute-1 sudo[79319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pooyhgpfuvuytyfdxqsrptihinpepnlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415972.4073882-229-179843255124291/AnsiballZ_copy.py'
Jan 26 08:26:13 compute-1 sudo[79319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:13 compute-1 python3.9[79321]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769415972.4073882-229-179843255124291/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=e0a8df58fa44fb6f18c22f0d66219e3bff78b28a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:26:13 compute-1 sudo[79319]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:14 compute-1 sudo[79471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghhxkkbqtowdmforvloakpkxpcrpwbbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415973.8456588-229-155294257680568/AnsiballZ_stat.py'
Jan 26 08:26:14 compute-1 sudo[79471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:14 compute-1 python3.9[79473]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:26:14 compute-1 sudo[79471]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:14 compute-1 sudo[79594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzmduxkevgwlvvopgucotzmeungmdtnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415973.8456588-229-155294257680568/AnsiballZ_copy.py'
Jan 26 08:26:14 compute-1 sudo[79594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:15 compute-1 python3.9[79596]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769415973.8456588-229-155294257680568/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=63ba538171d73005ff8d188f153d315d5ec92496 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:26:15 compute-1 sudo[79594]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:15 compute-1 sudo[79746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkdaulkvhfoegqkbfohwjawqdomthkmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415975.3479497-317-141704971218783/AnsiballZ_file.py'
Jan 26 08:26:15 compute-1 sudo[79746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:15 compute-1 python3.9[79748]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:26:15 compute-1 sudo[79746]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:16 compute-1 sudo[79898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypyarqgapvjpexgtdnumzfwzdkkwofhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415976.1179469-317-92811575621010/AnsiballZ_file.py'
Jan 26 08:26:16 compute-1 sudo[79898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:16 compute-1 python3.9[79900]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:26:16 compute-1 sudo[79898]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:17 compute-1 sudo[80050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdqsqoxgxdtbtqvfaapycddwgraomseg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415976.9303606-346-57534246865550/AnsiballZ_stat.py'
Jan 26 08:26:17 compute-1 sudo[80050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:17 compute-1 python3.9[80052]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:26:17 compute-1 sudo[80050]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:17 compute-1 sudo[80173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyqhaeotjijqolbygescwrrueptjxcsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415976.9303606-346-57534246865550/AnsiballZ_copy.py'
Jan 26 08:26:17 compute-1 sudo[80173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:18 compute-1 python3.9[80175]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769415976.9303606-346-57534246865550/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=3ffb3329c14d1d8748798516afb1b374be8aede6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:26:18 compute-1 sudo[80173]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:18 compute-1 sudo[80325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndpssoxbzdugbsxtgmhgpgpjxtxponki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415978.3576896-346-221452571724432/AnsiballZ_stat.py'
Jan 26 08:26:18 compute-1 sudo[80325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:18 compute-1 python3.9[80327]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:26:18 compute-1 sudo[80325]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:19 compute-1 chronyd[64892]: Selected source 173.206.123.141 (pool.ntp.org)
Jan 26 08:26:19 compute-1 sudo[80448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-einqfrlktalhfmscfizcqnawontvxjya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415978.3576896-346-221452571724432/AnsiballZ_copy.py'
Jan 26 08:26:19 compute-1 sudo[80448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:19 compute-1 python3.9[80450]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769415978.3576896-346-221452571724432/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=b2389dc330ee037d909460111a0f98d618f245f8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:26:19 compute-1 sudo[80448]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:20 compute-1 sudo[80600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppqvagwzbeqlevqzmnsgjvxppvszukts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415979.8007731-346-125089627850091/AnsiballZ_stat.py'
Jan 26 08:26:20 compute-1 sudo[80600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:20 compute-1 python3.9[80602]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:26:20 compute-1 sudo[80600]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:20 compute-1 sudo[80723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmuobgamtxdgolpyqjsnykxtzfqbjtum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415979.8007731-346-125089627850091/AnsiballZ_copy.py'
Jan 26 08:26:20 compute-1 sudo[80723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:21 compute-1 python3.9[80725]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769415979.8007731-346-125089627850091/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=d4738aee61e211b82253e4773b5862190e4be04d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:26:21 compute-1 sudo[80723]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:21 compute-1 sudo[80875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smupnpitbhfebdyafygxcnejppeeilwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415981.258568-436-241519328497493/AnsiballZ_file.py'
Jan 26 08:26:21 compute-1 sudo[80875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:21 compute-1 python3.9[80877]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:26:21 compute-1 sudo[80875]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:22 compute-1 sudo[81027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljvfvrdpyrsiklxvfnqjfhczjwpvhwja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415982.0508354-436-25028203828466/AnsiballZ_file.py'
Jan 26 08:26:22 compute-1 sudo[81027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:22 compute-1 python3.9[81029]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:26:22 compute-1 sudo[81027]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:23 compute-1 sudo[81179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jojkcgadvrsltliysimqcovxnscfugnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415982.8812876-466-275699938973595/AnsiballZ_stat.py'
Jan 26 08:26:23 compute-1 sudo[81179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:23 compute-1 python3.9[81181]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:26:23 compute-1 sudo[81179]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:23 compute-1 sudo[81302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwugazrlgnlxrhdtukuxocbildqdgmks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415982.8812876-466-275699938973595/AnsiballZ_copy.py'
Jan 26 08:26:23 compute-1 sudo[81302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:24 compute-1 python3.9[81304]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769415982.8812876-466-275699938973595/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=cdc9e4566cddfd48b175fb21baf8f90b194ab2ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:26:24 compute-1 sudo[81302]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:24 compute-1 sudo[81454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmljwzmbdtgjusmbvavilozsbztuukbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415984.2432508-466-265923828094757/AnsiballZ_stat.py'
Jan 26 08:26:24 compute-1 sudo[81454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:24 compute-1 python3.9[81456]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:26:24 compute-1 sudo[81454]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:25 compute-1 sudo[81577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctejxrkfypqzfqaxofytbgfjzbbremdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415984.2432508-466-265923828094757/AnsiballZ_copy.py'
Jan 26 08:26:25 compute-1 sudo[81577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:25 compute-1 python3.9[81579]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769415984.2432508-466-265923828094757/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=b2389dc330ee037d909460111a0f98d618f245f8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:26:25 compute-1 sudo[81577]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:26 compute-1 sudo[81729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucejmqknisunvhfwuatnzcmelhmiyxgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415985.7400353-466-73685969298062/AnsiballZ_stat.py'
Jan 26 08:26:26 compute-1 sudo[81729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:26 compute-1 python3.9[81731]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:26:26 compute-1 sudo[81729]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:26 compute-1 sudo[81852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjrrvjhymacretydwnqjitmqlrmidkep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415985.7400353-466-73685969298062/AnsiballZ_copy.py'
Jan 26 08:26:26 compute-1 sudo[81852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:26 compute-1 python3.9[81854]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769415985.7400353-466-73685969298062/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=5d4a9d4e865b16e10d952d9c38bcefd212e225c1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:26:26 compute-1 sudo[81852]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:28 compute-1 sudo[82004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wivsjerkwagwpooxzvyzomeriuminqkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415987.6813202-586-277202063991218/AnsiballZ_file.py'
Jan 26 08:26:28 compute-1 sudo[82004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:28 compute-1 python3.9[82006]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:26:28 compute-1 sudo[82004]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:28 compute-1 sudo[82156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krwsutunklrqrdpwjbpgzeerwnjxnjqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415988.479994-602-109705716423702/AnsiballZ_stat.py'
Jan 26 08:26:28 compute-1 sudo[82156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:29 compute-1 python3.9[82158]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:26:29 compute-1 sudo[82156]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:29 compute-1 sudo[82279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvynsnjjyapbghjxuzffaqqtagockgli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415988.479994-602-109705716423702/AnsiballZ_copy.py'
Jan 26 08:26:29 compute-1 sudo[82279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:29 compute-1 python3.9[82281]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769415988.479994-602-109705716423702/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=70539153ba8394cf6e23efa475ecc78b911f2b37 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:26:29 compute-1 sudo[82279]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:30 compute-1 sudo[82431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvtwxfdsbcroedkeljwudmdmlcthennn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415990.0722818-634-225886970457220/AnsiballZ_file.py'
Jan 26 08:26:30 compute-1 sudo[82431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:30 compute-1 python3.9[82433]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:26:30 compute-1 sudo[82431]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:31 compute-1 sudo[82583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qocpfbbvnalaqtdrgrwmxlgghyeongvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415990.918506-652-114742254247960/AnsiballZ_stat.py'
Jan 26 08:26:31 compute-1 sudo[82583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:31 compute-1 python3.9[82585]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:26:31 compute-1 sudo[82583]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:31 compute-1 sudo[82706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idyvgczwpbusqelympgsmtpfliccmuzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415990.918506-652-114742254247960/AnsiballZ_copy.py'
Jan 26 08:26:31 compute-1 sudo[82706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:32 compute-1 python3.9[82708]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769415990.918506-652-114742254247960/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=70539153ba8394cf6e23efa475ecc78b911f2b37 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:26:32 compute-1 sudo[82706]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:32 compute-1 sudo[82858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnpfembdqhkkztvnnstiyaltbuuaybbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415992.2473829-682-12648965785131/AnsiballZ_file.py'
Jan 26 08:26:32 compute-1 sudo[82858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:32 compute-1 python3.9[82860]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:26:32 compute-1 sudo[82858]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:33 compute-1 sudo[83010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtfkfhnzykyxknkeyliyhsebxbshghvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415992.977324-701-197767412145009/AnsiballZ_stat.py'
Jan 26 08:26:33 compute-1 sudo[83010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:33 compute-1 python3.9[83012]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:26:33 compute-1 sudo[83010]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:33 compute-1 sudo[83133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjzvnawnxkrwwctoluzokzmrmyjcljrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415992.977324-701-197767412145009/AnsiballZ_copy.py'
Jan 26 08:26:33 compute-1 sudo[83133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:34 compute-1 python3.9[83135]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769415992.977324-701-197767412145009/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=70539153ba8394cf6e23efa475ecc78b911f2b37 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:26:34 compute-1 sudo[83133]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:34 compute-1 sudo[83285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjojzndzphpgtbmalptmomevdyfojbel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415994.464232-736-148857492749208/AnsiballZ_file.py'
Jan 26 08:26:34 compute-1 sudo[83285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:35 compute-1 python3.9[83287]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:26:35 compute-1 sudo[83285]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:35 compute-1 sudo[83437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmwjqpubrbenrryfgryovvqjvoalhiqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415995.2888203-752-109593810346146/AnsiballZ_stat.py'
Jan 26 08:26:35 compute-1 sudo[83437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:35 compute-1 python3.9[83439]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:26:35 compute-1 sudo[83437]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:36 compute-1 sudo[83560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkmnsjqbyulimfrfpqoqqpavbotqvjfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415995.2888203-752-109593810346146/AnsiballZ_copy.py'
Jan 26 08:26:36 compute-1 sudo[83560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:36 compute-1 python3.9[83562]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769415995.2888203-752-109593810346146/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=70539153ba8394cf6e23efa475ecc78b911f2b37 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:26:36 compute-1 sudo[83560]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:37 compute-1 sudo[83712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdjbgcnhjepgzxcvgwvfdzagqxjcwxqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415996.8588183-785-127370949841663/AnsiballZ_file.py'
Jan 26 08:26:37 compute-1 sudo[83712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:37 compute-1 python3.9[83714]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:26:37 compute-1 sudo[83712]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:38 compute-1 sudo[83864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipbvseqksjukwbblqnmddpdjeiotwkkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415997.6761093-801-246751226085169/AnsiballZ_stat.py'
Jan 26 08:26:38 compute-1 sudo[83864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:38 compute-1 python3.9[83866]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:26:38 compute-1 sudo[83864]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:38 compute-1 sudo[83987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gelhyumrfjpejpohkuaeulzesktuqkbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415997.6761093-801-246751226085169/AnsiballZ_copy.py'
Jan 26 08:26:38 compute-1 sudo[83987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:38 compute-1 python3.9[83989]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769415997.6761093-801-246751226085169/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=70539153ba8394cf6e23efa475ecc78b911f2b37 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:26:39 compute-1 sudo[83987]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:39 compute-1 sudo[84139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgyiuetcgexsdxgmjnbvkvxikinxoncg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769415999.3110855-834-259022805653753/AnsiballZ_file.py'
Jan 26 08:26:39 compute-1 sudo[84139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:39 compute-1 python3.9[84141]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:26:39 compute-1 sudo[84139]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:40 compute-1 sudo[84291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzxzkuomqcjahlossrpdtobtormgolre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416000.1132052-849-53944431917229/AnsiballZ_stat.py'
Jan 26 08:26:40 compute-1 sudo[84291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:40 compute-1 python3.9[84293]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:26:40 compute-1 sudo[84291]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:41 compute-1 sudo[84414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lshajhyyeuchoyopgjxvxkpreivndezv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416000.1132052-849-53944431917229/AnsiballZ_copy.py'
Jan 26 08:26:41 compute-1 sudo[84414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:41 compute-1 python3.9[84416]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769416000.1132052-849-53944431917229/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=70539153ba8394cf6e23efa475ecc78b911f2b37 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:26:41 compute-1 sudo[84414]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:41 compute-1 sudo[84566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egoecwauywsrsmvyrcshalsajoocfcem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416001.5786583-882-232037042063327/AnsiballZ_file.py'
Jan 26 08:26:41 compute-1 sudo[84566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:42 compute-1 python3.9[84568]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:26:42 compute-1 sudo[84566]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:42 compute-1 sudo[84718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhcrjnhyzdtingydpacrjtafhleuepwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416002.3552535-898-273013473662487/AnsiballZ_stat.py'
Jan 26 08:26:42 compute-1 sudo[84718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:42 compute-1 python3.9[84720]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:26:42 compute-1 sudo[84718]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:43 compute-1 sudo[84841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtvphszekcxrvazkwedzpanohhrzfeha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416002.3552535-898-273013473662487/AnsiballZ_copy.py'
Jan 26 08:26:43 compute-1 sudo[84841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:43 compute-1 python3.9[84843]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769416002.3552535-898-273013473662487/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=70539153ba8394cf6e23efa475ecc78b911f2b37 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:26:43 compute-1 sudo[84841]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:43 compute-1 sshd-session[77184]: Connection closed by 192.168.122.30 port 51808
Jan 26 08:26:43 compute-1 sshd-session[77181]: pam_unix(sshd:session): session closed for user zuul
Jan 26 08:26:43 compute-1 systemd[1]: session-20.scope: Deactivated successfully.
Jan 26 08:26:43 compute-1 systemd[1]: session-20.scope: Consumed 36.096s CPU time.
Jan 26 08:26:43 compute-1 systemd-logind[788]: Session 20 logged out. Waiting for processes to exit.
Jan 26 08:26:43 compute-1 systemd-logind[788]: Removed session 20.
Jan 26 08:26:49 compute-1 sshd-session[84868]: Accepted publickey for zuul from 192.168.122.30 port 41144 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 08:26:49 compute-1 systemd-logind[788]: New session 21 of user zuul.
Jan 26 08:26:49 compute-1 systemd[1]: Started Session 21 of User zuul.
Jan 26 08:26:49 compute-1 sshd-session[84868]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 08:26:50 compute-1 python3.9[85021]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 08:26:51 compute-1 sudo[85175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwoicxvpvnulmxppqlkikebzcjjugvyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416011.2521806-44-61604610968145/AnsiballZ_file.py'
Jan 26 08:26:51 compute-1 sudo[85175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:51 compute-1 python3.9[85177]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:26:51 compute-1 sudo[85175]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:52 compute-1 sudo[85327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykyzfmeapvtkdmnlgmajenqllqorupui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416012.112279-44-62750467558558/AnsiballZ_file.py'
Jan 26 08:26:52 compute-1 sudo[85327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:52 compute-1 python3.9[85329]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:26:52 compute-1 sudo[85327]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:53 compute-1 python3.9[85479]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 08:26:54 compute-1 sudo[85629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yapuonkgusqgdhebjwaqexlbhvwtasuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416013.7800074-90-276566991484854/AnsiballZ_seboolean.py'
Jan 26 08:26:54 compute-1 sudo[85629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:54 compute-1 python3.9[85631]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 26 08:26:55 compute-1 sudo[85629]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:56 compute-1 sudo[85785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axiwowkooyrdeuelxifprejdenyfcxnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416016.1576664-110-251827655714259/AnsiballZ_setup.py'
Jan 26 08:26:56 compute-1 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Jan 26 08:26:56 compute-1 sudo[85785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:56 compute-1 python3.9[85787]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 08:26:57 compute-1 sudo[85785]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:57 compute-1 sudo[85869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsddmgzpjfgoqfrlkakffhaoejseiuur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416016.1576664-110-251827655714259/AnsiballZ_dnf.py'
Jan 26 08:26:57 compute-1 sudo[85869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:26:57 compute-1 python3.9[85871]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 08:26:59 compute-1 sudo[85869]: pam_unix(sudo:session): session closed for user root
Jan 26 08:26:59 compute-1 sudo[86022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vifgmrlqbhvxttdnzcjvaqveddqjuvcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416019.3273358-134-264505995819824/AnsiballZ_systemd.py'
Jan 26 08:26:59 compute-1 sudo[86022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:00 compute-1 python3.9[86024]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 08:27:00 compute-1 sudo[86022]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:01 compute-1 sudo[86177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkfauwcxszrvugnbtxmjettpghjdrwxq ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769416020.6073694-150-78368899064153/AnsiballZ_edpm_nftables_snippet.py'
Jan 26 08:27:01 compute-1 sudo[86177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:01 compute-1 python3[86179]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Jan 26 08:27:01 compute-1 sudo[86177]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:01 compute-1 sudo[86329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebzvnbncecogluomzqgusbwaybuzikww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416021.6197202-168-236302892873570/AnsiballZ_file.py'
Jan 26 08:27:01 compute-1 sudo[86329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:02 compute-1 python3.9[86331]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:27:02 compute-1 sudo[86329]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:02 compute-1 sudo[86481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bssrtbtjmernacfsgmyjtxinwcuyhgio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416022.2974367-184-36540201328903/AnsiballZ_stat.py'
Jan 26 08:27:02 compute-1 sudo[86481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:02 compute-1 python3.9[86483]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:27:02 compute-1 sudo[86481]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:03 compute-1 sudo[86559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dncikswtxcuiuplypzdgsuodujpdidxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416022.2974367-184-36540201328903/AnsiballZ_file.py'
Jan 26 08:27:03 compute-1 sudo[86559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:03 compute-1 python3.9[86561]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:27:03 compute-1 sudo[86559]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:04 compute-1 sudo[86711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etaqhxdvtpjldpozlpckdfbrlzqnoryf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416023.6850958-208-36670115747430/AnsiballZ_stat.py'
Jan 26 08:27:04 compute-1 sudo[86711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:04 compute-1 python3.9[86713]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:27:04 compute-1 sudo[86711]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:04 compute-1 sudo[86789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbfdtymqjjtrnclymstkihpmreplbsof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416023.6850958-208-36670115747430/AnsiballZ_file.py'
Jan 26 08:27:04 compute-1 sudo[86789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:04 compute-1 python3.9[86791]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.xyd_bojc recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:27:04 compute-1 sudo[86789]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:05 compute-1 sudo[86941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtekkrdebqmrukhtvfsiporiddxyuxhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416025.1468399-232-63269637955478/AnsiballZ_stat.py'
Jan 26 08:27:05 compute-1 sudo[86941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:05 compute-1 python3.9[86943]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:27:05 compute-1 sudo[86941]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:06 compute-1 sudo[87019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lccphmpikxisjhjnzmohfojttmswtajw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416025.1468399-232-63269637955478/AnsiballZ_file.py'
Jan 26 08:27:06 compute-1 sudo[87019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:06 compute-1 python3.9[87021]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:27:06 compute-1 sudo[87019]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:06 compute-1 sudo[87171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emzoeibbrccyzmgdaivewvebfzwowlbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416026.4885995-258-263011985495726/AnsiballZ_command.py'
Jan 26 08:27:06 compute-1 sudo[87171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:07 compute-1 python3.9[87173]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:27:07 compute-1 sudo[87171]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:07 compute-1 sudo[87324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjamizcibdoufbcgpefmyqchzlpyuxli ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769416027.4753358-274-121357637218928/AnsiballZ_edpm_nftables_from_files.py'
Jan 26 08:27:07 compute-1 sudo[87324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:08 compute-1 python3[87326]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 26 08:27:08 compute-1 sudo[87324]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:08 compute-1 sudo[87476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wisnypelawhbopcsxglrwdxkalzgcdnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416028.5928333-290-133034097612699/AnsiballZ_stat.py'
Jan 26 08:27:08 compute-1 sudo[87476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:09 compute-1 python3.9[87478]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:27:09 compute-1 sudo[87476]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:09 compute-1 sudo[87601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cptesojtihoeelmllsfxamlddzvsgkke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416028.5928333-290-133034097612699/AnsiballZ_copy.py'
Jan 26 08:27:09 compute-1 sudo[87601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:09 compute-1 python3.9[87603]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769416028.5928333-290-133034097612699/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:27:09 compute-1 sudo[87601]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:10 compute-1 sudo[87753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqprdnmqmjtvubncmwclydiojbaesxqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416030.1867275-320-242579868682165/AnsiballZ_stat.py'
Jan 26 08:27:10 compute-1 sudo[87753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:10 compute-1 python3.9[87755]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:27:10 compute-1 sudo[87753]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:11 compute-1 sudo[87878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqyozzapdorrzjedkyqmsalpqumahzyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416030.1867275-320-242579868682165/AnsiballZ_copy.py'
Jan 26 08:27:11 compute-1 sudo[87878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:11 compute-1 python3.9[87880]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769416030.1867275-320-242579868682165/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:27:11 compute-1 sudo[87878]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:12 compute-1 sudo[88030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llhdlnjxltbskttbzgeslnkiksmopzjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416031.8452983-350-197336806460827/AnsiballZ_stat.py'
Jan 26 08:27:12 compute-1 sudo[88030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:12 compute-1 python3.9[88032]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:27:12 compute-1 sudo[88030]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:12 compute-1 sudo[88155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxffguianpfpoysecodbukmrlhllpnqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416031.8452983-350-197336806460827/AnsiballZ_copy.py'
Jan 26 08:27:12 compute-1 sudo[88155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:13 compute-1 python3.9[88157]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769416031.8452983-350-197336806460827/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:27:13 compute-1 sudo[88155]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:13 compute-1 sudo[88307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgnvfjlcmajfhnvzgcknabxphikhrvjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416033.4997687-380-160798922444090/AnsiballZ_stat.py'
Jan 26 08:27:13 compute-1 sudo[88307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:14 compute-1 python3.9[88309]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:27:14 compute-1 sudo[88307]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:14 compute-1 sudo[88432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guueobgggcwkgpajmlboodmxtswvxacq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416033.4997687-380-160798922444090/AnsiballZ_copy.py'
Jan 26 08:27:14 compute-1 sudo[88432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:14 compute-1 python3.9[88434]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769416033.4997687-380-160798922444090/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:27:14 compute-1 sudo[88432]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:15 compute-1 sudo[88584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vusrkzzcylxengpwbyozsbkvcbrhtstc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416035.1108267-410-126317027536157/AnsiballZ_stat.py'
Jan 26 08:27:15 compute-1 sudo[88584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:15 compute-1 python3.9[88586]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:27:15 compute-1 sudo[88584]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:16 compute-1 sudo[88709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqsldpsthandjdvzagqasqaetduabyih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416035.1108267-410-126317027536157/AnsiballZ_copy.py'
Jan 26 08:27:16 compute-1 sudo[88709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:16 compute-1 python3.9[88711]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769416035.1108267-410-126317027536157/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:27:16 compute-1 sudo[88709]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:17 compute-1 sudo[88861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlaooifgfetelntaqaawrlgxevhykiwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416036.7353196-440-120122955617570/AnsiballZ_file.py'
Jan 26 08:27:17 compute-1 sudo[88861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:17 compute-1 python3.9[88863]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:27:17 compute-1 sudo[88861]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:17 compute-1 sudo[89013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avmjeaxfjaadcxglkctfnceocpptflpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416037.540998-456-114925522940790/AnsiballZ_command.py'
Jan 26 08:27:17 compute-1 sudo[89013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:18 compute-1 python3.9[89015]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:27:18 compute-1 sudo[89013]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:18 compute-1 sudo[89168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvjbpzaiukpvuxnflgffinbzclhhnpji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416038.482071-472-89203316694308/AnsiballZ_blockinfile.py'
Jan 26 08:27:18 compute-1 sudo[89168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:19 compute-1 python3.9[89170]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:27:19 compute-1 sudo[89168]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:19 compute-1 sudo[89320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjoqzcvrsspwqsszkujqmxrnxkghhvio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416039.5149913-490-27651331124337/AnsiballZ_command.py'
Jan 26 08:27:19 compute-1 sudo[89320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:20 compute-1 python3.9[89322]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:27:20 compute-1 sudo[89320]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:20 compute-1 sudo[89473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzlgkbwvbiulnodlbqykpcyaqujkrltz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416040.315632-506-188217256939049/AnsiballZ_stat.py'
Jan 26 08:27:20 compute-1 sudo[89473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:20 compute-1 python3.9[89475]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:27:20 compute-1 sudo[89473]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:21 compute-1 sudo[89627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxbuwgpptlvbiqubcyzrsgjubgzzjgzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416041.0966651-522-213333753833817/AnsiballZ_command.py'
Jan 26 08:27:21 compute-1 sudo[89627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:21 compute-1 python3.9[89629]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:27:21 compute-1 sudo[89627]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:22 compute-1 sudo[89782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uehtlocvpbenfiwtnlhxpxmagedhcbmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416041.8990052-538-65858095917336/AnsiballZ_file.py'
Jan 26 08:27:22 compute-1 sudo[89782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:22 compute-1 python3.9[89784]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:27:22 compute-1 sudo[89782]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:23 compute-1 python3.9[89934]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 08:27:24 compute-1 sudo[90085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdfrjpvcfiagxsbkeqrbjlrytcjimxzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416044.3526962-618-252109359536853/AnsiballZ_command.py'
Jan 26 08:27:24 compute-1 sudo[90085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:24 compute-1 python3.9[90087]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:0e:0a:8d:1d:08:09" external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:27:24 compute-1 ovs-vsctl[90088]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:0e:0a:8d:1d:08:09 external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Jan 26 08:27:24 compute-1 sudo[90085]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:25 compute-1 sudo[90238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuibhqicrcyvnqkbngthigwwegeiydte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416045.1963217-636-244499391659609/AnsiballZ_command.py'
Jan 26 08:27:25 compute-1 sudo[90238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:25 compute-1 python3.9[90240]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:27:25 compute-1 sudo[90238]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:26 compute-1 sudo[90393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obyhchguytpwaqlwfjucpiyxrayccgzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416046.0278728-652-239172507907651/AnsiballZ_command.py'
Jan 26 08:27:26 compute-1 sudo[90393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:26 compute-1 python3.9[90395]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:27:26 compute-1 ovs-vsctl[90396]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Jan 26 08:27:26 compute-1 sudo[90393]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:27 compute-1 python3.9[90546]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:27:28 compute-1 sudo[90698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szpfvkrnlobocakfedqiauhaletcluli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416047.711163-686-53153499522613/AnsiballZ_file.py'
Jan 26 08:27:28 compute-1 sudo[90698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:28 compute-1 python3.9[90700]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:27:28 compute-1 sudo[90698]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:29 compute-1 sudo[90850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhnzrwbspdrjlasjtblvngfadvgiwepk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416048.7548833-702-50587766503525/AnsiballZ_stat.py'
Jan 26 08:27:29 compute-1 sudo[90850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:29 compute-1 python3.9[90852]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:27:29 compute-1 sudo[90850]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:29 compute-1 sudo[90928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-viwweyjhhimlezqikrutqccbdrnrjjlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416048.7548833-702-50587766503525/AnsiballZ_file.py'
Jan 26 08:27:29 compute-1 sudo[90928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:29 compute-1 python3.9[90930]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:27:29 compute-1 sudo[90928]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:30 compute-1 sudo[91080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oswzbppryvzuizrpkaqrgevopyqpyyut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416049.9938204-702-275576617982292/AnsiballZ_stat.py'
Jan 26 08:27:30 compute-1 sudo[91080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:30 compute-1 python3.9[91082]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:27:30 compute-1 sudo[91080]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:30 compute-1 sudo[91158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-immbvrwczkainpcuxowiqdxxwylalcws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416049.9938204-702-275576617982292/AnsiballZ_file.py'
Jan 26 08:27:30 compute-1 sudo[91158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:31 compute-1 python3.9[91160]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:27:31 compute-1 sudo[91158]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:31 compute-1 sudo[91310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjjlfcglnrhonstqkprlzisnotjuoqwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416051.2499568-748-135291640430019/AnsiballZ_file.py'
Jan 26 08:27:31 compute-1 sudo[91310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:31 compute-1 python3.9[91312]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:27:31 compute-1 sudo[91310]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:32 compute-1 sudo[91462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vybjqoneuylbhxeijeedytmswazzohsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416051.9606645-764-97858583149081/AnsiballZ_stat.py'
Jan 26 08:27:32 compute-1 sudo[91462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:32 compute-1 python3.9[91464]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:27:32 compute-1 sudo[91462]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:32 compute-1 sudo[91540]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edtocsfxrlxrighoblowxevmhuplfjgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416051.9606645-764-97858583149081/AnsiballZ_file.py'
Jan 26 08:27:32 compute-1 sudo[91540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:33 compute-1 python3.9[91542]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:27:33 compute-1 sudo[91540]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:33 compute-1 sudo[91692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqfmmmtooscuxparrzgtptytzajhphyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416053.3833451-788-79036763371128/AnsiballZ_stat.py'
Jan 26 08:27:33 compute-1 sudo[91692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:33 compute-1 python3.9[91694]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:27:33 compute-1 sudo[91692]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:34 compute-1 sudo[91770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wuwqncldopnlcmszewfkbxwdorgydoxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416053.3833451-788-79036763371128/AnsiballZ_file.py'
Jan 26 08:27:34 compute-1 sudo[91770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:34 compute-1 python3.9[91772]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:27:34 compute-1 sudo[91770]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:34 compute-1 sudo[91922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rphbehdnxbwisanslvjdtuscddfstmgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416054.586586-812-87438217545211/AnsiballZ_systemd.py'
Jan 26 08:27:34 compute-1 sudo[91922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:35 compute-1 python3.9[91924]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:27:35 compute-1 systemd[1]: Reloading.
Jan 26 08:27:35 compute-1 systemd-rc-local-generator[91952]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:27:35 compute-1 systemd-sysv-generator[91956]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:27:35 compute-1 sudo[91922]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:36 compute-1 sudo[92111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cknaznvgreuksswwpvfxkqmfyshjpinv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416055.8497682-828-181857347233311/AnsiballZ_stat.py'
Jan 26 08:27:36 compute-1 sudo[92111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:36 compute-1 python3.9[92113]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:27:36 compute-1 sudo[92111]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:36 compute-1 sudo[92189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvyqqtuldbmohbbvucoxwwzepinqemrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416055.8497682-828-181857347233311/AnsiballZ_file.py'
Jan 26 08:27:36 compute-1 sudo[92189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:36 compute-1 python3.9[92191]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:27:37 compute-1 sudo[92189]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:37 compute-1 sudo[92341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxvucrqrocnniqjbstyfuedrwqfirehz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416057.2027168-852-147087943164751/AnsiballZ_stat.py'
Jan 26 08:27:37 compute-1 sudo[92341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:37 compute-1 python3.9[92343]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:27:37 compute-1 sudo[92341]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:38 compute-1 sudo[92419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbhoyswlcxypqcfifaxevobmumrctxbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416057.2027168-852-147087943164751/AnsiballZ_file.py'
Jan 26 08:27:38 compute-1 sudo[92419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:38 compute-1 python3.9[92421]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:27:38 compute-1 sudo[92419]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:39 compute-1 sudo[92571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otxbrxoprfhrnbnxesxactcctxgoaosf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416058.6848426-876-230051512191065/AnsiballZ_systemd.py'
Jan 26 08:27:39 compute-1 sudo[92571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:39 compute-1 python3.9[92573]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:27:39 compute-1 systemd[1]: Reloading.
Jan 26 08:27:39 compute-1 systemd-rc-local-generator[92603]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:27:39 compute-1 systemd-sysv-generator[92606]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:27:39 compute-1 systemd[1]: Starting Create netns directory...
Jan 26 08:27:39 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 26 08:27:39 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 26 08:27:39 compute-1 systemd[1]: Finished Create netns directory.
Jan 26 08:27:39 compute-1 sudo[92571]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:40 compute-1 sudo[92767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tziitjamgtjvvggbnnelunxpsibpwfvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416059.9991453-896-280051740367264/AnsiballZ_file.py'
Jan 26 08:27:40 compute-1 sudo[92767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:40 compute-1 python3.9[92769]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:27:40 compute-1 sudo[92767]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:41 compute-1 sudo[92919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mshnksvlfelvfkeynuxorqjafbmxhzgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416060.6998277-912-124761255377353/AnsiballZ_stat.py'
Jan 26 08:27:41 compute-1 sudo[92919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:41 compute-1 python3.9[92921]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:27:41 compute-1 sudo[92919]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:41 compute-1 sudo[93042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-riqkvcxmxlotfsgwalstracibmqyvmfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416060.6998277-912-124761255377353/AnsiballZ_copy.py'
Jan 26 08:27:41 compute-1 sudo[93042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:41 compute-1 python3.9[93044]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769416060.6998277-912-124761255377353/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:27:41 compute-1 sudo[93042]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:42 compute-1 sudo[93194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwsjahrahxuoigxbymjbauqhmzqjhrgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416062.4011433-946-29380917327708/AnsiballZ_file.py'
Jan 26 08:27:42 compute-1 sudo[93194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:42 compute-1 python3.9[93196]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:27:42 compute-1 sudo[93194]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:43 compute-1 sudo[93346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftmqgyhmiuasocdilhtiicnfxbocgpnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416063.204937-962-61307575930202/AnsiballZ_file.py'
Jan 26 08:27:43 compute-1 sudo[93346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:43 compute-1 python3.9[93348]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:27:44 compute-1 sudo[93346]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:44 compute-1 sudo[93498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yinymirgtecpmavqwgmgytgrxkiayilk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416064.2370307-978-274338964554909/AnsiballZ_stat.py'
Jan 26 08:27:44 compute-1 sudo[93498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:45 compute-1 python3.9[93500]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:27:45 compute-1 sudo[93498]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:45 compute-1 sudo[93621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ardsceoalgwanwigkndqeogfvtypxolq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416064.2370307-978-274338964554909/AnsiballZ_copy.py'
Jan 26 08:27:45 compute-1 sudo[93621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:45 compute-1 python3.9[93623]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769416064.2370307-978-274338964554909/.source.json _original_basename=.ghpsa_37 follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:27:45 compute-1 sudo[93621]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:46 compute-1 python3.9[93773]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:27:49 compute-1 sudo[94194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqpxqrjemqaheodglwwllluptwingflq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416068.3676634-1058-239339064362846/AnsiballZ_container_config_data.py'
Jan 26 08:27:49 compute-1 sudo[94194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:49 compute-1 python3.9[94196]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Jan 26 08:27:49 compute-1 sudo[94194]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:50 compute-1 sudo[94346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqtgjqfqqgmvzmreoyywbwlyavogerem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416070.0136359-1080-56298183964721/AnsiballZ_container_config_hash.py'
Jan 26 08:27:50 compute-1 sudo[94346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:50 compute-1 python3.9[94348]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 26 08:27:50 compute-1 sudo[94346]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:51 compute-1 sudo[94498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htrqdnzjpdksudaxjrzrwuivfcsmizzl ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769416071.0929387-1100-30448864376984/AnsiballZ_edpm_container_manage.py'
Jan 26 08:27:51 compute-1 sudo[94498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:51 compute-1 python3[94500]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Jan 26 08:27:52 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 08:27:52 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 08:27:52 compute-1 podman[94537]: 2026-01-26 08:27:52.232207187 +0000 UTC m=+0.076406277 container create 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 26 08:27:52 compute-1 podman[94537]: 2026-01-26 08:27:52.194835707 +0000 UTC m=+0.039034837 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 26 08:27:52 compute-1 python3[94500]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 26 08:27:52 compute-1 sudo[94498]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:52 compute-1 sudo[94725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afieacxwxcguzubpsqmqubzdupwfllrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416072.6367357-1116-70129098877962/AnsiballZ_stat.py'
Jan 26 08:27:52 compute-1 sudo[94725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:53 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 08:27:53 compute-1 python3.9[94727]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:27:53 compute-1 sudo[94725]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:53 compute-1 sudo[94879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmkeanfvpfdwmlwfvlddpwqohonbyxln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416073.4948194-1134-205813951197985/AnsiballZ_file.py'
Jan 26 08:27:53 compute-1 sudo[94879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:54 compute-1 python3.9[94881]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:27:54 compute-1 sudo[94879]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:54 compute-1 sudo[94955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwcmyostvckxirmgengjbclhlphosgks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416073.4948194-1134-205813951197985/AnsiballZ_stat.py'
Jan 26 08:27:54 compute-1 sudo[94955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:54 compute-1 python3.9[94957]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:27:54 compute-1 sudo[94955]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:55 compute-1 sudo[95106]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czrwoccnosrzshkuwfxzdyaybpnpazzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416074.725753-1134-128849821690812/AnsiballZ_copy.py'
Jan 26 08:27:55 compute-1 sudo[95106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:55 compute-1 python3.9[95108]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769416074.725753-1134-128849821690812/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:27:55 compute-1 sudo[95106]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:55 compute-1 sudo[95182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqbjclfratdhnciiwarpkibehyuwpdzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416074.725753-1134-128849821690812/AnsiballZ_systemd.py'
Jan 26 08:27:55 compute-1 sudo[95182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:56 compute-1 python3.9[95184]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 08:27:56 compute-1 systemd[1]: Reloading.
Jan 26 08:27:56 compute-1 systemd-rc-local-generator[95206]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:27:56 compute-1 systemd-sysv-generator[95213]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:27:56 compute-1 sudo[95182]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:56 compute-1 sudo[95294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aaqtbmqlotygbulzpiqfolkqpdzksurq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416074.725753-1134-128849821690812/AnsiballZ_systemd.py'
Jan 26 08:27:56 compute-1 sudo[95294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:27:57 compute-1 python3.9[95296]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:27:57 compute-1 systemd[1]: Reloading.
Jan 26 08:27:57 compute-1 systemd-rc-local-generator[95326]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:27:57 compute-1 systemd-sysv-generator[95330]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:27:57 compute-1 systemd[1]: Starting ovn_controller container...
Jan 26 08:27:57 compute-1 systemd[1]: Created slice Virtual Machine and Container Slice.
Jan 26 08:27:57 compute-1 systemd[1]: Started libcrun container.
Jan 26 08:27:57 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/465b1419f1ece0689f7d5922af80e71a46f896d3392aa45a31837971c3bc9ae3/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 26 08:27:57 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0.
Jan 26 08:27:57 compute-1 podman[95337]: 2026-01-26 08:27:57.645254535 +0000 UTC m=+0.173364299 container init 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 26 08:27:57 compute-1 ovn_controller[95352]: + sudo -E kolla_set_configs
Jan 26 08:27:57 compute-1 podman[95337]: 2026-01-26 08:27:57.699258706 +0000 UTC m=+0.227368420 container start 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true)
Jan 26 08:27:57 compute-1 edpm-start-podman-container[95337]: ovn_controller
Jan 26 08:27:57 compute-1 systemd[1]: Created slice User Slice of UID 0.
Jan 26 08:27:57 compute-1 systemd[1]: Starting User Runtime Directory /run/user/0...
Jan 26 08:27:57 compute-1 systemd[1]: Finished User Runtime Directory /run/user/0.
Jan 26 08:27:57 compute-1 systemd[1]: Starting User Manager for UID 0...
Jan 26 08:27:57 compute-1 systemd[95386]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Jan 26 08:27:57 compute-1 edpm-start-podman-container[95336]: Creating additional drop-in dependency for "ovn_controller" (17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0)
Jan 26 08:27:57 compute-1 podman[95359]: 2026-01-26 08:27:57.830569607 +0000 UTC m=+0.107926590 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 08:27:57 compute-1 systemd[1]: 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0-51c4507f1d723710.service: Main process exited, code=exited, status=1/FAILURE
Jan 26 08:27:57 compute-1 systemd[1]: 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0-51c4507f1d723710.service: Failed with result 'exit-code'.
Jan 26 08:27:57 compute-1 systemd[1]: Reloading.
Jan 26 08:27:57 compute-1 systemd[95386]: Queued start job for default target Main User Target.
Jan 26 08:27:57 compute-1 systemd[95386]: Created slice User Application Slice.
Jan 26 08:27:57 compute-1 systemd[95386]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Jan 26 08:27:57 compute-1 systemd[95386]: Started Daily Cleanup of User's Temporary Directories.
Jan 26 08:27:57 compute-1 systemd[95386]: Reached target Paths.
Jan 26 08:27:57 compute-1 systemd[95386]: Reached target Timers.
Jan 26 08:27:57 compute-1 systemd-sysv-generator[95446]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:27:57 compute-1 systemd-rc-local-generator[95442]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:27:57 compute-1 systemd[95386]: Starting D-Bus User Message Bus Socket...
Jan 26 08:27:57 compute-1 systemd[95386]: Starting Create User's Volatile Files and Directories...
Jan 26 08:27:57 compute-1 systemd[95386]: Listening on D-Bus User Message Bus Socket.
Jan 26 08:27:57 compute-1 systemd[95386]: Reached target Sockets.
Jan 26 08:27:57 compute-1 systemd[95386]: Finished Create User's Volatile Files and Directories.
Jan 26 08:27:57 compute-1 systemd[95386]: Reached target Basic System.
Jan 26 08:27:57 compute-1 systemd[95386]: Reached target Main User Target.
Jan 26 08:27:57 compute-1 systemd[95386]: Startup finished in 147ms.
Jan 26 08:27:58 compute-1 systemd[1]: Started User Manager for UID 0.
Jan 26 08:27:58 compute-1 systemd[1]: Started ovn_controller container.
Jan 26 08:27:58 compute-1 systemd[1]: Started Session c1 of User root.
Jan 26 08:27:58 compute-1 sudo[95294]: pam_unix(sudo:session): session closed for user root
Jan 26 08:27:58 compute-1 ovn_controller[95352]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 26 08:27:58 compute-1 ovn_controller[95352]: INFO:__main__:Validating config file
Jan 26 08:27:58 compute-1 ovn_controller[95352]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 26 08:27:58 compute-1 ovn_controller[95352]: INFO:__main__:Writing out command to execute
Jan 26 08:27:58 compute-1 systemd[1]: session-c1.scope: Deactivated successfully.
Jan 26 08:27:58 compute-1 ovn_controller[95352]: ++ cat /run_command
Jan 26 08:27:58 compute-1 ovn_controller[95352]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 26 08:27:58 compute-1 ovn_controller[95352]: + ARGS=
Jan 26 08:27:58 compute-1 ovn_controller[95352]: + sudo kolla_copy_cacerts
Jan 26 08:27:58 compute-1 systemd[1]: Started Session c2 of User root.
Jan 26 08:27:58 compute-1 systemd[1]: session-c2.scope: Deactivated successfully.
Jan 26 08:27:58 compute-1 ovn_controller[95352]: + [[ ! -n '' ]]
Jan 26 08:27:58 compute-1 ovn_controller[95352]: + . kolla_extend_start
Jan 26 08:27:58 compute-1 ovn_controller[95352]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Jan 26 08:27:58 compute-1 ovn_controller[95352]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 26 08:27:58 compute-1 ovn_controller[95352]: + umask 0022
Jan 26 08:27:58 compute-1 ovn_controller[95352]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Jan 26 08:27:58 compute-1 ovn_controller[95352]: 2026-01-26T08:27:58Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 26 08:27:58 compute-1 ovn_controller[95352]: 2026-01-26T08:27:58Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 26 08:27:58 compute-1 ovn_controller[95352]: 2026-01-26T08:27:58Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Jan 26 08:27:58 compute-1 ovn_controller[95352]: 2026-01-26T08:27:58Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Jan 26 08:27:58 compute-1 ovn_controller[95352]: 2026-01-26T08:27:58Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 26 08:27:58 compute-1 ovn_controller[95352]: 2026-01-26T08:27:58Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Jan 26 08:27:58 compute-1 NetworkManager[55451]: <info>  [1769416078.3682] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Jan 26 08:27:58 compute-1 NetworkManager[55451]: <info>  [1769416078.3693] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 08:27:58 compute-1 NetworkManager[55451]: <warn>  [1769416078.3697] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 08:27:58 compute-1 NetworkManager[55451]: <info>  [1769416078.3708] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Jan 26 08:27:58 compute-1 NetworkManager[55451]: <info>  [1769416078.3717] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Jan 26 08:27:58 compute-1 NetworkManager[55451]: <info>  [1769416078.3723] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 26 08:27:58 compute-1 kernel: br-int: entered promiscuous mode
Jan 26 08:27:58 compute-1 ovn_controller[95352]: 2026-01-26T08:27:58Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 26 08:27:58 compute-1 ovn_controller[95352]: 2026-01-26T08:27:58Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 26 08:27:58 compute-1 ovn_controller[95352]: 2026-01-26T08:27:58Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 26 08:27:58 compute-1 ovn_controller[95352]: 2026-01-26T08:27:58Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Jan 26 08:27:58 compute-1 ovn_controller[95352]: 2026-01-26T08:27:58Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Jan 26 08:27:58 compute-1 ovn_controller[95352]: 2026-01-26T08:27:58Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Jan 26 08:27:58 compute-1 ovn_controller[95352]: 2026-01-26T08:27:58Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 26 08:27:58 compute-1 ovn_controller[95352]: 2026-01-26T08:27:58Z|00014|main|INFO|OVS feature set changed, force recompute.
Jan 26 08:27:58 compute-1 ovn_controller[95352]: 2026-01-26T08:27:58Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 26 08:27:58 compute-1 ovn_controller[95352]: 2026-01-26T08:27:58Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 26 08:27:58 compute-1 ovn_controller[95352]: 2026-01-26T08:27:58Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 26 08:27:58 compute-1 ovn_controller[95352]: 2026-01-26T08:27:58Z|00018|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 26 08:27:58 compute-1 ovn_controller[95352]: 2026-01-26T08:27:58Z|00019|main|INFO|OVS feature set changed, force recompute.
Jan 26 08:27:58 compute-1 ovn_controller[95352]: 2026-01-26T08:27:58Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 26 08:27:58 compute-1 ovn_controller[95352]: 2026-01-26T08:27:58Z|00021|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Jan 26 08:27:58 compute-1 ovn_controller[95352]: 2026-01-26T08:27:58Z|00022|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Jan 26 08:27:58 compute-1 ovn_controller[95352]: 2026-01-26T08:27:58Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Jan 26 08:27:58 compute-1 ovn_controller[95352]: 2026-01-26T08:27:58Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Jan 26 08:27:58 compute-1 ovn_controller[95352]: 2026-01-26T08:27:58Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 26 08:27:58 compute-1 ovn_controller[95352]: 2026-01-26T08:27:58Z|00001|statctrl(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 26 08:27:58 compute-1 ovn_controller[95352]: 2026-01-26T08:27:58Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 26 08:27:58 compute-1 ovn_controller[95352]: 2026-01-26T08:27:58Z|00002|rconn(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 26 08:27:58 compute-1 ovn_controller[95352]: 2026-01-26T08:27:58Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 26 08:27:58 compute-1 ovn_controller[95352]: 2026-01-26T08:27:58Z|00003|rconn(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 26 08:27:58 compute-1 NetworkManager[55451]: <info>  [1769416078.4062] manager: (ovn-b3ed04-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Jan 26 08:27:58 compute-1 NetworkManager[55451]: <info>  [1769416078.4076] manager: (ovn-b62a0d-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/18)
Jan 26 08:27:58 compute-1 systemd-udevd[95489]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 08:27:58 compute-1 kernel: genev_sys_6081: entered promiscuous mode
Jan 26 08:27:58 compute-1 systemd-udevd[95490]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 08:27:58 compute-1 NetworkManager[55451]: <info>  [1769416078.4354] device (genev_sys_6081): carrier: link connected
Jan 26 08:27:58 compute-1 NetworkManager[55451]: <info>  [1769416078.4363] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/19)
Jan 26 08:27:59 compute-1 python3.9[95620]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 26 08:28:00 compute-1 sudo[95770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-raymtnexsqlgxcxltqoljnibasulayjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416079.7591114-1224-251189367496555/AnsiballZ_stat.py'
Jan 26 08:28:00 compute-1 sudo[95770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:00 compute-1 python3.9[95772]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:28:00 compute-1 sudo[95770]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:00 compute-1 sudo[95893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bspocmmjufgepyxniwhkzpcdfusptfrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416079.7591114-1224-251189367496555/AnsiballZ_copy.py'
Jan 26 08:28:00 compute-1 sudo[95893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:00 compute-1 python3.9[95895]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769416079.7591114-1224-251189367496555/.source.yaml _original_basename=.bz67tjdp follow=False checksum=3d3c50e6cf00bb55760fe1300c70bcbaf7c35a93 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:28:00 compute-1 sudo[95893]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:01 compute-1 sudo[96046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqxredhtjngbeaaiqhqhludpeepfilyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416081.096065-1254-85064635398835/AnsiballZ_command.py'
Jan 26 08:28:01 compute-1 sudo[96046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:01 compute-1 python3.9[96048]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:28:01 compute-1 ovs-vsctl[96049]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Jan 26 08:28:01 compute-1 sudo[96046]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:02 compute-1 sudo[96199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbgixhbivqvcbhmaekyoflntfuxiwvfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416081.9152281-1270-137495307073673/AnsiballZ_command.py'
Jan 26 08:28:02 compute-1 sudo[96199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:02 compute-1 python3.9[96201]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:28:02 compute-1 ovs-vsctl[96203]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Jan 26 08:28:02 compute-1 sudo[96199]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:03 compute-1 sudo[96354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkmmdanwqbsrmioqzdtvljrntgkjsnbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416082.8325741-1298-113448415503701/AnsiballZ_command.py'
Jan 26 08:28:03 compute-1 sudo[96354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:03 compute-1 python3.9[96356]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:28:03 compute-1 ovs-vsctl[96357]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Jan 26 08:28:03 compute-1 sudo[96354]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:03 compute-1 sshd-session[84871]: Connection closed by 192.168.122.30 port 41144
Jan 26 08:28:03 compute-1 sshd-session[84868]: pam_unix(sshd:session): session closed for user zuul
Jan 26 08:28:03 compute-1 systemd[1]: session-21.scope: Deactivated successfully.
Jan 26 08:28:03 compute-1 systemd[1]: session-21.scope: Consumed 56.242s CPU time.
Jan 26 08:28:03 compute-1 systemd-logind[788]: Session 21 logged out. Waiting for processes to exit.
Jan 26 08:28:03 compute-1 systemd-logind[788]: Removed session 21.
Jan 26 08:28:08 compute-1 systemd[1]: Stopping User Manager for UID 0...
Jan 26 08:28:08 compute-1 systemd[95386]: Activating special unit Exit the Session...
Jan 26 08:28:08 compute-1 systemd[95386]: Stopped target Main User Target.
Jan 26 08:28:08 compute-1 systemd[95386]: Stopped target Basic System.
Jan 26 08:28:08 compute-1 systemd[95386]: Stopped target Paths.
Jan 26 08:28:08 compute-1 systemd[95386]: Stopped target Sockets.
Jan 26 08:28:08 compute-1 systemd[95386]: Stopped target Timers.
Jan 26 08:28:08 compute-1 systemd[95386]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 26 08:28:08 compute-1 systemd[95386]: Closed D-Bus User Message Bus Socket.
Jan 26 08:28:08 compute-1 systemd[95386]: Stopped Create User's Volatile Files and Directories.
Jan 26 08:28:08 compute-1 systemd[95386]: Removed slice User Application Slice.
Jan 26 08:28:08 compute-1 systemd[95386]: Reached target Shutdown.
Jan 26 08:28:08 compute-1 systemd[95386]: Finished Exit the Session.
Jan 26 08:28:08 compute-1 systemd[95386]: Reached target Exit the Session.
Jan 26 08:28:08 compute-1 systemd[1]: user@0.service: Deactivated successfully.
Jan 26 08:28:08 compute-1 systemd[1]: Stopped User Manager for UID 0.
Jan 26 08:28:08 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/0...
Jan 26 08:28:08 compute-1 systemd[1]: run-user-0.mount: Deactivated successfully.
Jan 26 08:28:08 compute-1 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Jan 26 08:28:08 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/0.
Jan 26 08:28:08 compute-1 systemd[1]: Removed slice User Slice of UID 0.
Jan 26 08:28:09 compute-1 sshd-session[96384]: Accepted publickey for zuul from 192.168.122.30 port 38916 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 08:28:09 compute-1 systemd-logind[788]: New session 23 of user zuul.
Jan 26 08:28:09 compute-1 systemd[1]: Started Session 23 of User zuul.
Jan 26 08:28:09 compute-1 sshd-session[96384]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 08:28:10 compute-1 python3.9[96537]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 08:28:11 compute-1 sudo[96691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdxqubnxuxvezdggixgyddoozuqynbni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416090.9284441-44-60016601456785/AnsiballZ_file.py'
Jan 26 08:28:11 compute-1 sudo[96691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:11 compute-1 python3.9[96693]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:28:11 compute-1 sudo[96691]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:12 compute-1 sudo[96843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aahlphgrkqkjtwewtbwfwcnkummxihxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416091.9571724-44-246017292467478/AnsiballZ_file.py'
Jan 26 08:28:12 compute-1 sudo[96843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:12 compute-1 python3.9[96845]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:28:12 compute-1 sudo[96843]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:13 compute-1 sudo[96995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkzcioyyvfrsjzqfxjvweqvqgewpcnwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416092.72787-44-81848856296537/AnsiballZ_file.py'
Jan 26 08:28:13 compute-1 sudo[96995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:13 compute-1 python3.9[96997]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:28:13 compute-1 sudo[96995]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:13 compute-1 sudo[97147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yumakaolkqimqklhfceqmctrgdkmcvqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416093.4886346-44-1189604794279/AnsiballZ_file.py'
Jan 26 08:28:13 compute-1 sudo[97147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:14 compute-1 python3.9[97149]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:28:14 compute-1 sudo[97147]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:14 compute-1 sudo[97299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpmckffwfljspbjwfglmktgqtwbppubt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416094.2783685-44-153157301726147/AnsiballZ_file.py'
Jan 26 08:28:14 compute-1 sudo[97299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:14 compute-1 python3.9[97301]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:28:14 compute-1 sudo[97299]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:15 compute-1 python3.9[97451]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 08:28:16 compute-1 sudo[97602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjnzejklnhtechrmiqxlltzgiaalrszr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416096.1522224-132-61789662708729/AnsiballZ_seboolean.py'
Jan 26 08:28:16 compute-1 sudo[97602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:16 compute-1 python3.9[97604]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 26 08:28:17 compute-1 sudo[97602]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:18 compute-1 python3.9[97754]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:28:19 compute-1 python3.9[97875]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769416097.833667-148-86208099140621/.source follow=False _original_basename=haproxy.j2 checksum=a5072e7b19ca96a1f495d94f97f31903737cfd27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:28:20 compute-1 python3.9[98025]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:28:20 compute-1 python3.9[98146]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769416099.5961254-178-74576918426142/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:28:21 compute-1 sudo[98296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ychvcnbxiggtaivmsifdswjxqedjfqtm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416101.3894129-212-121704926923410/AnsiballZ_setup.py'
Jan 26 08:28:21 compute-1 sudo[98296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:22 compute-1 python3.9[98298]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 08:28:22 compute-1 sudo[98296]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:22 compute-1 sudo[98380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uatqwackraxypmgcrfvkdyokufykcuou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416101.3894129-212-121704926923410/AnsiballZ_dnf.py'
Jan 26 08:28:22 compute-1 sudo[98380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:23 compute-1 python3.9[98382]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 08:28:24 compute-1 sudo[98380]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:25 compute-1 sudo[98533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdjgkgevocdhhgfanuxkjgvjbjdfizux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416104.672443-236-120301710735665/AnsiballZ_systemd.py'
Jan 26 08:28:25 compute-1 sudo[98533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:25 compute-1 python3.9[98535]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 08:28:25 compute-1 sudo[98533]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:26 compute-1 python3.9[98688]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:28:27 compute-1 python3.9[98809]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769416106.023705-252-241819184375236/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:28:28 compute-1 python3.9[98959]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:28:28 compute-1 ovn_controller[95352]: 2026-01-26T08:28:28Z|00025|memory|INFO|16384 kB peak resident set size after 30.2 seconds
Jan 26 08:28:28 compute-1 ovn_controller[95352]: 2026-01-26T08:28:28Z|00026|memory|INFO|idl-cells-OVN_Southbound:256 idl-cells-Open_vSwitch:528 ofctrl_desired_flow_usage-KB:6 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Jan 26 08:28:28 compute-1 podman[99054]: 2026-01-26 08:28:28.63791612 +0000 UTC m=+0.190885083 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 26 08:28:28 compute-1 python3.9[99091]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769416107.4345703-252-182968685896009/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:28:30 compute-1 python3.9[99254]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:28:30 compute-1 python3.9[99375]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769416109.5578372-340-141458761736997/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:28:31 compute-1 python3.9[99525]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:28:31 compute-1 python3.9[99646]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769416110.87161-340-182401644670402/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:28:32 compute-1 python3.9[99796]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:28:33 compute-1 sudo[99948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrlkxjqmlcdbulcmpvggubfsctzrypvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416113.0606074-416-4312756947467/AnsiballZ_file.py'
Jan 26 08:28:33 compute-1 sudo[99948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:33 compute-1 python3.9[99950]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:28:33 compute-1 sudo[99948]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:34 compute-1 sudo[100100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwlhjpbeqosjpskgqobztxziazmtieth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416113.9051816-432-57016604184140/AnsiballZ_stat.py'
Jan 26 08:28:34 compute-1 sudo[100100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:34 compute-1 python3.9[100102]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:28:34 compute-1 sudo[100100]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:34 compute-1 sudo[100178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkbskfvwdkjngnsdkjccitwcvzxflmva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416113.9051816-432-57016604184140/AnsiballZ_file.py'
Jan 26 08:28:34 compute-1 sudo[100178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:34 compute-1 python3.9[100180]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:28:34 compute-1 sudo[100178]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:35 compute-1 sudo[100330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awqxmufhexhsuvpallqfodfjjnvpistn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416115.1073995-432-96555945960245/AnsiballZ_stat.py'
Jan 26 08:28:35 compute-1 sudo[100330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:35 compute-1 python3.9[100332]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:28:35 compute-1 sudo[100330]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:36 compute-1 sudo[100408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uruvcoygydlppuhucllqfgqdyhnmovyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416115.1073995-432-96555945960245/AnsiballZ_file.py'
Jan 26 08:28:36 compute-1 sudo[100408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:36 compute-1 python3.9[100410]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:28:36 compute-1 sudo[100408]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:36 compute-1 sudo[100560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpgiebrekepnmdorbfktyjfdbxsxelxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416116.4710772-478-7263882899821/AnsiballZ_file.py'
Jan 26 08:28:36 compute-1 sudo[100560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:37 compute-1 python3.9[100562]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:28:37 compute-1 sudo[100560]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:37 compute-1 sudo[100712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hefnfqmvaayotksbozgfxfrgkeahgmdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416117.2948773-494-244127502743142/AnsiballZ_stat.py'
Jan 26 08:28:37 compute-1 sudo[100712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:37 compute-1 python3.9[100714]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:28:37 compute-1 sudo[100712]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:38 compute-1 sudo[100790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqwzwzdydbmyvtunmyakkhfdjgoveegg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416117.2948773-494-244127502743142/AnsiballZ_file.py'
Jan 26 08:28:38 compute-1 sudo[100790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:38 compute-1 python3.9[100792]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:28:38 compute-1 sudo[100790]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:39 compute-1 sudo[100942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqqplzpfahxomywnswusbkqddaczyckd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416118.6434896-518-160907898793234/AnsiballZ_stat.py'
Jan 26 08:28:39 compute-1 sudo[100942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:39 compute-1 python3.9[100944]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:28:39 compute-1 sudo[100942]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:39 compute-1 sudo[101020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eogevkpamlaibdroknmqjdcyfopaxyqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416118.6434896-518-160907898793234/AnsiballZ_file.py'
Jan 26 08:28:39 compute-1 sudo[101020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:39 compute-1 python3.9[101022]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:28:39 compute-1 sudo[101020]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:40 compute-1 sudo[101172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdrhbphbctfmyafapuzspqkcwkxxcjcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416120.0396812-542-73201752930707/AnsiballZ_systemd.py'
Jan 26 08:28:40 compute-1 sudo[101172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:40 compute-1 python3.9[101174]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:28:40 compute-1 systemd[1]: Reloading.
Jan 26 08:28:40 compute-1 systemd-sysv-generator[101207]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:28:40 compute-1 systemd-rc-local-generator[101203]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:28:41 compute-1 sudo[101172]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:41 compute-1 sudo[101362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xiwzjxvvgcqlghwbvsolvoomnphokyqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416121.264695-558-207244189689412/AnsiballZ_stat.py'
Jan 26 08:28:41 compute-1 sudo[101362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:41 compute-1 python3.9[101364]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:28:41 compute-1 sudo[101362]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:42 compute-1 sudo[101440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjxnwkfbiqrvqstbhnloftjaipikacju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416121.264695-558-207244189689412/AnsiballZ_file.py'
Jan 26 08:28:42 compute-1 sudo[101440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:42 compute-1 python3.9[101442]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:28:42 compute-1 sudo[101440]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:43 compute-1 sudo[101592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpvlwxumvxilnskasttoyazzwhoehtph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416122.7269917-582-95256306480299/AnsiballZ_stat.py'
Jan 26 08:28:43 compute-1 sudo[101592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:43 compute-1 python3.9[101594]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:28:43 compute-1 sudo[101592]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:43 compute-1 sudo[101670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqznflvsyyfffkrpqrpuqmwdnnhstbse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416122.7269917-582-95256306480299/AnsiballZ_file.py'
Jan 26 08:28:43 compute-1 sudo[101670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:43 compute-1 python3.9[101672]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:28:43 compute-1 sudo[101670]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:44 compute-1 sudo[101822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqrzldixbpqaabboxpuzfymfekuusizb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416124.0784469-606-98036281779619/AnsiballZ_systemd.py'
Jan 26 08:28:44 compute-1 sudo[101822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:44 compute-1 python3.9[101824]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:28:44 compute-1 systemd[1]: Reloading.
Jan 26 08:28:44 compute-1 systemd-rc-local-generator[101854]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:28:44 compute-1 systemd-sysv-generator[101857]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:28:46 compute-1 systemd[1]: Starting Create netns directory...
Jan 26 08:28:46 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 26 08:28:46 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 26 08:28:46 compute-1 systemd[1]: Finished Create netns directory.
Jan 26 08:28:46 compute-1 sudo[101822]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:46 compute-1 sudo[102015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-suesxeqwbsczvhzbqkmyotnjjzawxkvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416126.4841177-626-20226464954421/AnsiballZ_file.py'
Jan 26 08:28:46 compute-1 sudo[102015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:47 compute-1 python3.9[102017]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:28:47 compute-1 sudo[102015]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:47 compute-1 sudo[102167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djsmtraoqocihwwdjgaoiejomecgkjxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416127.2942405-642-142660898226568/AnsiballZ_stat.py'
Jan 26 08:28:47 compute-1 sudo[102167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:47 compute-1 python3.9[102169]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:28:47 compute-1 sudo[102167]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:48 compute-1 sudo[102290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzasaqeajmcnmgomfeudfrfaivxgzrpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416127.2942405-642-142660898226568/AnsiballZ_copy.py'
Jan 26 08:28:48 compute-1 sudo[102290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:48 compute-1 python3.9[102292]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769416127.2942405-642-142660898226568/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:28:48 compute-1 sudo[102290]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:49 compute-1 sudo[102442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tffjegdeeevjabzlqkiwuhbsftsqctxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416128.9332154-676-217154621981599/AnsiballZ_file.py'
Jan 26 08:28:49 compute-1 sudo[102442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:49 compute-1 python3.9[102444]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:28:49 compute-1 sudo[102442]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:50 compute-1 sudo[102594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmfrwxskgpvshvdemhdslihnyifvubaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416129.9653738-692-92103195187778/AnsiballZ_file.py'
Jan 26 08:28:50 compute-1 sudo[102594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:50 compute-1 python3.9[102596]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:28:50 compute-1 sudo[102594]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:51 compute-1 sudo[102746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syttedbttewpnsxgbvpbsqnwnltbuqru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416130.7693613-708-142763681282579/AnsiballZ_stat.py'
Jan 26 08:28:51 compute-1 sudo[102746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:51 compute-1 python3.9[102748]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:28:51 compute-1 sudo[102746]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:51 compute-1 sudo[102869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-reuhgkxlambpioexcuuugnquvkohtedd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416130.7693613-708-142763681282579/AnsiballZ_copy.py'
Jan 26 08:28:51 compute-1 sudo[102869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:52 compute-1 python3.9[102871]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769416130.7693613-708-142763681282579/.source.json _original_basename=.fpw746gb follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:28:52 compute-1 sudo[102869]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:52 compute-1 python3.9[103021]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:28:55 compute-1 sudo[103442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwvavwtddkfpaikafpgydabbmnnkqddz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416134.603302-788-18986467449815/AnsiballZ_container_config_data.py'
Jan 26 08:28:55 compute-1 sudo[103442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:55 compute-1 python3.9[103444]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Jan 26 08:28:55 compute-1 sudo[103442]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:56 compute-1 sudo[103594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbieddxahlucasmdubuzqiebcooaemyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416135.7440276-810-29800834842649/AnsiballZ_container_config_hash.py'
Jan 26 08:28:56 compute-1 sudo[103594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:56 compute-1 python3.9[103596]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 26 08:28:56 compute-1 sudo[103594]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:57 compute-1 sudo[103746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utxzlimlazrpeofvuxywxuldtaxfcxak ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769416136.8593152-830-29660218292264/AnsiballZ_edpm_container_manage.py'
Jan 26 08:28:57 compute-1 sudo[103746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:57 compute-1 python3[103748]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Jan 26 08:28:57 compute-1 podman[103785]: 2026-01-26 08:28:57.963216421 +0000 UTC m=+0.059386776 container create c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 08:28:57 compute-1 podman[103785]: 2026-01-26 08:28:57.934092482 +0000 UTC m=+0.030262857 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 08:28:57 compute-1 python3[103748]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 08:28:58 compute-1 sudo[103746]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:58 compute-1 sudo[103983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbbgilifazfogkmpduphfhsbedixnaud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416138.3971121-846-278417730042804/AnsiballZ_stat.py'
Jan 26 08:28:58 compute-1 sudo[103983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:58 compute-1 podman[103947]: 2026-01-26 08:28:58.843852466 +0000 UTC m=+0.145352354 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_controller)
Jan 26 08:28:58 compute-1 python3.9[103990]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:28:58 compute-1 sudo[103983]: pam_unix(sudo:session): session closed for user root
Jan 26 08:28:59 compute-1 sudo[104154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjforrhpxkqirbctnjpbkblslsbzvvdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416139.299662-864-93389122843234/AnsiballZ_file.py'
Jan 26 08:28:59 compute-1 sudo[104154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:28:59 compute-1 python3.9[104156]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:28:59 compute-1 sudo[104154]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:00 compute-1 sudo[104230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uidhqlhrsogkvaihzssnhotwwqerjzye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416139.299662-864-93389122843234/AnsiballZ_stat.py'
Jan 26 08:29:00 compute-1 sudo[104230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:00 compute-1 python3.9[104232]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:29:00 compute-1 sudo[104230]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:01 compute-1 sudo[104381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyeqyvffcfipkipxeodffernuvxqzrwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416140.5188394-864-221575791673536/AnsiballZ_copy.py'
Jan 26 08:29:01 compute-1 sudo[104381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:01 compute-1 python3.9[104383]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769416140.5188394-864-221575791673536/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:29:01 compute-1 sudo[104381]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:01 compute-1 sudo[104457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwyqcaltefcvmjxkxsaigqqhvakwmzyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416140.5188394-864-221575791673536/AnsiballZ_systemd.py'
Jan 26 08:29:01 compute-1 sudo[104457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:01 compute-1 python3.9[104459]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 08:29:01 compute-1 systemd[1]: Reloading.
Jan 26 08:29:01 compute-1 systemd-rc-local-generator[104483]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:29:01 compute-1 systemd-sysv-generator[104489]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:29:02 compute-1 sudo[104457]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:02 compute-1 sudo[104568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkfpcvtoiijfgfyxkoqguvwoqvhjofrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416140.5188394-864-221575791673536/AnsiballZ_systemd.py'
Jan 26 08:29:02 compute-1 sudo[104568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:02 compute-1 python3.9[104570]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:29:02 compute-1 systemd[1]: Reloading.
Jan 26 08:29:02 compute-1 systemd-sysv-generator[104601]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:29:02 compute-1 systemd-rc-local-generator[104598]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:29:03 compute-1 systemd[1]: Starting ovn_metadata_agent container...
Jan 26 08:29:03 compute-1 systemd[1]: Started libcrun container.
Jan 26 08:29:03 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c5fdaf2e91a663fabdbc65524015d65d79bff558a42e605356016ce4eb6a1a03/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Jan 26 08:29:03 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c5fdaf2e91a663fabdbc65524015d65d79bff558a42e605356016ce4eb6a1a03/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 08:29:03 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed.
Jan 26 08:29:03 compute-1 podman[104611]: 2026-01-26 08:29:03.361916562 +0000 UTC m=+0.187401591 container init c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 26 08:29:03 compute-1 ovn_metadata_agent[104627]: + sudo -E kolla_set_configs
Jan 26 08:29:03 compute-1 podman[104611]: 2026-01-26 08:29:03.405588476 +0000 UTC m=+0.231073505 container start c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent)
Jan 26 08:29:03 compute-1 ovn_metadata_agent[104627]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 26 08:29:03 compute-1 ovn_metadata_agent[104627]: INFO:__main__:Validating config file
Jan 26 08:29:03 compute-1 ovn_metadata_agent[104627]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 26 08:29:03 compute-1 ovn_metadata_agent[104627]: INFO:__main__:Copying service configuration files
Jan 26 08:29:03 compute-1 ovn_metadata_agent[104627]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Jan 26 08:29:03 compute-1 ovn_metadata_agent[104627]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Jan 26 08:29:03 compute-1 ovn_metadata_agent[104627]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Jan 26 08:29:03 compute-1 ovn_metadata_agent[104627]: INFO:__main__:Writing out command to execute
Jan 26 08:29:03 compute-1 ovn_metadata_agent[104627]: INFO:__main__:Setting permission for /var/lib/neutron
Jan 26 08:29:03 compute-1 ovn_metadata_agent[104627]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Jan 26 08:29:03 compute-1 ovn_metadata_agent[104627]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Jan 26 08:29:03 compute-1 ovn_metadata_agent[104627]: INFO:__main__:Setting permission for /var/lib/neutron/external
Jan 26 08:29:03 compute-1 ovn_metadata_agent[104627]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Jan 26 08:29:03 compute-1 ovn_metadata_agent[104627]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Jan 26 08:29:03 compute-1 ovn_metadata_agent[104627]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Jan 26 08:29:03 compute-1 ovn_metadata_agent[104627]: ++ cat /run_command
Jan 26 08:29:03 compute-1 ovn_metadata_agent[104627]: + CMD=neutron-ovn-metadata-agent
Jan 26 08:29:03 compute-1 ovn_metadata_agent[104627]: + ARGS=
Jan 26 08:29:03 compute-1 ovn_metadata_agent[104627]: + sudo kolla_copy_cacerts
Jan 26 08:29:03 compute-1 edpm-start-podman-container[104611]: ovn_metadata_agent
Jan 26 08:29:03 compute-1 ovn_metadata_agent[104627]: + [[ ! -n '' ]]
Jan 26 08:29:03 compute-1 ovn_metadata_agent[104627]: + . kolla_extend_start
Jan 26 08:29:03 compute-1 ovn_metadata_agent[104627]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Jan 26 08:29:03 compute-1 ovn_metadata_agent[104627]: Running command: 'neutron-ovn-metadata-agent'
Jan 26 08:29:03 compute-1 ovn_metadata_agent[104627]: + umask 0022
Jan 26 08:29:03 compute-1 ovn_metadata_agent[104627]: + exec neutron-ovn-metadata-agent
Jan 26 08:29:03 compute-1 podman[104634]: 2026-01-26 08:29:03.547420909 +0000 UTC m=+0.133290957 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 26 08:29:03 compute-1 edpm-start-podman-container[104610]: Creating additional drop-in dependency for "ovn_metadata_agent" (c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed)
Jan 26 08:29:03 compute-1 systemd[1]: Reloading.
Jan 26 08:29:03 compute-1 systemd-rc-local-generator[104703]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:29:03 compute-1 systemd-sysv-generator[104709]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:29:03 compute-1 systemd[1]: Started ovn_metadata_agent container.
Jan 26 08:29:04 compute-1 sudo[104568]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:04 compute-1 python3.9[104866]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.243 104632 INFO neutron.common.config [-] Logging enabled!
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.243 104632 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.243 104632 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.244 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.244 104632 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.244 104632 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.244 104632 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.244 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.244 104632 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.244 104632 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.245 104632 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.245 104632 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.245 104632 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.245 104632 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.245 104632 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.245 104632 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.245 104632 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.245 104632 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.245 104632 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.245 104632 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.246 104632 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.246 104632 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.246 104632 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.246 104632 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.246 104632 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.246 104632 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.246 104632 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.246 104632 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.246 104632 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.246 104632 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.247 104632 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.247 104632 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.247 104632 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.247 104632 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.247 104632 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.247 104632 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.247 104632 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.247 104632 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.247 104632 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.248 104632 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.248 104632 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.248 104632 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.248 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.248 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.248 104632 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.248 104632 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.248 104632 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.248 104632 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.248 104632 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.248 104632 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.249 104632 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.249 104632 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.249 104632 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.249 104632 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.249 104632 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.249 104632 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.249 104632 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.249 104632 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.249 104632 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.249 104632 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.250 104632 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.250 104632 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.250 104632 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.250 104632 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.250 104632 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.250 104632 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.250 104632 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.250 104632 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.250 104632 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.251 104632 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.251 104632 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.251 104632 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.251 104632 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.251 104632 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.251 104632 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.251 104632 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.251 104632 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.251 104632 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.251 104632 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.252 104632 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.252 104632 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.252 104632 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.252 104632 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.252 104632 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.252 104632 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.252 104632 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.252 104632 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.252 104632 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.252 104632 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.253 104632 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.253 104632 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.253 104632 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.253 104632 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.253 104632 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.253 104632 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.253 104632 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.253 104632 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.253 104632 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.253 104632 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.253 104632 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.253 104632 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.254 104632 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.254 104632 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.254 104632 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.254 104632 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.254 104632 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.254 104632 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.254 104632 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.254 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.254 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.255 104632 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.255 104632 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.255 104632 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.255 104632 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.255 104632 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.255 104632 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.255 104632 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.255 104632 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.255 104632 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.255 104632 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.256 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.256 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.256 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.256 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.256 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.256 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.256 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.256 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.256 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.257 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.257 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.257 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.257 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.257 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.257 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.257 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.257 104632 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.257 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.257 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.258 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.258 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.258 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.258 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.258 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.258 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.258 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.258 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.258 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.258 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.259 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.259 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.259 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.259 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.259 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.259 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.259 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.259 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.259 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.260 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.260 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.260 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.260 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.260 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.260 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.260 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.260 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.260 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.260 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.261 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.261 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.261 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.261 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.261 104632 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.261 104632 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.261 104632 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.261 104632 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.261 104632 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.261 104632 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.262 104632 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.262 104632 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.262 104632 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.262 104632 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.262 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.262 104632 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.262 104632 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.262 104632 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.262 104632 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.263 104632 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.263 104632 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.263 104632 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.263 104632 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.263 104632 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.263 104632 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.263 104632 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.263 104632 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.263 104632 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.263 104632 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.264 104632 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.264 104632 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.264 104632 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.264 104632 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.264 104632 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.264 104632 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.264 104632 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.264 104632 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.264 104632 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.265 104632 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.265 104632 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.265 104632 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.265 104632 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.265 104632 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.265 104632 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.265 104632 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.265 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.265 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.265 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.266 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.266 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.266 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.266 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.266 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.266 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.266 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.266 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.266 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.266 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.267 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.267 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.267 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.267 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.267 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.267 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.267 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.267 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.267 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.268 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.268 104632 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.268 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.268 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.268 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.268 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.268 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.268 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.268 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.268 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.269 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.269 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.269 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.269 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.269 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.269 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.269 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.269 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.269 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.270 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.270 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.270 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.270 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.270 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.270 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.270 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.270 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.270 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.270 104632 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.271 104632 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.271 104632 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.271 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.271 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.271 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.271 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.271 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.271 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.271 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.272 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.272 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.272 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.272 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.272 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.272 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.272 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.272 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.272 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.272 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.273 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.273 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.273 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.273 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.273 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.273 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.273 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.273 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.273 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.274 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.274 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.274 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.274 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.274 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.274 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.274 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.274 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.274 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.274 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.275 104632 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.275 104632 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.341 104632 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.341 104632 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.341 104632 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.341 104632 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.342 104632 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.354 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 2f671d48-fb23-4421-893d-f2ec1411c819 (UUID: 2f671d48-fb23-4421-893d-f2ec1411c819) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.377 104632 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.378 104632 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.378 104632 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.378 104632 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.380 104632 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.390 104632 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.399 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '2f671d48-fb23-4421-893d-f2ec1411c819'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], external_ids={}, name=2f671d48-fb23-4421-893d-f2ec1411c819, nb_cfg_timestamp=1769416086396, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.399 104632 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f35a6f97b50>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.400 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.400 104632 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.401 104632 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.401 104632 INFO oslo_service.service [-] Starting 1 workers
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.405 104632 DEBUG oslo_service.service [-] Started child 104891 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.408 104891 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-1941149'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.408 104632 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmptlmmc9tv/privsep.sock']
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.426 104891 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.427 104891 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.427 104891 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.430 104891 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.436 104891 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 26 08:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.440 104891 INFO eventlet.wsgi.server [-] (104891) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Jan 26 08:29:05 compute-1 sudo[105021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zesdaywltymfemeuzujibghkrockqtki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416145.454746-954-63203795450384/AnsiballZ_stat.py'
Jan 26 08:29:05 compute-1 sudo[105021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:05 compute-1 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Jan 26 08:29:05 compute-1 python3.9[105023]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:29:06 compute-1 sudo[105021]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:06 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:06.090 104632 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 26 08:29:06 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:06.091 104632 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmptlmmc9tv/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 26 08:29:06 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.960 105024 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 26 08:29:06 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.965 105024 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 26 08:29:06 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.967 105024 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Jan 26 08:29:06 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:05.968 105024 INFO oslo.privsep.daemon [-] privsep daemon running as pid 105024
Jan 26 08:29:06 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:06.095 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[896dcb26-130a-4c7c-867f-3e950fb03fa2]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:29:06 compute-1 sudo[105151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edufdqvyalaonzmaurybyzoucxeyfqoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416145.454746-954-63203795450384/AnsiballZ_copy.py'
Jan 26 08:29:06 compute-1 sudo[105151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:06 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:06.600 105024 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:29:06 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:06.600 105024 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:29:06 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:06.601 105024 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:29:06 compute-1 python3.9[105153]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769416145.454746-954-63203795450384/.source.yaml _original_basename=.4in4l7pb follow=False checksum=d564b99b419552a86cb3879f877a549d88f0ac62 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:29:06 compute-1 sudo[105151]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.104 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[783fc1a4-76cf-42b8-ad52-a3489a53b528]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.107 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, column=external_ids, values=({'neutron:ovn-metadata-id': 'ce8d3b83-05bb-5ce8-b3e2-2e0f0472e8b4'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:29:07 compute-1 sshd-session[96387]: Connection closed by 192.168.122.30 port 38916
Jan 26 08:29:07 compute-1 sshd-session[96384]: pam_unix(sshd:session): session closed for user zuul
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.121 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:29:07 compute-1 systemd[1]: session-23.scope: Deactivated successfully.
Jan 26 08:29:07 compute-1 systemd[1]: session-23.scope: Consumed 44.016s CPU time.
Jan 26 08:29:07 compute-1 systemd-logind[788]: Session 23 logged out. Waiting for processes to exit.
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.130 104632 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.131 104632 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.131 104632 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.131 104632 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.131 104632 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.132 104632 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.132 104632 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.133 104632 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.133 104632 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.133 104632 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 systemd-logind[788]: Removed session 23.
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.134 104632 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.134 104632 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.134 104632 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.135 104632 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.135 104632 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.135 104632 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.136 104632 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.136 104632 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.136 104632 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.136 104632 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.136 104632 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.137 104632 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.137 104632 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.137 104632 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.138 104632 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.138 104632 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.138 104632 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.138 104632 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.139 104632 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.139 104632 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.139 104632 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.139 104632 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.140 104632 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.140 104632 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.140 104632 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.140 104632 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.141 104632 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.141 104632 DEBUG oslo_service.service [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.141 104632 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.142 104632 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.142 104632 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.142 104632 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.142 104632 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.143 104632 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.143 104632 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.143 104632 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.143 104632 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.144 104632 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.144 104632 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.144 104632 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.145 104632 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.145 104632 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.145 104632 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.145 104632 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.146 104632 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.146 104632 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.146 104632 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.146 104632 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.147 104632 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.147 104632 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.147 104632 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.147 104632 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.148 104632 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.148 104632 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.148 104632 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.148 104632 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.149 104632 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.149 104632 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.149 104632 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.149 104632 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.150 104632 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.150 104632 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.150 104632 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.150 104632 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.151 104632 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.151 104632 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.151 104632 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.151 104632 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.152 104632 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.152 104632 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.152 104632 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.153 104632 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.153 104632 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.153 104632 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.154 104632 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.154 104632 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.154 104632 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.154 104632 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.155 104632 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.155 104632 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.155 104632 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.155 104632 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.156 104632 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.156 104632 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.156 104632 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.157 104632 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.157 104632 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.157 104632 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.157 104632 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.158 104632 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.158 104632 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.158 104632 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.158 104632 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.159 104632 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.159 104632 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.159 104632 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.159 104632 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.160 104632 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.160 104632 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.160 104632 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.161 104632 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.161 104632 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.161 104632 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.161 104632 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.162 104632 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.162 104632 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.162 104632 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.162 104632 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.163 104632 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.163 104632 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.163 104632 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.164 104632 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.164 104632 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.164 104632 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.164 104632 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.165 104632 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.165 104632 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.165 104632 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.166 104632 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.166 104632 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.166 104632 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.166 104632 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.167 104632 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.167 104632 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.167 104632 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.167 104632 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.168 104632 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.168 104632 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.168 104632 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.168 104632 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.169 104632 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.169 104632 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.169 104632 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.169 104632 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.170 104632 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.170 104632 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.170 104632 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.171 104632 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.171 104632 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.171 104632 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.171 104632 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.172 104632 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.172 104632 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.172 104632 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.172 104632 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.172 104632 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.173 104632 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.173 104632 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.173 104632 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.173 104632 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.174 104632 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.174 104632 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.174 104632 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.174 104632 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.174 104632 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.175 104632 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.175 104632 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.175 104632 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.176 104632 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.176 104632 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.176 104632 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.176 104632 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.176 104632 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.177 104632 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.177 104632 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.177 104632 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.177 104632 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.178 104632 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.178 104632 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.178 104632 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.178 104632 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.178 104632 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.178 104632 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.179 104632 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.179 104632 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.179 104632 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.179 104632 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.179 104632 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.179 104632 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.179 104632 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.180 104632 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.180 104632 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.180 104632 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.180 104632 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.180 104632 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.180 104632 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.181 104632 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.181 104632 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.181 104632 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.181 104632 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.181 104632 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.181 104632 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.182 104632 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.182 104632 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.182 104632 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.182 104632 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.182 104632 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.182 104632 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.182 104632 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.183 104632 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.183 104632 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.183 104632 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.183 104632 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.183 104632 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.183 104632 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.183 104632 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.184 104632 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.184 104632 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.184 104632 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.184 104632 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.184 104632 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.184 104632 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.184 104632 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.185 104632 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.185 104632 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.185 104632 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.185 104632 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.185 104632 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.185 104632 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.185 104632 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.186 104632 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.186 104632 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.186 104632 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.186 104632 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.186 104632 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.186 104632 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.186 104632 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.187 104632 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.187 104632 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.187 104632 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.187 104632 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.187 104632 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.187 104632 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.188 104632 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.188 104632 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.188 104632 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.188 104632 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.188 104632 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.188 104632 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.188 104632 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.189 104632 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.189 104632 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.189 104632 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.189 104632 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.189 104632 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.189 104632 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.189 104632 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.190 104632 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.190 104632 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.190 104632 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.190 104632 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.190 104632 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.190 104632 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.190 104632 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.191 104632 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.191 104632 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.191 104632 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.191 104632 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.191 104632 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.191 104632 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.191 104632 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.192 104632 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.192 104632 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.192 104632 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.192 104632 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.192 104632 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.192 104632 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.193 104632 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.193 104632 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.193 104632 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.193 104632 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.193 104632 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.193 104632 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.193 104632 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.194 104632 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.194 104632 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.194 104632 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.194 104632 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.194 104632 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.194 104632 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.194 104632 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.195 104632 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.195 104632 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.195 104632 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.195 104632 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.195 104632 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.195 104632 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.196 104632 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.196 104632 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.196 104632 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.196 104632 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.196 104632 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.196 104632 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:29:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:29:07.196 104632 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 26 08:29:13 compute-1 sshd-session[105178]: Accepted publickey for zuul from 192.168.122.30 port 60880 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 08:29:13 compute-1 systemd-logind[788]: New session 24 of user zuul.
Jan 26 08:29:13 compute-1 systemd[1]: Started Session 24 of User zuul.
Jan 26 08:29:13 compute-1 sshd-session[105178]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 08:29:14 compute-1 python3.9[105331]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 08:29:15 compute-1 sudo[105485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avcakqweplgikckwsppkrlupcabzqwah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416154.9700258-44-201708281163859/AnsiballZ_command.py'
Jan 26 08:29:15 compute-1 sudo[105485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:15 compute-1 python3.9[105487]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:29:15 compute-1 sudo[105485]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:16 compute-1 sudo[105650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzmndexdmfjciiujdyqunxiouzuqmtmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416156.247698-66-173930302097248/AnsiballZ_systemd_service.py'
Jan 26 08:29:16 compute-1 sudo[105650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:17 compute-1 python3.9[105652]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 08:29:17 compute-1 systemd[1]: Reloading.
Jan 26 08:29:17 compute-1 systemd-rc-local-generator[105678]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:29:17 compute-1 systemd-sysv-generator[105682]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:29:17 compute-1 sudo[105650]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:18 compute-1 python3.9[105836]: ansible-ansible.builtin.service_facts Invoked
Jan 26 08:29:18 compute-1 network[105853]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 08:29:18 compute-1 network[105854]: 'network-scripts' will be removed from distribution in near future.
Jan 26 08:29:18 compute-1 network[105855]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 08:29:25 compute-1 sudo[106114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijdyjeyofcruvgobxjbcwxduqqexksrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416164.8846896-104-114844822988938/AnsiballZ_systemd_service.py'
Jan 26 08:29:25 compute-1 sudo[106114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:25 compute-1 irqbalance[785]: Cannot change IRQ 26 affinity: Operation not permitted
Jan 26 08:29:25 compute-1 irqbalance[785]: IRQ 26 affinity is now unmanaged
Jan 26 08:29:25 compute-1 python3.9[106116]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:29:25 compute-1 sudo[106114]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:26 compute-1 sudo[106267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkbilivmrmusbemyhimvfamcgroufpdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416165.8265584-104-63033209286109/AnsiballZ_systemd_service.py'
Jan 26 08:29:26 compute-1 sudo[106267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:26 compute-1 python3.9[106269]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:29:26 compute-1 sudo[106267]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:26 compute-1 sudo[106420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqnuehphsxodqonkxdxufnihovqbbnvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416166.607663-104-256315285266940/AnsiballZ_systemd_service.py'
Jan 26 08:29:26 compute-1 sudo[106420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:27 compute-1 python3.9[106422]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:29:27 compute-1 sudo[106420]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:27 compute-1 sudo[106573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqzgeloqwryqdfifgiucnbkjxdigxnoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416167.5260353-104-266247518994421/AnsiballZ_systemd_service.py'
Jan 26 08:29:27 compute-1 sudo[106573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:28 compute-1 python3.9[106575]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:29:28 compute-1 sudo[106573]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:28 compute-1 sudo[106726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blzywrtchvklcxydnfdjigvwyncvoydz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416168.5018463-104-125587935442787/AnsiballZ_systemd_service.py'
Jan 26 08:29:28 compute-1 sudo[106726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:29 compute-1 podman[106728]: 2026-01-26 08:29:29.029750943 +0000 UTC m=+0.134597817 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2)
Jan 26 08:29:29 compute-1 python3.9[106729]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:29:29 compute-1 sudo[106726]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:29 compute-1 sudo[106906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwpcjscgguopglqpevoeyeilrcadwlob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416169.3808897-104-135689750429119/AnsiballZ_systemd_service.py'
Jan 26 08:29:29 compute-1 sudo[106906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:30 compute-1 python3.9[106908]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:29:31 compute-1 sudo[106906]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:31 compute-1 sudo[107059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hshabcsxtdrbizqumtsfraynzputeiuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416171.317209-104-202327659016843/AnsiballZ_systemd_service.py'
Jan 26 08:29:31 compute-1 sudo[107059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:31 compute-1 python3.9[107061]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:29:33 compute-1 sudo[107059]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:33 compute-1 podman[107162]: 2026-01-26 08:29:33.853252477 +0000 UTC m=+0.106501586 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 26 08:29:33 compute-1 sudo[107231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nthpdzktjkiafodddrhizsypzzqaxram ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416173.3950274-208-81694233737326/AnsiballZ_file.py'
Jan 26 08:29:33 compute-1 sudo[107231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:34 compute-1 python3.9[107233]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:29:34 compute-1 sudo[107231]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:34 compute-1 sudo[107383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flpicvpcxeseixnvqikyqydnzcsprzvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416174.2733586-208-274824516845535/AnsiballZ_file.py'
Jan 26 08:29:34 compute-1 sudo[107383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:34 compute-1 python3.9[107385]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:29:34 compute-1 sudo[107383]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:35 compute-1 sudo[107535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxfiqonbezynvfyjrabpwnkvvmmgznro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416174.9428189-208-35021516796498/AnsiballZ_file.py'
Jan 26 08:29:35 compute-1 sudo[107535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:35 compute-1 python3.9[107537]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:29:35 compute-1 sudo[107535]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:36 compute-1 sudo[107687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sohakdyjefamskdaurargvcnhewopbjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416175.651982-208-230943991221827/AnsiballZ_file.py'
Jan 26 08:29:36 compute-1 sudo[107687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:36 compute-1 python3.9[107689]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:29:36 compute-1 sudo[107687]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:36 compute-1 sudo[107839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbfsnucpsgascdastazvyqkzjkqqbrsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416176.4059532-208-134738397368178/AnsiballZ_file.py'
Jan 26 08:29:36 compute-1 sudo[107839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:37 compute-1 python3.9[107841]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:29:37 compute-1 sudo[107839]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:37 compute-1 sudo[107991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvhqvcqbmokpepfnoxljwrbokwdvhayp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416177.213553-208-132273229923675/AnsiballZ_file.py'
Jan 26 08:29:37 compute-1 sudo[107991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:37 compute-1 python3.9[107993]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:29:37 compute-1 sudo[107991]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:38 compute-1 sudo[108143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkfbptrcxibqouradffiusvgefrylhpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416178.0097427-208-15371911166021/AnsiballZ_file.py'
Jan 26 08:29:38 compute-1 sudo[108143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:38 compute-1 python3.9[108145]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:29:38 compute-1 sudo[108143]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:39 compute-1 sudo[108295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdfkavfjqutkdzptvsnboivplwkmmbwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416178.774475-308-121213919685648/AnsiballZ_file.py'
Jan 26 08:29:39 compute-1 sudo[108295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:39 compute-1 python3.9[108297]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:29:39 compute-1 sudo[108295]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:39 compute-1 sudo[108447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgugxzeusmpzrsjzcqrvicgehtialcfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416179.5630293-308-123278751842862/AnsiballZ_file.py'
Jan 26 08:29:39 compute-1 sudo[108447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:40 compute-1 python3.9[108449]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:29:40 compute-1 sudo[108447]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:40 compute-1 sudo[108599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qoinepjqgfsmvjqcuwmfktfmjortftbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416180.3288243-308-92294362514091/AnsiballZ_file.py'
Jan 26 08:29:40 compute-1 sudo[108599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:40 compute-1 python3.9[108601]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:29:40 compute-1 sudo[108599]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:41 compute-1 sudo[108751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndnkirhnwtlxrzpsnfgzdzhrglhcyqyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416181.0701895-308-225241388846420/AnsiballZ_file.py'
Jan 26 08:29:41 compute-1 sudo[108751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:41 compute-1 python3.9[108753]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:29:41 compute-1 sudo[108751]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:42 compute-1 sudo[108903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lamjiyjjodjpqksuxfcjwxwcydcdtzog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416181.8844614-308-165022328086043/AnsiballZ_file.py'
Jan 26 08:29:42 compute-1 sudo[108903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:42 compute-1 python3.9[108905]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:29:42 compute-1 sudo[108903]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:42 compute-1 sudo[109055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krbribbbmmfoyfrnhppfhuuxnjahifmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416182.6240904-308-168873093765891/AnsiballZ_file.py'
Jan 26 08:29:42 compute-1 sudo[109055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:43 compute-1 python3.9[109057]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:29:43 compute-1 sudo[109055]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:43 compute-1 sudo[109207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msyazbjcsfeswoqewplpeeratpsqztqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416183.3296018-308-266511519030046/AnsiballZ_file.py'
Jan 26 08:29:43 compute-1 sudo[109207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:43 compute-1 python3.9[109209]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:29:43 compute-1 sudo[109207]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:44 compute-1 sudo[109359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tytkmbmkgiggdtittrxypsznqiclrnqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416184.2721674-410-203420222810440/AnsiballZ_command.py'
Jan 26 08:29:44 compute-1 sudo[109359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:44 compute-1 python3.9[109361]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:29:44 compute-1 sudo[109359]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:45 compute-1 python3.9[109513]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 08:29:46 compute-1 sudo[109663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbaiazazdsnuzfszfqcuchvcudwinpin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416186.4147794-447-74510307316723/AnsiballZ_systemd_service.py'
Jan 26 08:29:46 compute-1 sudo[109663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:47 compute-1 python3.9[109665]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 08:29:47 compute-1 systemd[1]: Reloading.
Jan 26 08:29:47 compute-1 systemd-rc-local-generator[109694]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:29:47 compute-1 systemd-sysv-generator[109697]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:29:47 compute-1 sudo[109663]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:47 compute-1 sudo[109850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxgqumvhubpwxnkvqkccgkfjtoydybue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416187.5989785-462-55695708149713/AnsiballZ_command.py'
Jan 26 08:29:47 compute-1 sudo[109850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:48 compute-1 python3.9[109852]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:29:48 compute-1 sudo[109850]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:48 compute-1 sudo[110003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtvkqmvuzscbgaouiizbyqiikfhogcmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416188.2693486-462-3590942837563/AnsiballZ_command.py'
Jan 26 08:29:48 compute-1 sudo[110003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:48 compute-1 python3.9[110005]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:29:48 compute-1 sudo[110003]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:49 compute-1 sudo[110156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axojnslmvpmxykijzvfjbchepoyblcub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416188.8998432-462-266821300171717/AnsiballZ_command.py'
Jan 26 08:29:49 compute-1 sudo[110156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:49 compute-1 python3.9[110158]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:29:49 compute-1 sudo[110156]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:50 compute-1 sudo[110309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcooxylxxbvputrwslnxsqviywikvpud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416189.6879022-462-61771506078327/AnsiballZ_command.py'
Jan 26 08:29:50 compute-1 sudo[110309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:50 compute-1 python3.9[110311]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:29:50 compute-1 sudo[110309]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:50 compute-1 sudo[110462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbsgseagekgsuzazlacavgcvmpcuntjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416190.4167888-462-236154888472347/AnsiballZ_command.py'
Jan 26 08:29:50 compute-1 sudo[110462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:50 compute-1 python3.9[110464]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:29:50 compute-1 sudo[110462]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:51 compute-1 sudo[110615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysdjgscxaxdmkwajkfcrlzsskrcoksbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416191.1249907-462-229573271080297/AnsiballZ_command.py'
Jan 26 08:29:51 compute-1 sudo[110615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:51 compute-1 python3.9[110617]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:29:51 compute-1 sudo[110615]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:52 compute-1 sudo[110768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-romhuhcgtfmjmkxdiwueaxesukltshut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416191.9161682-462-152800905664135/AnsiballZ_command.py'
Jan 26 08:29:52 compute-1 sudo[110768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:52 compute-1 python3.9[110770]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:29:52 compute-1 sudo[110768]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:53 compute-1 sudo[110921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lktavxxfesntpdunlgfrxltkdofjqurz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416192.9878807-570-154029603059281/AnsiballZ_getent.py'
Jan 26 08:29:53 compute-1 sudo[110921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:53 compute-1 python3.9[110923]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Jan 26 08:29:53 compute-1 sudo[110921]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:54 compute-1 sudo[111074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yscuerujvtswxmieegnwodtamrutslah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416193.9092498-586-249643776803844/AnsiballZ_group.py'
Jan 26 08:29:54 compute-1 sudo[111074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:54 compute-1 python3.9[111076]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 08:29:54 compute-1 groupadd[111077]: group added to /etc/group: name=libvirt, GID=42473
Jan 26 08:29:54 compute-1 groupadd[111077]: group added to /etc/gshadow: name=libvirt
Jan 26 08:29:54 compute-1 groupadd[111077]: new group: name=libvirt, GID=42473
Jan 26 08:29:54 compute-1 sudo[111074]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:55 compute-1 sudo[111232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqmufezoerohbkcrmzlqcosvakddakrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416194.9539592-602-279989762794062/AnsiballZ_user.py'
Jan 26 08:29:55 compute-1 sudo[111232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:55 compute-1 python3.9[111234]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 26 08:29:55 compute-1 useradd[111236]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Jan 26 08:29:55 compute-1 sudo[111232]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:56 compute-1 sudo[111392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkebqsriezujaoywkpjyiuuolgasjprn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416196.3151784-624-56237811775130/AnsiballZ_setup.py'
Jan 26 08:29:56 compute-1 sudo[111392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:56 compute-1 python3.9[111394]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 08:29:57 compute-1 sudo[111392]: pam_unix(sudo:session): session closed for user root
Jan 26 08:29:57 compute-1 sudo[111476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-negnbrfizjxuftpufloolhbpeggzjgmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416196.3151784-624-56237811775130/AnsiballZ_dnf.py'
Jan 26 08:29:57 compute-1 sudo[111476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:29:57 compute-1 python3.9[111478]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 08:29:59 compute-1 podman[111486]: 2026-01-26 08:29:59.893588998 +0000 UTC m=+0.145893953 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 26 08:30:04 compute-1 podman[111516]: 2026-01-26 08:30:04.786395042 +0000 UTC m=+0.051040553 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 26 08:30:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:30:05.276 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:30:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:30:05.278 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:30:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:30:05.278 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:30:14 compute-1 sshd-session[111709]: Connection closed by 159.223.236.81 port 43844
Jan 26 08:30:21 compute-1 sshd-session[111716]: Connection closed by 91.231.89.41 port 37419
Jan 26 08:30:25 compute-1 kernel: SELinux:  Converting 2763 SID table entries...
Jan 26 08:30:25 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 08:30:25 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 26 08:30:25 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 08:30:25 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 26 08:30:25 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 08:30:25 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 08:30:25 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 08:30:30 compute-1 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Jan 26 08:30:30 compute-1 podman[111726]: 2026-01-26 08:30:30.88403735 +0000 UTC m=+0.132488748 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 08:30:31 compute-1 sshd-session[111717]: Connection closed by 91.231.89.42 port 59611
Jan 26 08:30:34 compute-1 kernel: SELinux:  Converting 2763 SID table entries...
Jan 26 08:30:34 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 08:30:34 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 26 08:30:34 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 08:30:34 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 26 08:30:34 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 08:30:34 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 08:30:34 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 08:30:35 compute-1 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Jan 26 08:30:35 compute-1 podman[111760]: 2026-01-26 08:30:35.818494954 +0000 UTC m=+0.068226799 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 26 08:30:46 compute-1 sshd-session[111800]: banner exchange: Connection from 91.231.89.45 port 38757: invalid format
Jan 26 08:30:49 compute-1 sshd-session[111878]: Connection closed by 91.231.89.187 port 39771
Jan 26 08:31:01 compute-1 podman[118667]: 2026-01-26 08:31:01.881556025 +0000 UTC m=+0.136782158 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 
Base Image, org.label-schema.schema-version=1.0)
Jan 26 08:31:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:31:05.278 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:31:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:31:05.279 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:31:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:31:05.279 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:31:06 compute-1 podman[120781]: 2026-01-26 08:31:06.826964589 +0000 UTC m=+0.076620513 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 26 08:31:32 compute-1 podman[128697]: 2026-01-26 08:31:32.88321088 +0000 UTC m=+0.132300460 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 08:31:37 compute-1 podman[128728]: 2026-01-26 08:31:37.820507835 +0000 UTC m=+0.077085335 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 26 08:31:37 compute-1 kernel: SELinux:  Converting 2764 SID table entries...
Jan 26 08:31:38 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 08:31:38 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 26 08:31:38 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 08:31:38 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 26 08:31:38 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 08:31:38 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 08:31:38 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 08:31:39 compute-1 groupadd[128754]: group added to /etc/group: name=dnsmasq, GID=993
Jan 26 08:31:39 compute-1 groupadd[128754]: group added to /etc/gshadow: name=dnsmasq
Jan 26 08:31:39 compute-1 groupadd[128754]: new group: name=dnsmasq, GID=993
Jan 26 08:31:39 compute-1 useradd[128761]: new user: name=dnsmasq, UID=992, GID=993, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Jan 26 08:31:39 compute-1 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Jan 26 08:31:39 compute-1 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Jan 26 08:31:39 compute-1 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Jan 26 08:31:40 compute-1 groupadd[128774]: group added to /etc/group: name=clevis, GID=992
Jan 26 08:31:40 compute-1 groupadd[128774]: group added to /etc/gshadow: name=clevis
Jan 26 08:31:40 compute-1 groupadd[128774]: new group: name=clevis, GID=992
Jan 26 08:31:40 compute-1 useradd[128781]: new user: name=clevis, UID=991, GID=992, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Jan 26 08:31:40 compute-1 usermod[128791]: add 'clevis' to group 'tss'
Jan 26 08:31:40 compute-1 usermod[128791]: add 'clevis' to shadow group 'tss'
Jan 26 08:31:42 compute-1 polkitd[43618]: Reloading rules
Jan 26 08:31:42 compute-1 polkitd[43618]: Collecting garbage unconditionally...
Jan 26 08:31:42 compute-1 polkitd[43618]: Loading rules from directory /etc/polkit-1/rules.d
Jan 26 08:31:42 compute-1 polkitd[43618]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 26 08:31:42 compute-1 polkitd[43618]: Finished loading, compiling and executing 3 rules
Jan 26 08:31:42 compute-1 polkitd[43618]: Reloading rules
Jan 26 08:31:42 compute-1 polkitd[43618]: Collecting garbage unconditionally...
Jan 26 08:31:42 compute-1 polkitd[43618]: Loading rules from directory /etc/polkit-1/rules.d
Jan 26 08:31:42 compute-1 polkitd[43618]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 26 08:31:42 compute-1 polkitd[43618]: Finished loading, compiling and executing 3 rules
Jan 26 08:31:44 compute-1 groupadd[128981]: group added to /etc/group: name=ceph, GID=167
Jan 26 08:31:44 compute-1 groupadd[128981]: group added to /etc/gshadow: name=ceph
Jan 26 08:31:44 compute-1 groupadd[128981]: new group: name=ceph, GID=167
Jan 26 08:31:44 compute-1 useradd[128987]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Jan 26 08:31:47 compute-1 systemd[1]: Stopping OpenSSH server daemon...
Jan 26 08:31:47 compute-1 sshd[1007]: Received signal 15; terminating.
Jan 26 08:31:47 compute-1 systemd[1]: sshd.service: Deactivated successfully.
Jan 26 08:31:47 compute-1 systemd[1]: Stopped OpenSSH server daemon.
Jan 26 08:31:47 compute-1 systemd[1]: sshd.service: Consumed 1.674s CPU time, read 32.0K from disk, written 0B to disk.
Jan 26 08:31:47 compute-1 systemd[1]: Stopped target sshd-keygen.target.
Jan 26 08:31:47 compute-1 systemd[1]: Stopping sshd-keygen.target...
Jan 26 08:31:47 compute-1 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 08:31:47 compute-1 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 08:31:47 compute-1 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 08:31:47 compute-1 systemd[1]: Reached target sshd-keygen.target.
Jan 26 08:31:47 compute-1 systemd[1]: Starting OpenSSH server daemon...
Jan 26 08:31:47 compute-1 sshd[129506]: Server listening on 0.0.0.0 port 22.
Jan 26 08:31:47 compute-1 sshd[129506]: Server listening on :: port 22.
Jan 26 08:31:47 compute-1 systemd[1]: Started OpenSSH server daemon.
Jan 26 08:31:48 compute-1 sshd-session[129533]: Connection closed by authenticating user root 159.223.236.81 port 39544 [preauth]
Jan 26 08:31:49 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 08:31:49 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 26 08:31:49 compute-1 systemd[1]: Reloading.
Jan 26 08:31:49 compute-1 systemd-sysv-generator[129767]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:31:49 compute-1 systemd-rc-local-generator[129761]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:31:49 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 08:31:52 compute-1 sudo[111476]: pam_unix(sudo:session): session closed for user root
Jan 26 08:31:53 compute-1 sudo[133933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdpsvxbloorafrmvcmbzhzddvwftdsxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416312.9719656-648-246349019233308/AnsiballZ_systemd.py'
Jan 26 08:31:53 compute-1 sudo[133933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:31:54 compute-1 python3.9[133958]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 08:31:54 compute-1 systemd[1]: Reloading.
Jan 26 08:31:54 compute-1 systemd-rc-local-generator[134359]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:31:54 compute-1 systemd-sysv-generator[134363]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:31:54 compute-1 sudo[133933]: pam_unix(sudo:session): session closed for user root
Jan 26 08:31:55 compute-1 sudo[135030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdaicqogheuarcxbxmqitlotnzlsutcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416314.7203124-648-35542134231582/AnsiballZ_systemd.py'
Jan 26 08:31:55 compute-1 sudo[135030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:31:55 compute-1 python3.9[135051]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 08:31:55 compute-1 systemd[1]: Reloading.
Jan 26 08:31:55 compute-1 systemd-rc-local-generator[135472]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:31:55 compute-1 systemd-sysv-generator[135477]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:31:55 compute-1 sudo[135030]: pam_unix(sudo:session): session closed for user root
Jan 26 08:31:56 compute-1 sudo[136168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbucfrbpfshwxevydlskpwzjrlpjvizf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416315.9100857-648-116968181609035/AnsiballZ_systemd.py'
Jan 26 08:31:56 compute-1 sudo[136168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:31:56 compute-1 python3.9[136187]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 08:31:57 compute-1 systemd[1]: Reloading.
Jan 26 08:31:57 compute-1 systemd-sysv-generator[137243]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:31:57 compute-1 systemd-rc-local-generator[137237]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:31:57 compute-1 sudo[136168]: pam_unix(sudo:session): session closed for user root
Jan 26 08:31:58 compute-1 sudo[137938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jusylhgcjwbwvwqqkxjmdasqgcfjzvdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416318.1428554-648-27861175077435/AnsiballZ_systemd.py'
Jan 26 08:31:58 compute-1 sudo[137938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:31:58 compute-1 python3.9[137959]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 08:31:58 compute-1 systemd[1]: Reloading.
Jan 26 08:31:58 compute-1 systemd-rc-local-generator[138262]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:31:58 compute-1 systemd-sysv-generator[138269]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:31:59 compute-1 sudo[137938]: pam_unix(sudo:session): session closed for user root
Jan 26 08:31:59 compute-1 sudo[138948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkptdkadzrsmypblcmdrrdijeyqpucmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416319.413399-706-13405803248537/AnsiballZ_systemd.py'
Jan 26 08:31:59 compute-1 sudo[138948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:00 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 08:32:00 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 26 08:32:00 compute-1 systemd[1]: man-db-cache-update.service: Consumed 13.564s CPU time.
Jan 26 08:32:00 compute-1 systemd[1]: run-r7b27f46446044178ba2503b7ba51aed1.service: Deactivated successfully.
Jan 26 08:32:00 compute-1 python3.9[138970]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 08:32:00 compute-1 systemd[1]: Reloading.
Jan 26 08:32:00 compute-1 systemd-sysv-generator[139067]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:32:00 compute-1 systemd-rc-local-generator[139064]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:32:00 compute-1 sudo[138948]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:01 compute-1 sudo[139223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-roozaivhpqygfhtzowfgdyygnuykqcjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416320.738978-706-105023185830664/AnsiballZ_systemd.py'
Jan 26 08:32:01 compute-1 sudo[139223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:01 compute-1 python3.9[139225]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 08:32:01 compute-1 systemd[1]: Reloading.
Jan 26 08:32:01 compute-1 systemd-sysv-generator[139261]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:32:01 compute-1 systemd-rc-local-generator[139257]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:32:01 compute-1 sudo[139223]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:02 compute-1 sudo[139414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njqeyaziyyjncmwbyjkzturfkwblvahz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416321.9958827-706-9002408152050/AnsiballZ_systemd.py'
Jan 26 08:32:02 compute-1 sudo[139414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:02 compute-1 python3.9[139416]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 08:32:02 compute-1 systemd[1]: Reloading.
Jan 26 08:32:02 compute-1 systemd-sysv-generator[139451]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:32:02 compute-1 systemd-rc-local-generator[139447]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:32:03 compute-1 sudo[139414]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:03 compute-1 podman[139455]: 2026-01-26 08:32:03.266597175 +0000 UTC m=+0.144756312 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 26 08:32:03 compute-1 sudo[139628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrspscrtrwcxwowiusxkzzakujynnogo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416323.3189986-706-280988178416172/AnsiballZ_systemd.py'
Jan 26 08:32:03 compute-1 sudo[139628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:04 compute-1 python3.9[139630]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 08:32:04 compute-1 sudo[139628]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:04 compute-1 sudo[139783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngzxbrpuuddqdzolhzretubkawhnrddc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416324.3656523-706-239650377766709/AnsiballZ_systemd.py'
Jan 26 08:32:04 compute-1 sudo[139783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:05 compute-1 python3.9[139785]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 08:32:05 compute-1 systemd[1]: Reloading.
Jan 26 08:32:05 compute-1 systemd-rc-local-generator[139812]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:32:05 compute-1 systemd-sysv-generator[139818]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:32:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:32:05.280 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:32:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:32:05.281 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:32:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:32:05.281 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:32:05 compute-1 sudo[139783]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:06 compute-1 sudo[139972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-airibeczkqvqadzycaqicopxegujefxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416326.280608-778-257069017163846/AnsiballZ_systemd.py'
Jan 26 08:32:06 compute-1 sudo[139972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:06 compute-1 python3.9[139974]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 08:32:07 compute-1 systemd[1]: Reloading.
Jan 26 08:32:07 compute-1 systemd-rc-local-generator[140003]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:32:07 compute-1 systemd-sysv-generator[140008]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:32:07 compute-1 systemd[1]: Listening on libvirt proxy daemon socket.
Jan 26 08:32:07 compute-1 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Jan 26 08:32:07 compute-1 sudo[139972]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:08 compute-1 sudo[140176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hveloowdilqwzpnsqswzuunvijvquxcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416327.6683235-794-191838545951777/AnsiballZ_systemd.py'
Jan 26 08:32:08 compute-1 sudo[140176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:08 compute-1 podman[140138]: 2026-01-26 08:32:08.088283016 +0000 UTC m=+0.094555050 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 26 08:32:08 compute-1 python3.9[140185]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 08:32:08 compute-1 sudo[140176]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:09 compute-1 sudo[140339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhmdcrnfipqnaeikpfhurdlzgjcebkcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416328.6636996-794-188346625130553/AnsiballZ_systemd.py'
Jan 26 08:32:09 compute-1 sudo[140339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:09 compute-1 python3.9[140341]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 08:32:10 compute-1 sudo[140339]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:10 compute-1 sudo[140494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuprlhojvbnamjbessmahpzfruqlmfmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416330.6123266-794-163855576677522/AnsiballZ_systemd.py'
Jan 26 08:32:10 compute-1 sudo[140494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:11 compute-1 python3.9[140496]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 08:32:11 compute-1 sudo[140494]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:11 compute-1 sshd-session[140500]: Unable to negotiate with 91.231.89.104 port 58915: no matching host key type found. Their offer: ssh-rsa,ssh-dss [preauth]
Jan 26 08:32:11 compute-1 sudo[140653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlmdlkaruyoclsdchaqbmfyxwgypdkov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416331.5725038-794-199827388487154/AnsiballZ_systemd.py'
Jan 26 08:32:11 compute-1 sudo[140653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:11 compute-1 sshd-session[140502]: Unable to negotiate with 91.196.152.111 port 45151: no matching host key type found. Their offer: ssh-rsa,ssh-dss [preauth]
Jan 26 08:32:12 compute-1 python3.9[140655]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 08:32:12 compute-1 sudo[140653]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:12 compute-1 sudo[140808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xthvqvkwkytdkdhgrdrqzetfpipzucof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416332.5732157-794-158207631027916/AnsiballZ_systemd.py'
Jan 26 08:32:12 compute-1 sudo[140808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:13 compute-1 python3.9[140810]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 08:32:13 compute-1 sudo[140808]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:14 compute-1 sudo[140963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwrkqfnzqkwlxrtvcqefbuopeiyttruv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416333.7100146-794-3269894781454/AnsiballZ_systemd.py'
Jan 26 08:32:14 compute-1 sudo[140963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:14 compute-1 python3.9[140965]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 08:32:14 compute-1 sudo[140963]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:15 compute-1 sudo[141118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewshafmmzfhiookgbngrccpuebfovwfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416334.7014701-794-39816420288123/AnsiballZ_systemd.py'
Jan 26 08:32:15 compute-1 sudo[141118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:15 compute-1 python3.9[141120]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 08:32:15 compute-1 sudo[141118]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:16 compute-1 sudo[141273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtllusgijcefeonqqbknbwamjiccqyek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416335.71016-794-72290510161209/AnsiballZ_systemd.py'
Jan 26 08:32:16 compute-1 sudo[141273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:16 compute-1 python3.9[141275]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 08:32:17 compute-1 sudo[141273]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:18 compute-1 sudo[141428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzdgwcuzbprcpfaflchgimidueykukwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416337.7344594-794-194150961146493/AnsiballZ_systemd.py'
Jan 26 08:32:18 compute-1 sudo[141428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:18 compute-1 python3.9[141430]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 08:32:18 compute-1 sudo[141428]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:19 compute-1 sudo[141583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvfdwxwrxykipozdnzrscayioijfpzpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416338.7065837-794-173212174644036/AnsiballZ_systemd.py'
Jan 26 08:32:19 compute-1 sudo[141583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:19 compute-1 python3.9[141585]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 08:32:20 compute-1 sudo[141583]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:21 compute-1 sudo[141738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cayhftqvacwmycevdjdseihlhpbeifeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416340.7192862-794-134858803320297/AnsiballZ_systemd.py'
Jan 26 08:32:21 compute-1 sudo[141738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:21 compute-1 python3.9[141740]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 08:32:21 compute-1 sudo[141738]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:22 compute-1 sudo[141893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbbcladwzytfbtuozgahpehzqyzdtppr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416341.7243967-794-210700889870854/AnsiballZ_systemd.py'
Jan 26 08:32:22 compute-1 sudo[141893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:22 compute-1 python3.9[141895]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 08:32:22 compute-1 sudo[141893]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:23 compute-1 sudo[142048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phauiwipvvmjxlxujogaonlhwoxtrfif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416342.6423488-794-155938963050736/AnsiballZ_systemd.py'
Jan 26 08:32:23 compute-1 sudo[142048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:23 compute-1 python3.9[142050]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 08:32:23 compute-1 sudo[142048]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:24 compute-1 sudo[142203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-noqmyjmdafibccmamiudifxmbokllvfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416343.6959777-794-491345040759/AnsiballZ_systemd.py'
Jan 26 08:32:24 compute-1 sudo[142203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:24 compute-1 python3.9[142205]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 08:32:24 compute-1 sudo[142203]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:25 compute-1 sudo[142358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bygyopbnfwceudkcchuukrgdslwjzrnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416344.9379-998-104186468923805/AnsiballZ_file.py'
Jan 26 08:32:25 compute-1 sudo[142358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:25 compute-1 python3.9[142360]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:32:25 compute-1 sudo[142358]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:26 compute-1 sudo[142510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-telixyikhgmjtwgyutnyptqcbgosvnsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416345.7345753-998-68552933516090/AnsiballZ_file.py'
Jan 26 08:32:26 compute-1 sudo[142510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:26 compute-1 python3.9[142512]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:32:26 compute-1 sudo[142510]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:26 compute-1 sudo[142662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsiwwdwaiupjcnrbjcuwiezqepiokkmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416346.524627-998-240850825562455/AnsiballZ_file.py'
Jan 26 08:32:26 compute-1 sudo[142662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:27 compute-1 python3.9[142664]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:32:27 compute-1 sudo[142662]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:27 compute-1 sudo[142814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpnaessnxwccajqsubudfksijzqabdrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416347.272869-998-142062637715683/AnsiballZ_file.py'
Jan 26 08:32:27 compute-1 sudo[142814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:27 compute-1 python3.9[142816]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:32:27 compute-1 sudo[142814]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:28 compute-1 sudo[142966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrvorrlbxzsexkbwjfoyzdnyggjyyduq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416348.0479453-998-269066086108665/AnsiballZ_file.py'
Jan 26 08:32:28 compute-1 sudo[142966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:28 compute-1 python3.9[142968]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:32:28 compute-1 sudo[142966]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:29 compute-1 sudo[143118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zybhfojgsjnuxavpqnyznurebthzvcmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416348.7767308-998-276710451880099/AnsiballZ_file.py'
Jan 26 08:32:29 compute-1 sudo[143118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:29 compute-1 python3.9[143120]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:32:29 compute-1 sudo[143118]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:30 compute-1 python3.9[143270]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 08:32:30 compute-1 sudo[143420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwxypzfmwwdfuesqtpowwrlwzxgsfehz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416350.3601456-1100-22350597166946/AnsiballZ_stat.py'
Jan 26 08:32:30 compute-1 sudo[143420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:31 compute-1 python3.9[143422]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:32:31 compute-1 sudo[143420]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:31 compute-1 sudo[143545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-noulkfglxlmublndpnmnsvoygrscnoea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416350.3601456-1100-22350597166946/AnsiballZ_copy.py'
Jan 26 08:32:31 compute-1 sudo[143545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:31 compute-1 python3.9[143547]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769416350.3601456-1100-22350597166946/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:32:31 compute-1 sudo[143545]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:32 compute-1 sudo[143697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dchikrjcvfxzvqbyheqfdfnodjwolrbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416352.0442832-1100-22587647134723/AnsiballZ_stat.py'
Jan 26 08:32:32 compute-1 sudo[143697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:32 compute-1 python3.9[143699]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:32:32 compute-1 sudo[143697]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:33 compute-1 sudo[143822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twoymarnzjavlrsbtdznvoezthxnleeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416352.0442832-1100-22587647134723/AnsiballZ_copy.py'
Jan 26 08:32:33 compute-1 sudo[143822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:33 compute-1 python3.9[143824]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769416352.0442832-1100-22587647134723/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:32:33 compute-1 sudo[143822]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:33 compute-1 podman[143825]: 2026-01-26 08:32:33.535519565 +0000 UTC m=+0.166403332 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 08:32:33 compute-1 sudo[144000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpbctkeljzbogwwentriwdhzrituwtqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416353.6443033-1100-257973196413415/AnsiballZ_stat.py'
Jan 26 08:32:33 compute-1 sudo[144000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:34 compute-1 python3.9[144002]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:32:34 compute-1 sudo[144000]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:34 compute-1 sudo[144125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdsrhkbpfvaiazgcqabgjsxzcieldcvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416353.6443033-1100-257973196413415/AnsiballZ_copy.py'
Jan 26 08:32:34 compute-1 sudo[144125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:34 compute-1 python3.9[144127]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769416353.6443033-1100-257973196413415/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:32:34 compute-1 sudo[144125]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:35 compute-1 sudo[144277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gawdcnzlpsbjhfdodiceppkykjvmxxks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416355.110944-1100-27598807045782/AnsiballZ_stat.py'
Jan 26 08:32:35 compute-1 sudo[144277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:35 compute-1 python3.9[144279]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:32:35 compute-1 sudo[144277]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:36 compute-1 sudo[144402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgmstpisdirgpjzqxuqbpfgpeyezeeac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416355.110944-1100-27598807045782/AnsiballZ_copy.py'
Jan 26 08:32:36 compute-1 sudo[144402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:36 compute-1 python3.9[144404]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769416355.110944-1100-27598807045782/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:32:36 compute-1 sudo[144402]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:36 compute-1 sudo[144554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhogjeoptafxdyeplakanhpzjhkdwttz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416356.5861297-1100-256633872468132/AnsiballZ_stat.py'
Jan 26 08:32:36 compute-1 sudo[144554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:37 compute-1 python3.9[144556]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:32:37 compute-1 sudo[144554]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:37 compute-1 sudo[144679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbxoeibxpxhcusrreagrewvwuicrobwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416356.5861297-1100-256633872468132/AnsiballZ_copy.py'
Jan 26 08:32:37 compute-1 sudo[144679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:37 compute-1 python3.9[144681]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769416356.5861297-1100-256633872468132/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:32:37 compute-1 sudo[144679]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:38 compute-1 sudo[144844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfitgfpnsrconmpkwurwzdkssyjstpjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416358.3189213-1100-275528745257756/AnsiballZ_stat.py'
Jan 26 08:32:38 compute-1 sudo[144844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:38 compute-1 podman[144805]: 2026-01-26 08:32:38.725257269 +0000 UTC m=+0.084732817 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 26 08:32:38 compute-1 python3.9[144852]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:32:38 compute-1 sudo[144844]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:39 compute-1 sudo[144975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mekimjcbkumyhbahuyvpbeskcfsomhbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416358.3189213-1100-275528745257756/AnsiballZ_copy.py'
Jan 26 08:32:39 compute-1 sudo[144975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:39 compute-1 python3.9[144977]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769416358.3189213-1100-275528745257756/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:32:39 compute-1 sudo[144975]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:40 compute-1 sudo[145127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekmziksmefaykjyzeowuzdefpeptfuqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416359.8246827-1100-228815415962209/AnsiballZ_stat.py'
Jan 26 08:32:40 compute-1 sudo[145127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:40 compute-1 python3.9[145129]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:32:40 compute-1 sudo[145127]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:40 compute-1 sudo[145250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oabopiqyhwbntkqttkaihwysweuvuecu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416359.8246827-1100-228815415962209/AnsiballZ_copy.py'
Jan 26 08:32:40 compute-1 sudo[145250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:41 compute-1 python3.9[145252]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769416359.8246827-1100-228815415962209/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:32:41 compute-1 sudo[145250]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:41 compute-1 sudo[145402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uifwvywahiaiysfmxltmaigwxzrxiynk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416361.257739-1100-263640131986461/AnsiballZ_stat.py'
Jan 26 08:32:41 compute-1 sudo[145402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:41 compute-1 python3.9[145404]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:32:41 compute-1 sudo[145402]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:42 compute-1 sudo[145527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwtqiakqhzkmbelqcyirgvhynxheoniu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416361.257739-1100-263640131986461/AnsiballZ_copy.py'
Jan 26 08:32:42 compute-1 sudo[145527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:42 compute-1 python3.9[145529]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769416361.257739-1100-263640131986461/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:32:42 compute-1 sudo[145527]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:43 compute-1 sudo[145679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ganbqubzlsqnpjgyqnsgmqccchkphzzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416362.91733-1326-78250854233755/AnsiballZ_command.py'
Jan 26 08:32:43 compute-1 sudo[145679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:43 compute-1 python3.9[145681]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Jan 26 08:32:43 compute-1 sudo[145679]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:44 compute-1 sudo[145832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-koyxnurdrpbfosdpkcyksuxvnenrrfrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416364.0578043-1344-64327046862099/AnsiballZ_file.py'
Jan 26 08:32:44 compute-1 sudo[145832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:44 compute-1 python3.9[145834]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:32:44 compute-1 sudo[145832]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:45 compute-1 sudo[145984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zulracthsnyoqozaishaccllmfnuzjtn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416364.8610752-1344-171967001975285/AnsiballZ_file.py'
Jan 26 08:32:45 compute-1 sudo[145984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:45 compute-1 python3.9[145986]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:32:45 compute-1 sudo[145984]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:45 compute-1 sudo[146136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzgjbgpqrpddbixzdkfocrjtskcdyqcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416365.5834699-1344-69643035307468/AnsiballZ_file.py'
Jan 26 08:32:45 compute-1 sudo[146136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:46 compute-1 python3.9[146138]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:32:46 compute-1 sudo[146136]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:46 compute-1 sudo[146288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yftbjyexxztpmyqosxwwfpdimdddtmbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416366.5295022-1344-90678474075357/AnsiballZ_file.py'
Jan 26 08:32:46 compute-1 sudo[146288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:47 compute-1 python3.9[146290]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:32:47 compute-1 sudo[146288]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:47 compute-1 sudo[146440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfnsdchhlteitbcezxfymtiolecfigcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416367.2632487-1344-50690945064249/AnsiballZ_file.py'
Jan 26 08:32:47 compute-1 sudo[146440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:47 compute-1 python3.9[146442]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:32:47 compute-1 sudo[146440]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:48 compute-1 sudo[146592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwkwwimikywuanctmtgzpjzdbigvbryp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416368.0741155-1344-251639428497324/AnsiballZ_file.py'
Jan 26 08:32:48 compute-1 sudo[146592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:48 compute-1 python3.9[146594]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:32:48 compute-1 sudo[146592]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:49 compute-1 sudo[146744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obkkbccgiowhukjoiugmarwdazamxdvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416368.781753-1344-266485621140067/AnsiballZ_file.py'
Jan 26 08:32:49 compute-1 sudo[146744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:49 compute-1 python3.9[146746]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:32:49 compute-1 sudo[146744]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:50 compute-1 sudo[146896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcpzqhsvssuabxrswpfsyueigjgycjmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416369.7607517-1344-22516596910728/AnsiballZ_file.py'
Jan 26 08:32:50 compute-1 sudo[146896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:50 compute-1 python3.9[146898]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:32:50 compute-1 sudo[146896]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:50 compute-1 sudo[147048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmonlqkuvojnhbdtwyiavpjculbonfye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416370.5325503-1344-66709596985185/AnsiballZ_file.py'
Jan 26 08:32:50 compute-1 sudo[147048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:51 compute-1 python3.9[147050]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:32:51 compute-1 sudo[147048]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:51 compute-1 sudo[147200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryppbnficyfvyerappbbiceytfumbsel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416371.2996788-1344-278970642917419/AnsiballZ_file.py'
Jan 26 08:32:51 compute-1 sudo[147200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:51 compute-1 python3.9[147202]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:32:51 compute-1 sudo[147200]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:52 compute-1 sudo[147352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghydkvcijwqgwxbwtssyhvikyhnmcgki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416372.0607293-1344-52001443500947/AnsiballZ_file.py'
Jan 26 08:32:52 compute-1 sudo[147352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:52 compute-1 python3.9[147354]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:32:52 compute-1 sudo[147352]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:53 compute-1 sudo[147504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wiatxjatdxdqcbfkbuihewmhoagpcqxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416372.7823658-1344-242400848392830/AnsiballZ_file.py'
Jan 26 08:32:53 compute-1 sudo[147504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:53 compute-1 python3.9[147506]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:32:53 compute-1 sudo[147504]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:53 compute-1 sudo[147656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixnzwyeadcboqpecjpghqqlygohmoilr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416373.4101949-1344-78506175995360/AnsiballZ_file.py'
Jan 26 08:32:53 compute-1 sudo[147656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:53 compute-1 python3.9[147658]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:32:53 compute-1 sudo[147656]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:54 compute-1 sudo[147808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-febwhtmxgdvayvafzmhycqpcnbuulvvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416374.1119926-1344-160757240332327/AnsiballZ_file.py'
Jan 26 08:32:54 compute-1 sudo[147808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:54 compute-1 python3.9[147810]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:32:54 compute-1 sudo[147808]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:55 compute-1 sudo[147960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgzjadyxhqikovrpyjfgdvrzdixurjre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416374.936487-1542-78445193765583/AnsiballZ_stat.py'
Jan 26 08:32:55 compute-1 sudo[147960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:55 compute-1 python3.9[147962]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:32:55 compute-1 sudo[147960]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:56 compute-1 sudo[148085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggyxawxwwhazmlsxvhaikflxrvihgrgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416374.936487-1542-78445193765583/AnsiballZ_copy.py'
Jan 26 08:32:56 compute-1 sudo[148085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:56 compute-1 python3.9[148087]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769416374.936487-1542-78445193765583/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:32:56 compute-1 sudo[148085]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:56 compute-1 sshd-session[147963]: Connection closed by authenticating user root 159.223.236.81 port 41448 [preauth]
Jan 26 08:32:56 compute-1 sudo[148237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwzzebcfjpeocelsciwwbxobedesbbhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416376.4818554-1542-236192135785551/AnsiballZ_stat.py'
Jan 26 08:32:56 compute-1 sudo[148237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:57 compute-1 python3.9[148239]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:32:57 compute-1 sudo[148237]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:57 compute-1 sudo[148360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-soynedwmaulrxqknowxwxtgpvbwtocxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416376.4818554-1542-236192135785551/AnsiballZ_copy.py'
Jan 26 08:32:57 compute-1 sudo[148360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:57 compute-1 python3.9[148362]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769416376.4818554-1542-236192135785551/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:32:57 compute-1 sudo[148360]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:58 compute-1 sudo[148512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bobzewmnrpxbsdwpbbyxfnnfhtmiytyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416377.9113233-1542-212059118925822/AnsiballZ_stat.py'
Jan 26 08:32:58 compute-1 sudo[148512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:58 compute-1 python3.9[148514]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:32:58 compute-1 sudo[148512]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:58 compute-1 sudo[148635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bssegumkrzpobeyqkzrcbwoljqjqmmho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416377.9113233-1542-212059118925822/AnsiballZ_copy.py'
Jan 26 08:32:58 compute-1 sudo[148635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:59 compute-1 python3.9[148637]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769416377.9113233-1542-212059118925822/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:32:59 compute-1 sudo[148635]: pam_unix(sudo:session): session closed for user root
Jan 26 08:32:59 compute-1 sudo[148787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vaslwerpflxyrnyazmyqdsnwiwvgqjyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416379.3613636-1542-237443210396440/AnsiballZ_stat.py'
Jan 26 08:32:59 compute-1 sudo[148787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:32:59 compute-1 python3.9[148789]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:32:59 compute-1 sudo[148787]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:00 compute-1 sudo[148910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjxzrtjicnxbcjesztkkmdswvzpiyfdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416379.3613636-1542-237443210396440/AnsiballZ_copy.py'
Jan 26 08:33:00 compute-1 sudo[148910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:00 compute-1 python3.9[148912]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769416379.3613636-1542-237443210396440/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:00 compute-1 sudo[148910]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:01 compute-1 sudo[149062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypqwzrhuwnkmrlqxlwfeoepqcsnnxqez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416380.7859747-1542-277048575156101/AnsiballZ_stat.py'
Jan 26 08:33:01 compute-1 sudo[149062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:01 compute-1 python3.9[149064]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:33:01 compute-1 sudo[149062]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:01 compute-1 sudo[149185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kneasvuclupctizscvwsckauhknyrlmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416380.7859747-1542-277048575156101/AnsiballZ_copy.py'
Jan 26 08:33:01 compute-1 sudo[149185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:01 compute-1 python3.9[149187]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769416380.7859747-1542-277048575156101/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:02 compute-1 sudo[149185]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:02 compute-1 sudo[149337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgzykdhdmxccnjhtldwkmnbxhvesrfqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416382.1637728-1542-126098525534915/AnsiballZ_stat.py'
Jan 26 08:33:02 compute-1 sudo[149337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:02 compute-1 python3.9[149339]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:33:02 compute-1 sudo[149337]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:03 compute-1 sudo[149460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlhhletgirwxwsxdcrdjanhrwfmuqrdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416382.1637728-1542-126098525534915/AnsiballZ_copy.py'
Jan 26 08:33:03 compute-1 sudo[149460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:03 compute-1 python3.9[149462]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769416382.1637728-1542-126098525534915/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:03 compute-1 sudo[149460]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:03 compute-1 sudo[149635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzwnnlvewcsunfizemwzgisleruqzlhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416383.5012789-1542-73871337754763/AnsiballZ_stat.py'
Jan 26 08:33:03 compute-1 sudo[149635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:03 compute-1 podman[149562]: 2026-01-26 08:33:03.893756518 +0000 UTC m=+0.147137571 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_controller)
Jan 26 08:33:04 compute-1 python3.9[149640]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:33:04 compute-1 sudo[149635]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:04 compute-1 sudo[149761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ommmdktbrohwfxiixckwjanyjkrovdab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416383.5012789-1542-73871337754763/AnsiballZ_copy.py'
Jan 26 08:33:04 compute-1 sudo[149761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:04 compute-1 python3.9[149763]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769416383.5012789-1542-73871337754763/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:04 compute-1 sudo[149761]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:05 compute-1 sudo[149913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehpzwoisdpefhcrlusaldnnfqqwqasfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416384.8506927-1542-126779430061885/AnsiballZ_stat.py'
Jan 26 08:33:05 compute-1 sudo[149913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:33:05.282 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:33:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:33:05.282 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:33:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:33:05.282 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:33:05 compute-1 python3.9[149915]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:33:05 compute-1 sudo[149913]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:05 compute-1 sudo[150036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-feopeljoeerfkyrkdipmizdtttpffxur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416384.8506927-1542-126779430061885/AnsiballZ_copy.py'
Jan 26 08:33:05 compute-1 sudo[150036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:06 compute-1 python3.9[150038]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769416384.8506927-1542-126779430061885/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:06 compute-1 sudo[150036]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:06 compute-1 sudo[150188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzakfnqbxisdaxvivkggobtunvwykvyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416386.2088876-1542-209000994406315/AnsiballZ_stat.py'
Jan 26 08:33:06 compute-1 sudo[150188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:06 compute-1 python3.9[150190]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:33:06 compute-1 sudo[150188]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:07 compute-1 sudo[150311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahejwwdpqidwppviwwnjutsshhdvosnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416386.2088876-1542-209000994406315/AnsiballZ_copy.py'
Jan 26 08:33:07 compute-1 sudo[150311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:07 compute-1 python3.9[150313]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769416386.2088876-1542-209000994406315/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:07 compute-1 sudo[150311]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:08 compute-1 sudo[150463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxfumqftiohdtpfjibnaubcdaqhhlvkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416387.6328702-1542-206250444792026/AnsiballZ_stat.py'
Jan 26 08:33:08 compute-1 sudo[150463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:08 compute-1 python3.9[150465]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:33:08 compute-1 sudo[150463]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:08 compute-1 sudo[150586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxjtglgmhoexlqtcaetjiazfrljwldje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416387.6328702-1542-206250444792026/AnsiballZ_copy.py'
Jan 26 08:33:08 compute-1 sudo[150586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:08 compute-1 python3.9[150588]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769416387.6328702-1542-206250444792026/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:08 compute-1 sudo[150586]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:09 compute-1 sudo[150753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpimfqhtryrnqupjpxuxwjrqsfiynnlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416389.0714211-1542-269638973556682/AnsiballZ_stat.py'
Jan 26 08:33:09 compute-1 sudo[150753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:09 compute-1 podman[150712]: 2026-01-26 08:33:09.434775317 +0000 UTC m=+0.074853287 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 26 08:33:09 compute-1 python3.9[150759]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:33:09 compute-1 sudo[150753]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:10 compute-1 sudo[150880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufqzamshrihjnmrliokeaasfpqyketyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416389.0714211-1542-269638973556682/AnsiballZ_copy.py'
Jan 26 08:33:10 compute-1 sudo[150880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:10 compute-1 python3.9[150882]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769416389.0714211-1542-269638973556682/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:10 compute-1 sudo[150880]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:10 compute-1 sudo[151032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfhmeoevjywpdmbzwfrpncfvvaudfzae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416390.4244924-1542-24494439540287/AnsiballZ_stat.py'
Jan 26 08:33:10 compute-1 sudo[151032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:10 compute-1 python3.9[151034]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:33:11 compute-1 sudo[151032]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:11 compute-1 sudo[151155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wisxpkoilwhuhpabzremoyhqeguzrlwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416390.4244924-1542-24494439540287/AnsiballZ_copy.py'
Jan 26 08:33:11 compute-1 sudo[151155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:11 compute-1 python3.9[151157]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769416390.4244924-1542-24494439540287/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:11 compute-1 sudo[151155]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:12 compute-1 sudo[151307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqxfwxfifydrfbhtsrkinyhwyrvdackm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416391.8993332-1542-23863418847154/AnsiballZ_stat.py'
Jan 26 08:33:12 compute-1 sudo[151307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:12 compute-1 python3.9[151309]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:33:12 compute-1 sudo[151307]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:12 compute-1 sudo[151430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cblesotronnzhyccnzgdlegpgmlaphvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416391.8993332-1542-23863418847154/AnsiballZ_copy.py'
Jan 26 08:33:12 compute-1 sudo[151430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:13 compute-1 python3.9[151432]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769416391.8993332-1542-23863418847154/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:13 compute-1 sudo[151430]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:13 compute-1 sudo[151582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uiwyalcddjapebqufpojotaomdqbmxbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416393.3429847-1542-213743459846408/AnsiballZ_stat.py'
Jan 26 08:33:13 compute-1 sudo[151582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:13 compute-1 python3.9[151584]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:33:13 compute-1 sudo[151582]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:14 compute-1 sudo[151705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvdgogadsikmuudimhjxpafsfpyypkcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416393.3429847-1542-213743459846408/AnsiballZ_copy.py'
Jan 26 08:33:14 compute-1 sudo[151705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:14 compute-1 python3.9[151707]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769416393.3429847-1542-213743459846408/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:14 compute-1 sudo[151705]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:15 compute-1 python3.9[151857]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:33:16 compute-1 sudo[152010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onivvgjryvafxwprmnbnshbnsrxiagtv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416395.74169-1954-206006971769030/AnsiballZ_seboolean.py'
Jan 26 08:33:16 compute-1 sudo[152010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:16 compute-1 python3.9[152012]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Jan 26 08:33:17 compute-1 sudo[152010]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:18 compute-1 sudo[152166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tiyyiwiynupjmewxeyqgqxqrxhpxjpxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416397.9593236-1970-80284458997727/AnsiballZ_copy.py'
Jan 26 08:33:18 compute-1 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Jan 26 08:33:18 compute-1 sudo[152166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:18 compute-1 python3.9[152168]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:18 compute-1 sudo[152166]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:19 compute-1 sudo[152318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzxkjlucpaokdaschmivbyokfvcihrrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416398.8255057-1970-101673602382074/AnsiballZ_copy.py'
Jan 26 08:33:19 compute-1 sudo[152318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:19 compute-1 python3.9[152320]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:19 compute-1 sudo[152318]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:20 compute-1 sudo[152470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cglampysksrrrxwsjswsnlgkhmnwygci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416399.628897-1970-201756254061505/AnsiballZ_copy.py'
Jan 26 08:33:20 compute-1 sudo[152470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:20 compute-1 python3.9[152472]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:20 compute-1 sudo[152470]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:20 compute-1 sudo[152622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfrsgkitqyogdszkityktaynnquzjyio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416400.426966-1970-60419997144497/AnsiballZ_copy.py'
Jan 26 08:33:20 compute-1 sudo[152622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:20 compute-1 python3.9[152624]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:20 compute-1 sudo[152622]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:21 compute-1 sudo[152774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twywkzgoyxjzehmroeffixdqciybsudv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416401.1653683-1970-35182175953184/AnsiballZ_copy.py'
Jan 26 08:33:21 compute-1 sudo[152774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:21 compute-1 python3.9[152776]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:21 compute-1 sudo[152774]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:22 compute-1 sudo[152926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrgcrpfepujpimevbyenahjmosoyjbsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416401.9771569-2042-236962069051665/AnsiballZ_copy.py'
Jan 26 08:33:22 compute-1 sudo[152926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:22 compute-1 python3.9[152928]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:22 compute-1 sudo[152926]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:23 compute-1 sudo[153078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzykrkdrgkaeddkeeuithxyyyzmrqyyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416402.7267935-2042-239989982391411/AnsiballZ_copy.py'
Jan 26 08:33:23 compute-1 sudo[153078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:23 compute-1 python3.9[153080]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:23 compute-1 sudo[153078]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:23 compute-1 sudo[153230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfdqxbgybttauoxmemigtblwgtqbfkjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416403.4326017-2042-272895591748224/AnsiballZ_copy.py'
Jan 26 08:33:23 compute-1 sudo[153230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:24 compute-1 python3.9[153232]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:24 compute-1 sudo[153230]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:24 compute-1 sudo[153382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjqgxfcnrrmwyfabsqrcsvirwazozfvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416404.2050097-2042-111821119498085/AnsiballZ_copy.py'
Jan 26 08:33:24 compute-1 sudo[153382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:24 compute-1 python3.9[153384]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:24 compute-1 sudo[153382]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:25 compute-1 sudo[153534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnqpgdbhfgggivchcrscocxswrxgddme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416404.9257958-2042-162669391769814/AnsiballZ_copy.py'
Jan 26 08:33:25 compute-1 sudo[153534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:25 compute-1 python3.9[153536]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:25 compute-1 sudo[153534]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:26 compute-1 sudo[153686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryhrnvwvdgzfmgebjunvrbflgigmyjyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416405.8708572-2114-133658422954025/AnsiballZ_systemd.py'
Jan 26 08:33:26 compute-1 sudo[153686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:26 compute-1 python3.9[153688]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 08:33:26 compute-1 systemd[1]: Reloading.
Jan 26 08:33:26 compute-1 systemd-rc-local-generator[153714]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:33:26 compute-1 systemd-sysv-generator[153717]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:33:26 compute-1 systemd[1]: Starting libvirt logging daemon socket...
Jan 26 08:33:26 compute-1 systemd[1]: Listening on libvirt logging daemon socket.
Jan 26 08:33:26 compute-1 systemd[1]: Starting libvirt logging daemon admin socket...
Jan 26 08:33:26 compute-1 systemd[1]: Listening on libvirt logging daemon admin socket.
Jan 26 08:33:26 compute-1 systemd[1]: Starting libvirt logging daemon...
Jan 26 08:33:26 compute-1 systemd[1]: Started libvirt logging daemon.
Jan 26 08:33:27 compute-1 sudo[153686]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:27 compute-1 sudo[153879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmcsncfyyvpspxvndcwcijrrvxcizqnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416407.2168362-2114-91711068329218/AnsiballZ_systemd.py'
Jan 26 08:33:27 compute-1 sudo[153879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:27 compute-1 python3.9[153881]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 08:33:27 compute-1 systemd[1]: Reloading.
Jan 26 08:33:28 compute-1 systemd-sysv-generator[153913]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:33:28 compute-1 systemd-rc-local-generator[153909]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:33:28 compute-1 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Jan 26 08:33:28 compute-1 systemd[1]: Starting libvirt nodedev daemon socket...
Jan 26 08:33:28 compute-1 systemd[1]: Listening on libvirt nodedev daemon socket.
Jan 26 08:33:28 compute-1 systemd[1]: Starting libvirt nodedev daemon admin socket...
Jan 26 08:33:28 compute-1 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Jan 26 08:33:28 compute-1 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Jan 26 08:33:28 compute-1 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Jan 26 08:33:28 compute-1 systemd[1]: Starting libvirt nodedev daemon...
Jan 26 08:33:28 compute-1 systemd[1]: Started libvirt nodedev daemon.
Jan 26 08:33:28 compute-1 sudo[153879]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:28 compute-1 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Jan 26 08:33:28 compute-1 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Jan 26 08:33:28 compute-1 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Jan 26 08:33:28 compute-1 sudo[154103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emgozyojfqjghkdskftroyburxnepujf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416408.4323628-2114-260883180039401/AnsiballZ_systemd.py'
Jan 26 08:33:28 compute-1 sudo[154103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:29 compute-1 python3.9[154105]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 08:33:29 compute-1 systemd[1]: Reloading.
Jan 26 08:33:29 compute-1 systemd-sysv-generator[154138]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:33:29 compute-1 systemd-rc-local-generator[154134]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:33:29 compute-1 systemd[1]: Starting libvirt proxy daemon admin socket...
Jan 26 08:33:29 compute-1 systemd[1]: Starting libvirt proxy daemon read-only socket...
Jan 26 08:33:29 compute-1 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Jan 26 08:33:29 compute-1 systemd[1]: Listening on libvirt proxy daemon admin socket.
Jan 26 08:33:29 compute-1 systemd[1]: Starting libvirt proxy daemon...
Jan 26 08:33:29 compute-1 systemd[1]: Started libvirt proxy daemon.
Jan 26 08:33:29 compute-1 sudo[154103]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:29 compute-1 setroubleshoot[153918]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l cd626fff-5157-406a-bd64-2aea1e0f16c5
Jan 26 08:33:29 compute-1 setroubleshoot[153918]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Jan 26 08:33:29 compute-1 setroubleshoot[153918]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l cd626fff-5157-406a-bd64-2aea1e0f16c5
Jan 26 08:33:29 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 08:33:29 compute-1 setroubleshoot[153918]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Jan 26 08:33:30 compute-1 sudo[154318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzozkfxvicufkdhuzygbdlcbcpkfrtut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416409.6494212-2114-206467817595033/AnsiballZ_systemd.py'
Jan 26 08:33:30 compute-1 sudo[154318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:30 compute-1 python3.9[154320]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 08:33:30 compute-1 systemd[1]: Reloading.
Jan 26 08:33:30 compute-1 systemd-rc-local-generator[154347]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:33:30 compute-1 systemd-sysv-generator[154350]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:33:30 compute-1 systemd[1]: Listening on libvirt locking daemon socket.
Jan 26 08:33:30 compute-1 systemd[1]: Starting libvirt QEMU daemon socket...
Jan 26 08:33:30 compute-1 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 26 08:33:30 compute-1 systemd[1]: Starting Virtual Machine and Container Registration Service...
Jan 26 08:33:30 compute-1 systemd[1]: Listening on libvirt QEMU daemon socket.
Jan 26 08:33:30 compute-1 systemd[1]: Starting libvirt QEMU daemon admin socket...
Jan 26 08:33:30 compute-1 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Jan 26 08:33:30 compute-1 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Jan 26 08:33:30 compute-1 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Jan 26 08:33:30 compute-1 systemd[1]: Started Virtual Machine and Container Registration Service.
Jan 26 08:33:30 compute-1 systemd[1]: Starting libvirt QEMU daemon...
Jan 26 08:33:30 compute-1 systemd[1]: Started libvirt QEMU daemon.
Jan 26 08:33:30 compute-1 sudo[154318]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:31 compute-1 sudo[154533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqndlpnisxztfokmqwautxfourjzewvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416410.952064-2114-68374808991554/AnsiballZ_systemd.py'
Jan 26 08:33:31 compute-1 sudo[154533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:31 compute-1 python3.9[154535]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 08:33:31 compute-1 systemd[1]: Reloading.
Jan 26 08:33:31 compute-1 systemd-sysv-generator[154568]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:33:31 compute-1 systemd-rc-local-generator[154565]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:33:31 compute-1 systemd[1]: Starting libvirt secret daemon socket...
Jan 26 08:33:31 compute-1 systemd[1]: Listening on libvirt secret daemon socket.
Jan 26 08:33:31 compute-1 systemd[1]: Starting libvirt secret daemon admin socket...
Jan 26 08:33:31 compute-1 systemd[1]: Starting libvirt secret daemon read-only socket...
Jan 26 08:33:31 compute-1 systemd[1]: Listening on libvirt secret daemon admin socket.
Jan 26 08:33:31 compute-1 systemd[1]: Listening on libvirt secret daemon read-only socket.
Jan 26 08:33:31 compute-1 systemd[1]: Starting libvirt secret daemon...
Jan 26 08:33:31 compute-1 systemd[1]: Started libvirt secret daemon.
Jan 26 08:33:31 compute-1 sudo[154533]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:32 compute-1 sudo[154746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijkozyefkqrdjkzxfuakgygyboklrfpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416412.2653015-2188-120789326647792/AnsiballZ_file.py'
Jan 26 08:33:32 compute-1 sudo[154746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:32 compute-1 python3.9[154748]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:32 compute-1 sudo[154746]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:33 compute-1 sudo[154898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpsckeervmfbybuwnnxrfheaavoewode ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416412.9663591-2204-77649974806443/AnsiballZ_find.py'
Jan 26 08:33:33 compute-1 sudo[154898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:33 compute-1 python3.9[154900]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 08:33:33 compute-1 sudo[154898]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:34 compute-1 sudo[155058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjhyfjfjfzvdvjfgsugczdirbziihkeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416413.9560254-2232-181745017077819/AnsiballZ_stat.py'
Jan 26 08:33:34 compute-1 sudo[155058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:34 compute-1 podman[155024]: 2026-01-26 08:33:34.417510494 +0000 UTC m=+0.116644333 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Jan 26 08:33:34 compute-1 python3.9[155068]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:33:34 compute-1 sudo[155058]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:35 compute-1 sudo[155198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dguyplweytrljigkucfeimcmkojcvzeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416413.9560254-2232-181745017077819/AnsiballZ_copy.py'
Jan 26 08:33:35 compute-1 sudo[155198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:35 compute-1 python3.9[155200]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769416413.9560254-2232-181745017077819/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:35 compute-1 sudo[155198]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:36 compute-1 sudo[155350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oylceytqgrohieplynfnjnejtdgukbns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416415.7652478-2264-78774201292679/AnsiballZ_file.py'
Jan 26 08:33:36 compute-1 sudo[155350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:36 compute-1 python3.9[155352]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:36 compute-1 sudo[155350]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:36 compute-1 sudo[155502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofyjvxxxdhuxqwararhtxxbjzoacuhjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416416.5210562-2280-232901048515174/AnsiballZ_stat.py'
Jan 26 08:33:36 compute-1 sudo[155502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:37 compute-1 python3.9[155504]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:33:37 compute-1 sudo[155502]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:37 compute-1 sudo[155580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okomfepbmqyrssooxzukexomyohmnwkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416416.5210562-2280-232901048515174/AnsiballZ_file.py'
Jan 26 08:33:37 compute-1 sudo[155580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:37 compute-1 python3.9[155582]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:37 compute-1 sudo[155580]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:38 compute-1 sudo[155732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asdwemzjeasdrtzgunlqbmiqqknnzwly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416418.0217118-2304-47174126705588/AnsiballZ_stat.py'
Jan 26 08:33:38 compute-1 sudo[155732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:38 compute-1 python3.9[155734]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:33:38 compute-1 sudo[155732]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:39 compute-1 sudo[155810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urdrvcunolygfwasinzsedzltcvvyqwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416418.0217118-2304-47174126705588/AnsiballZ_file.py'
Jan 26 08:33:39 compute-1 sudo[155810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:39 compute-1 python3.9[155812]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.8gh3bimq recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:39 compute-1 sudo[155810]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:39 compute-1 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Jan 26 08:33:39 compute-1 systemd[1]: setroubleshootd.service: Deactivated successfully.
Jan 26 08:33:39 compute-1 podman[155936]: 2026-01-26 08:33:39.757048525 +0000 UTC m=+0.054652059 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 08:33:39 compute-1 sudo[155977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtligxotisvdqnwqdtqakfpyjrsiafoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416419.4506779-2328-42461707674986/AnsiballZ_stat.py'
Jan 26 08:33:39 compute-1 sudo[155977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:39 compute-1 python3.9[155983]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:33:40 compute-1 sudo[155977]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:40 compute-1 sudo[156059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nazglmhudlikvielkykxexshmnpvyicg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416419.4506779-2328-42461707674986/AnsiballZ_file.py'
Jan 26 08:33:40 compute-1 sudo[156059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:40 compute-1 python3.9[156061]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:40 compute-1 sudo[156059]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:40 compute-1 sudo[156211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azshptnpauanjmdirnttsykbhlahkpzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416420.6833365-2354-147370603390719/AnsiballZ_command.py'
Jan 26 08:33:40 compute-1 sudo[156211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:41 compute-1 python3.9[156213]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:33:41 compute-1 sudo[156211]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:41 compute-1 sudo[156364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmprkkvbqsbsjfwfgotkwafmatykjreg ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769416421.362146-2370-214568536934810/AnsiballZ_edpm_nftables_from_files.py'
Jan 26 08:33:41 compute-1 sudo[156364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:42 compute-1 python3[156366]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 26 08:33:42 compute-1 sudo[156364]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:42 compute-1 sudo[156516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcutqenpzswcwyqaofeqdzbaaqztqhpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416422.324252-2386-41588922883593/AnsiballZ_stat.py'
Jan 26 08:33:42 compute-1 sudo[156516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:42 compute-1 python3.9[156518]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:33:42 compute-1 sudo[156516]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:43 compute-1 sudo[156594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjlfkpnlpeeqiglttnkwksvjvssvtxjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416422.324252-2386-41588922883593/AnsiballZ_file.py'
Jan 26 08:33:43 compute-1 sudo[156594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:43 compute-1 python3.9[156596]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:43 compute-1 sudo[156594]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:44 compute-1 sudo[156746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gooyjxsrenskjnkuxsjcdfxjdxpsxxyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416423.7380373-2410-103808338723054/AnsiballZ_stat.py'
Jan 26 08:33:44 compute-1 sudo[156746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:44 compute-1 python3.9[156748]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:33:44 compute-1 sudo[156746]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:44 compute-1 sudo[156871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kodzoknwwxvdouoxlkbeegweetzleofx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416423.7380373-2410-103808338723054/AnsiballZ_copy.py'
Jan 26 08:33:44 compute-1 sudo[156871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:44 compute-1 python3.9[156873]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769416423.7380373-2410-103808338723054/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:44 compute-1 sudo[156871]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:45 compute-1 sudo[157023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lieewadzndwyiuabubwhkkbgpmfqrxwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416425.1682773-2440-267963662223631/AnsiballZ_stat.py'
Jan 26 08:33:45 compute-1 sudo[157023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:45 compute-1 python3.9[157025]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:33:45 compute-1 sudo[157023]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:45 compute-1 sudo[157101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqvpajgphommyveqaghctvxzbptvmzpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416425.1682773-2440-267963662223631/AnsiballZ_file.py'
Jan 26 08:33:45 compute-1 sudo[157101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:46 compute-1 python3.9[157103]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:46 compute-1 sudo[157101]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:46 compute-1 sudo[157253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cijsziyhekovobdlsnytaztmlsoihpmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416426.4214275-2464-155275985633618/AnsiballZ_stat.py'
Jan 26 08:33:46 compute-1 sudo[157253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:46 compute-1 python3.9[157255]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:33:47 compute-1 sudo[157253]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:47 compute-1 sudo[157331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epojetbkeecrjtdtccjunyeqaikbjhrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416426.4214275-2464-155275985633618/AnsiballZ_file.py'
Jan 26 08:33:47 compute-1 sudo[157331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:47 compute-1 python3.9[157333]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:47 compute-1 sudo[157331]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:48 compute-1 sudo[157483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uodfedfgkwinauqmzmynllhepxlysdlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416427.7286735-2489-182036991100716/AnsiballZ_stat.py'
Jan 26 08:33:48 compute-1 sudo[157483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:48 compute-1 python3.9[157485]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:33:48 compute-1 sudo[157483]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:48 compute-1 sudo[157608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxlyqbfmipukjikmdphsaagyvrlciykg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416427.7286735-2489-182036991100716/AnsiballZ_copy.py'
Jan 26 08:33:48 compute-1 sudo[157608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:48 compute-1 python3.9[157610]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769416427.7286735-2489-182036991100716/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:49 compute-1 sudo[157608]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:49 compute-1 sudo[157761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbbfdqptrcrylqqqykxbxnshusnrxvcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416429.2196727-2518-31666446272317/AnsiballZ_file.py'
Jan 26 08:33:49 compute-1 sudo[157761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:49 compute-1 python3.9[157763]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:49 compute-1 sudo[157761]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:50 compute-1 sudo[157913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgpfrbradszuapidukelrxhowoizfqin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416429.8859215-2534-25529343588799/AnsiballZ_command.py'
Jan 26 08:33:50 compute-1 sudo[157913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:50 compute-1 python3.9[157915]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:33:50 compute-1 sudo[157913]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:51 compute-1 sudo[158068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhtrmjpzovmkwzzuejqdqtmxvhtgpxjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416430.6448421-2550-44735718556049/AnsiballZ_blockinfile.py'
Jan 26 08:33:51 compute-1 sudo[158068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:51 compute-1 python3.9[158070]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:51 compute-1 sudo[158068]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:52 compute-1 sudo[158220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxfkuhmdwknmaezvnuuwzhssmpclcgui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416431.6439047-2568-242328344330612/AnsiballZ_command.py'
Jan 26 08:33:52 compute-1 sudo[158220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:52 compute-1 python3.9[158222]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:33:52 compute-1 sudo[158220]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:52 compute-1 sudo[158373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukaonuvdhnhwetyflbandbyyibnqcjku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416432.4555638-2584-252412937421661/AnsiballZ_stat.py'
Jan 26 08:33:52 compute-1 sudo[158373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:53 compute-1 python3.9[158375]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:33:53 compute-1 sudo[158373]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:53 compute-1 sudo[158527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpspxhleyswxxtlgepejkugajwzlnawt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416433.2786171-2600-139357297462012/AnsiballZ_command.py'
Jan 26 08:33:53 compute-1 sudo[158527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:53 compute-1 python3.9[158529]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:33:53 compute-1 sudo[158527]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:54 compute-1 sudo[158682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjdjiliidxjjnpdvtiebwoufvnuacwpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416434.1380575-2616-263459810400893/AnsiballZ_file.py'
Jan 26 08:33:54 compute-1 sudo[158682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:54 compute-1 python3.9[158684]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:54 compute-1 sudo[158682]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:55 compute-1 sudo[158834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgiybiyiydvmwllztmllpimzegflxbes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416434.9140713-2632-34532145128990/AnsiballZ_stat.py'
Jan 26 08:33:55 compute-1 sudo[158834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:55 compute-1 python3.9[158836]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:33:55 compute-1 sudo[158834]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:55 compute-1 sudo[158957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxbdzxujclepywxrdhsmydrxzvasbvqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416434.9140713-2632-34532145128990/AnsiballZ_copy.py'
Jan 26 08:33:55 compute-1 sudo[158957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:56 compute-1 python3.9[158959]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769416434.9140713-2632-34532145128990/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:56 compute-1 sudo[158957]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:56 compute-1 sudo[159109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rivmujyblirlygwgniuytnkuvjuekkjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416436.2848465-2662-74976287468254/AnsiballZ_stat.py'
Jan 26 08:33:56 compute-1 sudo[159109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:56 compute-1 python3.9[159111]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:33:56 compute-1 sudo[159109]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:57 compute-1 sudo[159232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyqsivmkhaluzkidgreoymqbpjoqmsck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416436.2848465-2662-74976287468254/AnsiballZ_copy.py'
Jan 26 08:33:57 compute-1 sudo[159232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:57 compute-1 python3.9[159234]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769416436.2848465-2662-74976287468254/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:57 compute-1 sudo[159232]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:57 compute-1 sudo[159384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwjzsyaksutrmkokupdjjxxtkjnyyakg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416437.610204-2692-225721408503195/AnsiballZ_stat.py'
Jan 26 08:33:57 compute-1 sudo[159384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:58 compute-1 python3.9[159386]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:33:58 compute-1 sudo[159384]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:58 compute-1 sudo[159507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohlgtfkiczqvaiyzylckzwgkkndtwhrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416437.610204-2692-225721408503195/AnsiballZ_copy.py'
Jan 26 08:33:58 compute-1 sudo[159507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:58 compute-1 python3.9[159509]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769416437.610204-2692-225721408503195/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:33:58 compute-1 sudo[159507]: pam_unix(sudo:session): session closed for user root
Jan 26 08:33:59 compute-1 sudo[159659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqzkntefkqmzgsthtqsdhwahvzmhokwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416439.0650492-2723-89710837208062/AnsiballZ_systemd.py'
Jan 26 08:33:59 compute-1 sudo[159659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:33:59 compute-1 python3.9[159661]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:33:59 compute-1 systemd[1]: Reloading.
Jan 26 08:33:59 compute-1 systemd-rc-local-generator[159685]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:33:59 compute-1 systemd-sysv-generator[159693]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:34:00 compute-1 systemd[1]: Reached target edpm_libvirt.target.
Jan 26 08:34:00 compute-1 sudo[159659]: pam_unix(sudo:session): session closed for user root
Jan 26 08:34:00 compute-1 sudo[159850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvwwohjppeayidszfzkyvkyqpdasspby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416440.389493-2738-46880582026526/AnsiballZ_systemd.py'
Jan 26 08:34:00 compute-1 sudo[159850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:34:01 compute-1 python3.9[159852]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 26 08:34:01 compute-1 systemd[1]: Reloading.
Jan 26 08:34:01 compute-1 systemd-sysv-generator[159885]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:34:01 compute-1 systemd-rc-local-generator[159881]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:34:01 compute-1 systemd[1]: Reloading.
Jan 26 08:34:01 compute-1 systemd-rc-local-generator[159917]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:34:01 compute-1 systemd-sysv-generator[159922]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:34:01 compute-1 sudo[159850]: pam_unix(sudo:session): session closed for user root
Jan 26 08:34:02 compute-1 sshd-session[105181]: Connection closed by 192.168.122.30 port 60880
Jan 26 08:34:02 compute-1 sshd-session[105178]: pam_unix(sshd:session): session closed for user zuul
Jan 26 08:34:02 compute-1 systemd[1]: session-24.scope: Deactivated successfully.
Jan 26 08:34:02 compute-1 systemd[1]: session-24.scope: Consumed 4min 499ms CPU time.
Jan 26 08:34:02 compute-1 systemd-logind[788]: Session 24 logged out. Waiting for processes to exit.
Jan 26 08:34:02 compute-1 systemd-logind[788]: Removed session 24.
Jan 26 08:34:04 compute-1 podman[159950]: 2026-01-26 08:34:04.933277819 +0000 UTC m=+0.173776951 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, 
io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 26 08:34:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:34:05.283 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:34:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:34:05.285 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:34:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:34:05.285 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:34:06 compute-1 sshd-session[159977]: Connection closed by authenticating user root 159.223.236.81 port 51602 [preauth]
Jan 26 08:34:07 compute-1 sshd-session[159979]: Accepted publickey for zuul from 192.168.122.30 port 52888 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 08:34:07 compute-1 systemd-logind[788]: New session 25 of user zuul.
Jan 26 08:34:07 compute-1 systemd[1]: Started Session 25 of User zuul.
Jan 26 08:34:07 compute-1 sshd-session[159979]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 08:34:08 compute-1 python3.9[160132]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 08:34:10 compute-1 podman[160260]: 2026-01-26 08:34:10.131521764 +0000 UTC m=+0.046156448 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 26 08:34:10 compute-1 python3.9[160300]: ansible-ansible.builtin.service_facts Invoked
Jan 26 08:34:10 compute-1 network[160319]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 08:34:10 compute-1 network[160320]: 'network-scripts' will be removed from distribution in near future.
Jan 26 08:34:10 compute-1 network[160321]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 08:34:16 compute-1 sudo[160590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jayvavvzlanpqqewbubzrbrkgifaylrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416455.6030176-70-279225456137242/AnsiballZ_setup.py'
Jan 26 08:34:16 compute-1 sudo[160590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:34:16 compute-1 python3.9[160592]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 08:34:16 compute-1 sudo[160590]: pam_unix(sudo:session): session closed for user root
Jan 26 08:34:17 compute-1 sudo[160674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjqqofeowdlowvsmvmmjscxcwkeqxebo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416455.6030176-70-279225456137242/AnsiballZ_dnf.py'
Jan 26 08:34:17 compute-1 sudo[160674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:34:17 compute-1 python3.9[160676]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 08:34:22 compute-1 sudo[160674]: pam_unix(sudo:session): session closed for user root
Jan 26 08:34:23 compute-1 sudo[160827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysqefyspgjsqlpsvliiohbwbcgfxddzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416462.652861-94-174722985301826/AnsiballZ_stat.py'
Jan 26 08:34:23 compute-1 sudo[160827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:34:23 compute-1 python3.9[160829]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:34:23 compute-1 sudo[160827]: pam_unix(sudo:session): session closed for user root
Jan 26 08:34:24 compute-1 sudo[160979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmdsxeugfurovijrasfxpjwxcyecngzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416463.6242938-114-200871308921814/AnsiballZ_command.py'
Jan 26 08:34:24 compute-1 sudo[160979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:34:24 compute-1 python3.9[160981]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:34:24 compute-1 sudo[160979]: pam_unix(sudo:session): session closed for user root
Jan 26 08:34:25 compute-1 sudo[161132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqvnwdtelwqhvsabqpqfdemujdurqffg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416464.6318839-134-48420399932285/AnsiballZ_stat.py'
Jan 26 08:34:25 compute-1 sudo[161132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:34:25 compute-1 python3.9[161134]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:34:25 compute-1 sudo[161132]: pam_unix(sudo:session): session closed for user root
Jan 26 08:34:25 compute-1 sudo[161284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgwvxheatiprbypebnlmzmgyjeizcrmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416465.4523346-150-13009114042567/AnsiballZ_command.py'
Jan 26 08:34:25 compute-1 sudo[161284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:34:26 compute-1 python3.9[161286]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:34:26 compute-1 sudo[161284]: pam_unix(sudo:session): session closed for user root
Jan 26 08:34:26 compute-1 sudo[161437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtenmuogqojwjngjzpzdefvwvaewpxqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416466.252821-166-220926541606801/AnsiballZ_stat.py'
Jan 26 08:34:26 compute-1 sudo[161437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:34:26 compute-1 python3.9[161439]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:34:26 compute-1 sudo[161437]: pam_unix(sudo:session): session closed for user root
Jan 26 08:34:27 compute-1 sudo[161560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynudpctthhbhblahtjrboqwheqrtaiok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416466.252821-166-220926541606801/AnsiballZ_copy.py'
Jan 26 08:34:27 compute-1 sudo[161560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:34:27 compute-1 python3.9[161562]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769416466.252821-166-220926541606801/.source.iscsi _original_basename=.n7qga7dy follow=False checksum=54915525b5a3801d1d8385a880b1e96e4b5dc93c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:34:27 compute-1 sudo[161560]: pam_unix(sudo:session): session closed for user root
Jan 26 08:34:28 compute-1 sudo[161712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuyfqjeguqvfbqivaxswirivgyezjomj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416467.7628064-196-148581140496146/AnsiballZ_file.py'
Jan 26 08:34:28 compute-1 sudo[161712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:34:28 compute-1 python3.9[161714]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:34:28 compute-1 sudo[161712]: pam_unix(sudo:session): session closed for user root
Jan 26 08:34:29 compute-1 sudo[161864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhdlzwftdfxxnchogvnyxuggkivopwjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416468.6832294-212-132282795864326/AnsiballZ_lineinfile.py'
Jan 26 08:34:29 compute-1 sudo[161864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:34:29 compute-1 python3.9[161866]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:34:29 compute-1 sudo[161864]: pam_unix(sudo:session): session closed for user root
Jan 26 08:34:30 compute-1 sudo[162016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egyzwartmldzmuciottbmsszyfptrrsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416469.679968-230-118433018203822/AnsiballZ_systemd_service.py'
Jan 26 08:34:30 compute-1 sudo[162016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:34:30 compute-1 python3.9[162018]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:34:30 compute-1 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Jan 26 08:34:30 compute-1 sudo[162016]: pam_unix(sudo:session): session closed for user root
Jan 26 08:34:31 compute-1 sudo[162172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcukgwciqajniqstusrzjtdabpkkcdni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416471.0528753-246-60819651189465/AnsiballZ_systemd_service.py'
Jan 26 08:34:31 compute-1 sudo[162172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:34:31 compute-1 python3.9[162174]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:34:31 compute-1 systemd[1]: Reloading.
Jan 26 08:34:31 compute-1 systemd-rc-local-generator[162199]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:34:31 compute-1 systemd-sysv-generator[162205]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:34:32 compute-1 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 26 08:34:32 compute-1 systemd[1]: Starting Open-iSCSI...
Jan 26 08:34:32 compute-1 kernel: Loading iSCSI transport class v2.0-870.
Jan 26 08:34:32 compute-1 systemd[1]: Started Open-iSCSI.
Jan 26 08:34:32 compute-1 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Jan 26 08:34:32 compute-1 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Jan 26 08:34:32 compute-1 sudo[162172]: pam_unix(sudo:session): session closed for user root
Jan 26 08:34:33 compute-1 python3.9[162374]: ansible-ansible.builtin.service_facts Invoked
Jan 26 08:34:33 compute-1 network[162391]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 08:34:33 compute-1 network[162392]: 'network-scripts' will be removed from distribution in near future.
Jan 26 08:34:33 compute-1 network[162393]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 08:34:35 compute-1 podman[162433]: 2026-01-26 08:34:35.15951067 +0000 UTC m=+0.153740331 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, config_id=ovn_controller)
Jan 26 08:34:37 compute-1 sudo[162690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjedebuqrkitzzuldbrwjysbnminfkfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416477.6409597-292-240573177628063/AnsiballZ_dnf.py'
Jan 26 08:34:37 compute-1 sudo[162690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:34:38 compute-1 python3.9[162692]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 08:34:40 compute-1 podman[162700]: 2026-01-26 08:34:40.459701574 +0000 UTC m=+0.079407590 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 26 08:34:40 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 08:34:40 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 26 08:34:40 compute-1 systemd[1]: Reloading.
Jan 26 08:34:40 compute-1 systemd-rc-local-generator[162754]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:34:40 compute-1 systemd-sysv-generator[162760]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:34:41 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 08:34:41 compute-1 sudo[162690]: pam_unix(sudo:session): session closed for user root
Jan 26 08:34:41 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 08:34:41 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 26 08:34:41 compute-1 systemd[1]: run-r98c3697537a3473eacf0ba1414536c16.service: Deactivated successfully.
Jan 26 08:34:42 compute-1 sudo[163024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjfoeafntqejkmaddkwfddlqlynfztzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416481.982838-310-71267804612767/AnsiballZ_file.py'
Jan 26 08:34:42 compute-1 sudo[163024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:34:42 compute-1 python3.9[163026]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 26 08:34:42 compute-1 sudo[163024]: pam_unix(sudo:session): session closed for user root
Jan 26 08:34:43 compute-1 sudo[163176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfyfudvlsqzgsnlhefdirybnimaaitzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416482.7168424-326-254144943425392/AnsiballZ_modprobe.py'
Jan 26 08:34:43 compute-1 sudo[163176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:34:43 compute-1 python3.9[163178]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Jan 26 08:34:43 compute-1 sudo[163176]: pam_unix(sudo:session): session closed for user root
Jan 26 08:34:43 compute-1 sudo[163332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpozujrbaeitfaskxqifgphvwtypmnwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416483.5953152-342-85649276018326/AnsiballZ_stat.py'
Jan 26 08:34:43 compute-1 sudo[163332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:34:44 compute-1 python3.9[163334]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:34:44 compute-1 sudo[163332]: pam_unix(sudo:session): session closed for user root
Jan 26 08:34:44 compute-1 sudo[163455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckzdnulxmkiafdeyhxaqzuaxpqdwghui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416483.5953152-342-85649276018326/AnsiballZ_copy.py'
Jan 26 08:34:44 compute-1 sudo[163455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:34:44 compute-1 python3.9[163457]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769416483.5953152-342-85649276018326/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:34:44 compute-1 sudo[163455]: pam_unix(sudo:session): session closed for user root
Jan 26 08:34:45 compute-1 sudo[163607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfvaufuoxrpzennwedqwjjdigtnhyoxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416485.4049463-374-259065463980705/AnsiballZ_lineinfile.py'
Jan 26 08:34:45 compute-1 sudo[163607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:34:46 compute-1 python3.9[163609]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:34:46 compute-1 sudo[163607]: pam_unix(sudo:session): session closed for user root
Jan 26 08:34:47 compute-1 sudo[163760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfxmqhvyavjinlnhzacvssxqatxalipn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416486.262476-390-31872400405071/AnsiballZ_systemd.py'
Jan 26 08:34:47 compute-1 sudo[163760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:34:47 compute-1 python3.9[163762]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 08:34:47 compute-1 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 26 08:34:47 compute-1 systemd[1]: Stopped Load Kernel Modules.
Jan 26 08:34:47 compute-1 systemd[1]: Stopping Load Kernel Modules...
Jan 26 08:34:47 compute-1 systemd[1]: Starting Load Kernel Modules...
Jan 26 08:34:47 compute-1 systemd[1]: Finished Load Kernel Modules.
Jan 26 08:34:47 compute-1 sudo[163760]: pam_unix(sudo:session): session closed for user root
Jan 26 08:34:48 compute-1 sudo[163916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijusiosclveumptdabjgkniuiaeswtgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416488.2012067-406-203568202951021/AnsiballZ_command.py'
Jan 26 08:34:48 compute-1 sudo[163916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:34:48 compute-1 python3.9[163918]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:34:48 compute-1 sudo[163916]: pam_unix(sudo:session): session closed for user root
Jan 26 08:34:49 compute-1 sudo[164069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnbbdzuilmonfvexrgozhefhrdhhaokz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416489.408311-426-182100366192016/AnsiballZ_stat.py'
Jan 26 08:34:49 compute-1 sudo[164069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:34:49 compute-1 python3.9[164071]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:34:50 compute-1 sudo[164069]: pam_unix(sudo:session): session closed for user root
Jan 26 08:34:50 compute-1 sudo[164221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhydtevhqdfrmqkpvngeoeitkebbgvly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416490.285144-444-240168864262798/AnsiballZ_stat.py'
Jan 26 08:34:50 compute-1 sudo[164221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:34:50 compute-1 python3.9[164223]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:34:50 compute-1 sudo[164221]: pam_unix(sudo:session): session closed for user root
Jan 26 08:34:51 compute-1 sudo[164344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nctjqqvhanamifxnhaopbqdyspfggyqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416490.285144-444-240168864262798/AnsiballZ_copy.py'
Jan 26 08:34:51 compute-1 sudo[164344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:34:51 compute-1 python3.9[164346]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769416490.285144-444-240168864262798/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:34:51 compute-1 sudo[164344]: pam_unix(sudo:session): session closed for user root
Jan 26 08:34:52 compute-1 sudo[164496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pglabhzsiuplqfdcwbqkusmmmxgodvsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416491.7058976-475-190963113213607/AnsiballZ_command.py'
Jan 26 08:34:52 compute-1 sudo[164496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:34:52 compute-1 python3.9[164498]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:34:52 compute-1 sudo[164496]: pam_unix(sudo:session): session closed for user root
Jan 26 08:34:52 compute-1 sudo[164649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ueuhfhzvllmznftluuhjrslbqpjwygtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416492.4498913-490-57410617023035/AnsiballZ_lineinfile.py'
Jan 26 08:34:52 compute-1 sudo[164649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:34:52 compute-1 python3.9[164651]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:34:53 compute-1 sudo[164649]: pam_unix(sudo:session): session closed for user root
Jan 26 08:34:53 compute-1 sudo[164801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odfcevfrepvoegtmcxvotqkrfwcxyqnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416493.2215073-506-58096952464950/AnsiballZ_replace.py'
Jan 26 08:34:53 compute-1 sudo[164801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:34:53 compute-1 python3.9[164803]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:34:53 compute-1 sudo[164801]: pam_unix(sudo:session): session closed for user root
Jan 26 08:34:54 compute-1 sudo[164953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oblnmblpyftvubuwzarzpklxgkjqisjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416494.1100342-522-262388962265802/AnsiballZ_replace.py'
Jan 26 08:34:54 compute-1 sudo[164953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:34:54 compute-1 python3.9[164955]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:34:54 compute-1 sudo[164953]: pam_unix(sudo:session): session closed for user root
Jan 26 08:34:55 compute-1 sudo[165105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbvsohetiysxjsljkmycdegfagxnxuyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416494.9103487-540-216031277900875/AnsiballZ_lineinfile.py'
Jan 26 08:34:55 compute-1 sudo[165105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:34:55 compute-1 python3.9[165107]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:34:55 compute-1 sudo[165105]: pam_unix(sudo:session): session closed for user root
Jan 26 08:34:56 compute-1 sudo[165257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tevahfnurofbatstbyyqtnexncjndzqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416495.679965-540-248553656534004/AnsiballZ_lineinfile.py'
Jan 26 08:34:56 compute-1 sudo[165257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:34:56 compute-1 python3.9[165259]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:34:56 compute-1 sudo[165257]: pam_unix(sudo:session): session closed for user root
Jan 26 08:34:56 compute-1 sudo[165409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcmjttjpgdyfionzgtayfjaojvnlcyox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416496.4322617-540-180576827452391/AnsiballZ_lineinfile.py'
Jan 26 08:34:56 compute-1 sudo[165409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:34:56 compute-1 python3.9[165411]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:34:56 compute-1 sudo[165409]: pam_unix(sudo:session): session closed for user root
Jan 26 08:34:57 compute-1 sudo[165561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvnxvllqtyyrndnpdcwgltoxncqtkhba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416497.1612911-540-178859871746495/AnsiballZ_lineinfile.py'
Jan 26 08:34:57 compute-1 sudo[165561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:34:57 compute-1 python3.9[165563]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:34:57 compute-1 sudo[165561]: pam_unix(sudo:session): session closed for user root
Jan 26 08:34:58 compute-1 sudo[165713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpuunroaszqcslqqoekswavjtqfrcyqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416497.88247-598-75928467895100/AnsiballZ_stat.py'
Jan 26 08:34:58 compute-1 sudo[165713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:34:58 compute-1 python3.9[165715]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:34:58 compute-1 sudo[165713]: pam_unix(sudo:session): session closed for user root
Jan 26 08:34:59 compute-1 sudo[165867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kngqmkwuwbfifyltypgoanncqhklyucw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416498.7266252-614-16679327268450/AnsiballZ_command.py'
Jan 26 08:34:59 compute-1 sudo[165867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:34:59 compute-1 python3.9[165869]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:34:59 compute-1 sudo[165867]: pam_unix(sudo:session): session closed for user root
Jan 26 08:34:59 compute-1 sudo[166020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-styxkryduoqvheqwyvknpignghwexafa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416499.531605-632-13272309182209/AnsiballZ_systemd_service.py'
Jan 26 08:34:59 compute-1 sudo[166020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:00 compute-1 python3.9[166022]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:35:00 compute-1 systemd[1]: Listening on multipathd control socket.
Jan 26 08:35:00 compute-1 sudo[166020]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:00 compute-1 sudo[166176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oriqesblcxlnfxwewlhmrbkrdytokrys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416500.5846677-648-202872160371622/AnsiballZ_systemd_service.py'
Jan 26 08:35:00 compute-1 sudo[166176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:01 compute-1 python3.9[166178]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:35:01 compute-1 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Jan 26 08:35:01 compute-1 udevadm[166183]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Jan 26 08:35:01 compute-1 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Jan 26 08:35:01 compute-1 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 26 08:35:01 compute-1 multipathd[166187]: --------start up--------
Jan 26 08:35:01 compute-1 multipathd[166187]: read /etc/multipath.conf
Jan 26 08:35:01 compute-1 multipathd[166187]: path checkers start up
Jan 26 08:35:01 compute-1 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 26 08:35:01 compute-1 sudo[166176]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:02 compute-1 sudo[166344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iewixubnismhlkuanujrcmmhhaqrumgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416502.0144172-673-84495400286561/AnsiballZ_file.py'
Jan 26 08:35:02 compute-1 sudo[166344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:02 compute-1 python3.9[166346]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 26 08:35:02 compute-1 sudo[166344]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:03 compute-1 sudo[166496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcvhhzwafwlhygkcgjnkscdqcblppdyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416502.8709254-688-78625705809181/AnsiballZ_modprobe.py'
Jan 26 08:35:03 compute-1 sudo[166496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:03 compute-1 python3.9[166498]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Jan 26 08:35:03 compute-1 kernel: Key type psk registered
Jan 26 08:35:03 compute-1 sudo[166496]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:03 compute-1 sudo[166659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arqmybfkxnureemriivqftcyjfulwzpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416503.6567786-704-79181331548908/AnsiballZ_stat.py'
Jan 26 08:35:03 compute-1 sudo[166659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:04 compute-1 python3.9[166661]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:35:04 compute-1 sudo[166659]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:04 compute-1 sudo[166782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jewpwqebepcblwdjewzjesxzseqnuoay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416503.6567786-704-79181331548908/AnsiballZ_copy.py'
Jan 26 08:35:04 compute-1 sudo[166782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:04 compute-1 python3.9[166784]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769416503.6567786-704-79181331548908/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:35:04 compute-1 sudo[166782]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:35:05.285 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:35:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:35:05.287 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:35:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:35:05.287 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:35:05 compute-1 sudo[166944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqjhkjwpuyvmbeianqdrkcmccozzpany ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416505.0675867-736-4390653492539/AnsiballZ_lineinfile.py'
Jan 26 08:35:05 compute-1 sudo[166944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:05 compute-1 podman[166908]: 2026-01-26 08:35:05.483439373 +0000 UTC m=+0.176276000 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 26 08:35:05 compute-1 python3.9[166953]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:35:05 compute-1 sudo[166944]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:06 compute-1 sudo[167113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xttmlpnfzdltebiuvoukxmoldjqmlnli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416505.8621216-753-233118234950527/AnsiballZ_systemd.py'
Jan 26 08:35:06 compute-1 sudo[167113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:06 compute-1 python3.9[167115]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 08:35:06 compute-1 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 26 08:35:06 compute-1 systemd[1]: Stopped Load Kernel Modules.
Jan 26 08:35:06 compute-1 systemd[1]: Stopping Load Kernel Modules...
Jan 26 08:35:06 compute-1 systemd[1]: Starting Load Kernel Modules...
Jan 26 08:35:06 compute-1 systemd[1]: Finished Load Kernel Modules.
Jan 26 08:35:06 compute-1 sudo[167113]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:07 compute-1 sudo[167269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oofqexujphrvslzyaohmemjfwpdfzoop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416506.8396957-768-165601731637556/AnsiballZ_dnf.py'
Jan 26 08:35:07 compute-1 sudo[167269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:07 compute-1 python3.9[167271]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 08:35:09 compute-1 systemd[1]: Reloading.
Jan 26 08:35:09 compute-1 systemd-rc-local-generator[167301]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:35:09 compute-1 systemd-sysv-generator[167307]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:35:10 compute-1 systemd[1]: Reloading.
Jan 26 08:35:10 compute-1 systemd-rc-local-generator[167339]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:35:10 compute-1 systemd-sysv-generator[167344]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:35:10 compute-1 virtproxyd[154148]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Jan 26 08:35:10 compute-1 virtproxyd[154148]: hostname: compute-1
Jan 26 08:35:10 compute-1 virtproxyd[154148]: nl_recv returned with error: No buffer space available
Jan 26 08:35:10 compute-1 systemd-logind[788]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 26 08:35:10 compute-1 systemd-logind[788]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 26 08:35:10 compute-1 podman[167383]: 2026-01-26 08:35:10.614821332 +0000 UTC m=+0.085524968 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 08:35:10 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 08:35:10 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 26 08:35:10 compute-1 systemd[1]: Reloading.
Jan 26 08:35:10 compute-1 systemd-rc-local-generator[167451]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:35:10 compute-1 systemd-sysv-generator[167456]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:35:10 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 08:35:11 compute-1 sudo[167269]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:12 compute-1 sudo[168750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iiquemabkqqcbryrqbvxkrdqkatqghfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416511.779917-784-76163845969222/AnsiballZ_systemd_service.py'
Jan 26 08:35:12 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 08:35:12 compute-1 sudo[168750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:12 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 26 08:35:12 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1.598s CPU time.
Jan 26 08:35:12 compute-1 systemd[1]: run-r5404be1d1e50474c926a6b6c0b54f723.service: Deactivated successfully.
Jan 26 08:35:12 compute-1 python3.9[168753]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 08:35:12 compute-1 systemd[1]: Stopping Open-iSCSI...
Jan 26 08:35:12 compute-1 iscsid[162214]: iscsid shutting down.
Jan 26 08:35:12 compute-1 systemd[1]: iscsid.service: Deactivated successfully.
Jan 26 08:35:12 compute-1 systemd[1]: Stopped Open-iSCSI.
Jan 26 08:35:12 compute-1 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 26 08:35:12 compute-1 systemd[1]: Starting Open-iSCSI...
Jan 26 08:35:12 compute-1 systemd[1]: Started Open-iSCSI.
Jan 26 08:35:12 compute-1 sudo[168750]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:13 compute-1 sudo[168907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvvfcxezpqfyslwzazmgwmuuflwzesqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416512.770266-800-131582620153298/AnsiballZ_systemd_service.py'
Jan 26 08:35:13 compute-1 sudo[168907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:13 compute-1 python3.9[168909]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 08:35:13 compute-1 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Jan 26 08:35:13 compute-1 multipathd[166187]: exit (signal)
Jan 26 08:35:13 compute-1 multipathd[166187]: --------shut down-------
Jan 26 08:35:13 compute-1 systemd[1]: multipathd.service: Deactivated successfully.
Jan 26 08:35:13 compute-1 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Jan 26 08:35:13 compute-1 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 26 08:35:13 compute-1 multipathd[168917]: --------start up--------
Jan 26 08:35:13 compute-1 multipathd[168917]: read /etc/multipath.conf
Jan 26 08:35:13 compute-1 multipathd[168917]: path checkers start up
Jan 26 08:35:13 compute-1 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 26 08:35:13 compute-1 sudo[168907]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:14 compute-1 sshd-session[168910]: Connection closed by authenticating user root 159.223.236.81 port 36488 [preauth]
Jan 26 08:35:14 compute-1 python3.9[169074]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 08:35:15 compute-1 sudo[169228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqfmhlncuoxaxllczkwvutxdlchfgyel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416515.127198-835-114425644621424/AnsiballZ_file.py'
Jan 26 08:35:15 compute-1 sudo[169228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:15 compute-1 python3.9[169230]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:35:15 compute-1 sudo[169228]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:16 compute-1 sudo[169380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyiwmxrisdovlmsqhcspmzeogwjjdhan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416516.2614975-857-50014061062639/AnsiballZ_systemd_service.py'
Jan 26 08:35:16 compute-1 sudo[169380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:16 compute-1 python3.9[169382]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 08:35:16 compute-1 systemd[1]: Reloading.
Jan 26 08:35:17 compute-1 systemd-rc-local-generator[169411]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:35:17 compute-1 systemd-sysv-generator[169414]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:35:17 compute-1 sudo[169380]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:18 compute-1 python3.9[169568]: ansible-ansible.builtin.service_facts Invoked
Jan 26 08:35:18 compute-1 network[169585]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 08:35:18 compute-1 network[169586]: 'network-scripts' will be removed from distribution in near future.
Jan 26 08:35:18 compute-1 network[169587]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 08:35:24 compute-1 sudo[169857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpllgbutxbbpsjbnreulcoiyfqdtpjfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416523.869299-895-161037917546079/AnsiballZ_systemd_service.py'
Jan 26 08:35:24 compute-1 sudo[169857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:24 compute-1 python3.9[169859]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:35:24 compute-1 sudo[169857]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:25 compute-1 sudo[170010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzuvegnmnwgpfotgdamrpyfptometvtv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416524.7080245-895-112529098532107/AnsiballZ_systemd_service.py'
Jan 26 08:35:25 compute-1 sudo[170010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:25 compute-1 python3.9[170012]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:35:25 compute-1 sudo[170010]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:26 compute-1 sudo[170163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gowbcxxronqeradmsgzazpyrpnultcdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416525.8129447-895-96428167072030/AnsiballZ_systemd_service.py'
Jan 26 08:35:26 compute-1 sudo[170163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:26 compute-1 python3.9[170165]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:35:26 compute-1 sudo[170163]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:27 compute-1 sudo[170316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjfobeaqzerxqvjqgmklcrbuzhcxuxgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416526.7123835-895-233828124831767/AnsiballZ_systemd_service.py'
Jan 26 08:35:27 compute-1 sudo[170316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:27 compute-1 python3.9[170318]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:35:27 compute-1 sudo[170316]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:27 compute-1 sudo[170469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjzbvyrkzkhkyntxzycetfgflayzvnmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416527.5637448-895-66089818908893/AnsiballZ_systemd_service.py'
Jan 26 08:35:27 compute-1 sudo[170469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:28 compute-1 python3.9[170471]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:35:28 compute-1 systemd[1]: virtnodedevd.service: Deactivated successfully.
Jan 26 08:35:29 compute-1 sudo[170469]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:29 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 26 08:35:29 compute-1 sudo[170624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfzwmvhttdvnvliryunleavyzepweoyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416529.4776158-895-18241521745600/AnsiballZ_systemd_service.py'
Jan 26 08:35:29 compute-1 sudo[170624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:30 compute-1 python3.9[170626]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:35:30 compute-1 systemd[1]: virtqemud.service: Deactivated successfully.
Jan 26 08:35:31 compute-1 sudo[170624]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:31 compute-1 sudo[170778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njplfyofhqqrbamvoqpqkxbsqsfeabsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416531.4009032-895-221388872305545/AnsiballZ_systemd_service.py'
Jan 26 08:35:31 compute-1 sudo[170778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:31 compute-1 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 26 08:35:32 compute-1 python3.9[170780]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:35:32 compute-1 sudo[170778]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:32 compute-1 sudo[170932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egtpqhzbcsjwifftslrcmanrndmcdmrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416532.3759758-895-79716652506302/AnsiballZ_systemd_service.py'
Jan 26 08:35:32 compute-1 sudo[170932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:33 compute-1 python3.9[170934]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:35:33 compute-1 sudo[170932]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:33 compute-1 sudo[171085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inkitwmzthyryhpurzsqpbrtxnqildue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416533.42568-1013-11900009715112/AnsiballZ_file.py'
Jan 26 08:35:33 compute-1 sudo[171085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:33 compute-1 python3.9[171087]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:35:33 compute-1 sudo[171085]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:34 compute-1 sudo[171237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlidxtffawckbdthowerrtuyprnpxjfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416534.146076-1013-69571516522891/AnsiballZ_file.py'
Jan 26 08:35:34 compute-1 sudo[171237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:34 compute-1 python3.9[171239]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:35:34 compute-1 sudo[171237]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:35 compute-1 sudo[171389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brjkbatemhvsbkbphlaziliceannmcbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416534.8479884-1013-275560071885924/AnsiballZ_file.py'
Jan 26 08:35:35 compute-1 sudo[171389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:35 compute-1 python3.9[171391]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:35:35 compute-1 sudo[171389]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:35 compute-1 sudo[171563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxhoonunytnpbmhpokarmyquktgtaohx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416535.532441-1013-81778734102219/AnsiballZ_file.py'
Jan 26 08:35:35 compute-1 sudo[171563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:35 compute-1 podman[171491]: 2026-01-26 08:35:35.907269656 +0000 UTC m=+0.163797228 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 26 08:35:36 compute-1 python3.9[171566]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:35:36 compute-1 sudo[171563]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:36 compute-1 sudo[171720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjfykfeilniqgzzvzzgfyfyxhyeectpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416536.201512-1013-11805245849270/AnsiballZ_file.py'
Jan 26 08:35:36 compute-1 sudo[171720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:36 compute-1 python3.9[171722]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:35:36 compute-1 sudo[171720]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:37 compute-1 sudo[171872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrabuxdtkvmpfclsbkankjjmqyesgbqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416536.908422-1013-146917561914534/AnsiballZ_file.py'
Jan 26 08:35:37 compute-1 sudo[171872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:37 compute-1 python3.9[171874]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:35:37 compute-1 sudo[171872]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:37 compute-1 sudo[172024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyetzkfewhnsjruoxbuiofgzbljnnbox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416537.6270676-1013-37439390854410/AnsiballZ_file.py'
Jan 26 08:35:37 compute-1 sudo[172024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:38 compute-1 python3.9[172026]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:35:38 compute-1 sudo[172024]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:38 compute-1 sudo[172176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhevofovmugbzobvrxbfqyiblpzqtnff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416538.3395743-1013-118129634288712/AnsiballZ_file.py'
Jan 26 08:35:38 compute-1 sudo[172176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:38 compute-1 python3.9[172178]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:35:38 compute-1 sudo[172176]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:39 compute-1 sudo[172328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohcxegoxukmplozgoejrtnvejhxnzmyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416539.0953562-1127-79373386525310/AnsiballZ_file.py'
Jan 26 08:35:39 compute-1 sudo[172328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:39 compute-1 python3.9[172330]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:35:39 compute-1 sudo[172328]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:40 compute-1 sudo[172480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpsguqkqeqcymyhioiidtskabibsyuuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416539.8560436-1127-151210091354560/AnsiballZ_file.py'
Jan 26 08:35:40 compute-1 sudo[172480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:40 compute-1 python3.9[172482]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:35:40 compute-1 sudo[172480]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:40 compute-1 podman[172559]: 2026-01-26 08:35:40.826709748 +0000 UTC m=+0.074128506 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 26 08:35:40 compute-1 sudo[172652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dineppcgjijkpqaykwrvxtnyqlweyiyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416540.6166027-1127-255489246627299/AnsiballZ_file.py'
Jan 26 08:35:40 compute-1 sudo[172652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:41 compute-1 python3.9[172654]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:35:41 compute-1 sudo[172652]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:41 compute-1 sudo[172804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbxvvdvtbnfaainonfmlknjcksbgeysm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416541.326911-1127-108852539837033/AnsiballZ_file.py'
Jan 26 08:35:41 compute-1 sudo[172804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:41 compute-1 python3.9[172806]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:35:41 compute-1 sudo[172804]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:42 compute-1 sudo[172956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzpramjoofkcszdiyptnodphapsizjjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416542.0380073-1127-266539066170817/AnsiballZ_file.py'
Jan 26 08:35:42 compute-1 sudo[172956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:42 compute-1 python3.9[172958]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:35:42 compute-1 sudo[172956]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:43 compute-1 sudo[173108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajrcjkgivpydtriktlyalrwbkavrdbfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416542.718843-1127-275706974566854/AnsiballZ_file.py'
Jan 26 08:35:43 compute-1 sudo[173108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:43 compute-1 python3.9[173110]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:35:43 compute-1 sudo[173108]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:43 compute-1 sudo[173260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkanmecplmosxpukmqkpuekwzjngigrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416543.4012742-1127-257250049662980/AnsiballZ_file.py'
Jan 26 08:35:43 compute-1 sudo[173260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:43 compute-1 python3.9[173262]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:35:43 compute-1 sudo[173260]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:44 compute-1 sudo[173412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgkfrapmpefnmuucemogyrnmsnwenkse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416544.0632625-1127-219724122984261/AnsiballZ_file.py'
Jan 26 08:35:44 compute-1 sudo[173412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:44 compute-1 python3.9[173414]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:35:44 compute-1 sudo[173412]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:45 compute-1 sudo[173564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fthkarduxxnqnepzzhesexkskpbztffu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416545.1432035-1243-181456125075968/AnsiballZ_command.py'
Jan 26 08:35:45 compute-1 sudo[173564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:45 compute-1 python3.9[173566]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:35:45 compute-1 sudo[173564]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:46 compute-1 python3.9[173718]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 08:35:48 compute-1 sudo[173868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-barozmfdfucpiyltlmqctsuqjgbumiio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416547.8302865-1279-134406145110217/AnsiballZ_systemd_service.py'
Jan 26 08:35:48 compute-1 sudo[173868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:48 compute-1 python3.9[173870]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 08:35:48 compute-1 systemd[1]: Reloading.
Jan 26 08:35:48 compute-1 systemd-rc-local-generator[173899]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:35:48 compute-1 systemd-sysv-generator[173903]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:35:48 compute-1 sudo[173868]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:49 compute-1 sudo[174055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmkkogavftrfldhxmjfrazdnqkgajecq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416549.1018393-1295-55704385066693/AnsiballZ_command.py'
Jan 26 08:35:49 compute-1 sudo[174055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:49 compute-1 python3.9[174057]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:35:49 compute-1 sudo[174055]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:50 compute-1 sudo[174208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvincyvywegfvbivpvjuzunvcdipbzgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416549.8731961-1295-245911549832788/AnsiballZ_command.py'
Jan 26 08:35:50 compute-1 sudo[174208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:50 compute-1 python3.9[174210]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:35:50 compute-1 sudo[174208]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:51 compute-1 sudo[174361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awghzrfynlgleeuzfxpnwfpsmyhtoppf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416550.648351-1295-121038568388457/AnsiballZ_command.py'
Jan 26 08:35:51 compute-1 sudo[174361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:51 compute-1 python3.9[174363]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:35:51 compute-1 sudo[174361]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:52 compute-1 sudo[174514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnjnggyujnqsrwcybtxpsmkcfxbtjhya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416551.6814241-1295-47897401302277/AnsiballZ_command.py'
Jan 26 08:35:52 compute-1 sudo[174514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:52 compute-1 python3.9[174516]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:35:52 compute-1 sudo[174514]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:52 compute-1 sudo[174667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obmgrwndkfpgwlsdajhhbbmzuemgyuqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416552.38648-1295-95333565690820/AnsiballZ_command.py'
Jan 26 08:35:52 compute-1 sudo[174667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:52 compute-1 python3.9[174669]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:35:52 compute-1 sudo[174667]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:53 compute-1 sudo[174820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhvnufejfgpybrohywnipepzzczowwjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416553.0371172-1295-141169218486252/AnsiballZ_command.py'
Jan 26 08:35:53 compute-1 sudo[174820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:53 compute-1 python3.9[174822]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:35:53 compute-1 sudo[174820]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:54 compute-1 sudo[174973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqyzumjtikvdjwrdiuczaqgiuecyqxwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416553.680573-1295-26828272448110/AnsiballZ_command.py'
Jan 26 08:35:54 compute-1 sudo[174973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:54 compute-1 python3.9[174975]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:35:54 compute-1 sudo[174973]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:54 compute-1 sudo[175126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwneknqnassrfinrlfzbesyyxlyokcke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416554.4038863-1295-43805718938248/AnsiballZ_command.py'
Jan 26 08:35:54 compute-1 sudo[175126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:54 compute-1 python3.9[175128]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:35:54 compute-1 sudo[175126]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:57 compute-1 sudo[175279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opevboxcqbonmoxjanczpkwzjihsmiap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416556.704065-1438-140558170751195/AnsiballZ_file.py'
Jan 26 08:35:57 compute-1 sudo[175279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:57 compute-1 python3.9[175281]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:35:57 compute-1 sudo[175279]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:57 compute-1 sudo[175431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yflxqdfqeyvllszjvsaheosadvfueovc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416557.4550734-1438-174624848229547/AnsiballZ_file.py'
Jan 26 08:35:57 compute-1 sudo[175431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:58 compute-1 python3.9[175433]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:35:58 compute-1 sudo[175431]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:58 compute-1 sudo[175583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhocecnergdlcwrgqypuykhniblbnsqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416558.2038581-1438-86732401381113/AnsiballZ_file.py'
Jan 26 08:35:58 compute-1 sudo[175583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:58 compute-1 python3.9[175585]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:35:58 compute-1 sudo[175583]: pam_unix(sudo:session): session closed for user root
Jan 26 08:35:59 compute-1 sudo[175735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vojdlhpiguulpbkycldklroloywuytct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416558.9744017-1482-265542556694695/AnsiballZ_file.py'
Jan 26 08:35:59 compute-1 sudo[175735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:35:59 compute-1 python3.9[175737]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:35:59 compute-1 sudo[175735]: pam_unix(sudo:session): session closed for user root
Jan 26 08:36:00 compute-1 sudo[175887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnbbkbazpytpfskrvkkxsnboopqzfscn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416559.808226-1482-53655721723992/AnsiballZ_file.py'
Jan 26 08:36:00 compute-1 sudo[175887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:36:00 compute-1 python3.9[175889]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:36:00 compute-1 sudo[175887]: pam_unix(sudo:session): session closed for user root
Jan 26 08:36:01 compute-1 sudo[176039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rksrlqwvlzevoclswlbmaucosdwbxkte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416560.546275-1482-66535724158333/AnsiballZ_file.py'
Jan 26 08:36:01 compute-1 sudo[176039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:36:01 compute-1 python3.9[176041]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:36:01 compute-1 sudo[176039]: pam_unix(sudo:session): session closed for user root
Jan 26 08:36:01 compute-1 sudo[176191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpqgwxsdeyxwvcimyeqdxyufujjkfktf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416561.4043264-1482-100196009806379/AnsiballZ_file.py'
Jan 26 08:36:01 compute-1 sudo[176191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:36:01 compute-1 python3.9[176193]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:36:01 compute-1 sudo[176191]: pam_unix(sudo:session): session closed for user root
Jan 26 08:36:02 compute-1 sudo[176343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnictfxgenpkmvknqeektuuvxmeizbbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416562.0864432-1482-25683160537618/AnsiballZ_file.py'
Jan 26 08:36:02 compute-1 sudo[176343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:36:02 compute-1 python3.9[176345]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:36:02 compute-1 sudo[176343]: pam_unix(sudo:session): session closed for user root
Jan 26 08:36:03 compute-1 sudo[176495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyezvqdunbousoyyjuzajzsskmmxgohp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416562.8343291-1482-195909021607703/AnsiballZ_file.py'
Jan 26 08:36:03 compute-1 sudo[176495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:36:03 compute-1 python3.9[176497]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:36:03 compute-1 sudo[176495]: pam_unix(sudo:session): session closed for user root
Jan 26 08:36:03 compute-1 sudo[176647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjfwctpubidugxcwsrmvnsvvkutgmdxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416563.5274353-1482-158443554548202/AnsiballZ_file.py'
Jan 26 08:36:03 compute-1 sudo[176647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:36:03 compute-1 python3.9[176649]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:36:04 compute-1 sudo[176647]: pam_unix(sudo:session): session closed for user root
Jan 26 08:36:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:36:05.286 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:36:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:36:05.287 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:36:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:36:05.288 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:36:06 compute-1 podman[176674]: 2026-01-26 08:36:06.823032224 +0000 UTC m=+0.082454191 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 08:36:08 compute-1 sudo[176825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-japwefpdseayyvgpoalqygjanrmgphks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416568.3174052-1719-167633174586347/AnsiballZ_getent.py'
Jan 26 08:36:08 compute-1 sudo[176825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:36:09 compute-1 python3.9[176827]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Jan 26 08:36:09 compute-1 sudo[176825]: pam_unix(sudo:session): session closed for user root
Jan 26 08:36:09 compute-1 sudo[176978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-foyjbktqvwwuvkexkkbfqoafndqjjksm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416569.3437989-1735-149070525548797/AnsiballZ_group.py'
Jan 26 08:36:09 compute-1 sudo[176978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:36:10 compute-1 python3.9[176980]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 08:36:10 compute-1 groupadd[176981]: group added to /etc/group: name=nova, GID=42436
Jan 26 08:36:10 compute-1 groupadd[176981]: group added to /etc/gshadow: name=nova
Jan 26 08:36:10 compute-1 groupadd[176981]: new group: name=nova, GID=42436
Jan 26 08:36:10 compute-1 sudo[176978]: pam_unix(sudo:session): session closed for user root
Jan 26 08:36:10 compute-1 sudo[177147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvdgbbtxkyapglaldarldiilnlceuqol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416570.4122684-1751-214040505959802/AnsiballZ_user.py'
Jan 26 08:36:10 compute-1 sudo[177147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:36:10 compute-1 podman[177110]: 2026-01-26 08:36:10.963814039 +0000 UTC m=+0.077777905 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 26 08:36:11 compute-1 python3.9[177155]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 26 08:36:11 compute-1 useradd[177160]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Jan 26 08:36:11 compute-1 useradd[177160]: add 'nova' to group 'libvirt'
Jan 26 08:36:11 compute-1 useradd[177160]: add 'nova' to shadow group 'libvirt'
Jan 26 08:36:11 compute-1 sudo[177147]: pam_unix(sudo:session): session closed for user root
Jan 26 08:36:12 compute-1 sshd-session[177191]: Accepted publickey for zuul from 192.168.122.30 port 50470 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 08:36:12 compute-1 systemd-logind[788]: New session 26 of user zuul.
Jan 26 08:36:12 compute-1 systemd[1]: Started Session 26 of User zuul.
Jan 26 08:36:12 compute-1 sshd-session[177191]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 08:36:12 compute-1 sshd-session[177194]: Received disconnect from 192.168.122.30 port 50470:11: disconnected by user
Jan 26 08:36:12 compute-1 sshd-session[177194]: Disconnected from user zuul 192.168.122.30 port 50470
Jan 26 08:36:12 compute-1 sshd-session[177191]: pam_unix(sshd:session): session closed for user zuul
Jan 26 08:36:12 compute-1 systemd[1]: session-26.scope: Deactivated successfully.
Jan 26 08:36:12 compute-1 systemd-logind[788]: Session 26 logged out. Waiting for processes to exit.
Jan 26 08:36:12 compute-1 systemd-logind[788]: Removed session 26.
Jan 26 08:36:13 compute-1 python3.9[177344]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:36:13 compute-1 python3.9[177465]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769416572.7493625-1801-180433806019890/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:36:14 compute-1 python3.9[177615]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:36:15 compute-1 python3.9[177691]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:36:15 compute-1 python3.9[177841]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:36:16 compute-1 python3.9[177962]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769416575.3207872-1801-138221264241697/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:36:17 compute-1 python3.9[178112]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:36:17 compute-1 python3.9[178233]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769416576.8096998-1801-150276752313077/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bc7f3bb7d4094c596a18178a888511b54e157ba4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:36:18 compute-1 python3.9[178383]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:36:19 compute-1 python3.9[178504]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769416578.1199644-1801-231805102049466/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:36:20 compute-1 python3.9[178654]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:36:20 compute-1 python3.9[178775]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769416579.4578223-1801-90241167973809/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:36:21 compute-1 sudo[178926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvigaxtkxtmximwebpzypkzuapadqrsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416580.9166448-1967-131605859618146/AnsiballZ_file.py'
Jan 26 08:36:21 compute-1 sudo[178926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:36:21 compute-1 python3.9[178929]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:36:21 compute-1 sudo[178926]: pam_unix(sudo:session): session closed for user root
Jan 26 08:36:22 compute-1 sshd-session[178879]: Connection closed by authenticating user root 159.223.236.81 port 57772 [preauth]
Jan 26 08:36:22 compute-1 sudo[179079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxpydhqnvfvwmobtdabsjrwwbkghalsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416581.9356053-1983-156482361957105/AnsiballZ_copy.py'
Jan 26 08:36:22 compute-1 sudo[179079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:36:22 compute-1 python3.9[179081]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:36:22 compute-1 sudo[179079]: pam_unix(sudo:session): session closed for user root
Jan 26 08:36:23 compute-1 sudo[179231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpbehaaogqvzmvjorfvadyzznudpzdkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416582.8072598-1999-3160954519232/AnsiballZ_stat.py'
Jan 26 08:36:23 compute-1 sudo[179231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:36:23 compute-1 python3.9[179233]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:36:23 compute-1 sudo[179231]: pam_unix(sudo:session): session closed for user root
Jan 26 08:36:23 compute-1 sudo[179383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhkiaukeiocfiaxbcrsxjufwmfupdwwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416583.672986-2017-152692901599080/AnsiballZ_stat.py'
Jan 26 08:36:23 compute-1 sudo[179383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:36:24 compute-1 python3.9[179385]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:36:24 compute-1 sudo[179383]: pam_unix(sudo:session): session closed for user root
Jan 26 08:36:24 compute-1 sudo[179506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fihabqxxzcfoluxmhvlojviknntuxoez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416583.672986-2017-152692901599080/AnsiballZ_copy.py'
Jan 26 08:36:24 compute-1 sudo[179506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:36:24 compute-1 python3.9[179508]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1769416583.672986-2017-152692901599080/.source _original_basename=.og8b__pz follow=False checksum=1c400cae94614586e5a1e2b7588522856093ef78 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Jan 26 08:36:25 compute-1 sudo[179506]: pam_unix(sudo:session): session closed for user root
Jan 26 08:36:25 compute-1 python3.9[179660]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:36:26 compute-1 python3.9[179812]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:36:27 compute-1 python3.9[179933]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769416586.2370882-2067-64371572278426/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=aff5546b44cf4461a7541a94e4cce1332c9b58b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:36:28 compute-1 python3.9[180083]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:36:28 compute-1 python3.9[180204]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769416587.6364238-2097-134215622258328/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:36:29 compute-1 sudo[180354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pggigvagefwrfvhmxyeoglkdzylmpdkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416589.1660943-2131-261408252930156/AnsiballZ_container_config_data.py'
Jan 26 08:36:29 compute-1 sudo[180354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:36:29 compute-1 python3.9[180356]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Jan 26 08:36:29 compute-1 sudo[180354]: pam_unix(sudo:session): session closed for user root
Jan 26 08:36:30 compute-1 sudo[180506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mecbpsgxficoibmgowqguvyufsvnxckq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416590.207082-2153-227338686954490/AnsiballZ_container_config_hash.py'
Jan 26 08:36:30 compute-1 sudo[180506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:36:30 compute-1 python3.9[180508]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 26 08:36:30 compute-1 sudo[180506]: pam_unix(sudo:session): session closed for user root
Jan 26 08:36:31 compute-1 sudo[180658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfpuoesylaenqtiefbquxxhtvnptknrc ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769416591.2571943-2173-264995217519706/AnsiballZ_edpm_container_manage.py'
Jan 26 08:36:31 compute-1 sudo[180658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:36:32 compute-1 python3[180660]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 26 08:36:32 compute-1 podman[180697]: 2026-01-26 08:36:32.386906554 +0000 UTC m=+0.076368557 container create 51e970b76d0c7b5ad916b10ba5ea51ec0d6abd2c52bc170b9cf52ad0431fa110 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 26 08:36:32 compute-1 podman[180697]: 2026-01-26 08:36:32.347532564 +0000 UTC m=+0.036994617 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 26 08:36:32 compute-1 python3[180660]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Jan 26 08:36:32 compute-1 sudo[180658]: pam_unix(sudo:session): session closed for user root
Jan 26 08:36:33 compute-1 sudo[180885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwrnyywhemfrssteqjlylgjneqjseznj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416592.805213-2189-140791224137617/AnsiballZ_stat.py'
Jan 26 08:36:33 compute-1 sudo[180885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:36:33 compute-1 python3.9[180887]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:36:33 compute-1 sudo[180885]: pam_unix(sudo:session): session closed for user root
Jan 26 08:36:34 compute-1 sudo[181039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yknbqktkicgrrkmdeufxzuysjvkwwesj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416594.0183737-2213-2157527571125/AnsiballZ_container_config_data.py'
Jan 26 08:36:34 compute-1 sudo[181039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:36:34 compute-1 python3.9[181041]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Jan 26 08:36:34 compute-1 sudo[181039]: pam_unix(sudo:session): session closed for user root
Jan 26 08:36:35 compute-1 sudo[181191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvfmgkwmtruirypdysanufrijpcztpcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416595.0439053-2235-199961908186753/AnsiballZ_container_config_hash.py'
Jan 26 08:36:35 compute-1 sudo[181191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:36:35 compute-1 python3.9[181193]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 26 08:36:35 compute-1 sudo[181191]: pam_unix(sudo:session): session closed for user root
Jan 26 08:36:36 compute-1 sudo[181343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjhipexkfgqflntqokocpqygiqidxmwu ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769416596.236957-2255-275583372746856/AnsiballZ_edpm_container_manage.py'
Jan 26 08:36:36 compute-1 sudo[181343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:36:36 compute-1 python3[181345]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 26 08:36:37 compute-1 podman[181382]: 2026-01-26 08:36:37.154796202 +0000 UTC m=+0.056795200 container create 8f96135535b280f182dc1f45e2c8ecacc1c65e2eb6ca70622f64f1c31d5768dd (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, container_name=nova_compute)
Jan 26 08:36:37 compute-1 podman[181382]: 2026-01-26 08:36:37.12833919 +0000 UTC m=+0.030338168 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 26 08:36:37 compute-1 python3[181345]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Jan 26 08:36:37 compute-1 sudo[181343]: pam_unix(sudo:session): session closed for user root
Jan 26 08:36:37 compute-1 podman[181509]: 2026-01-26 08:36:37.867378366 +0000 UTC m=+0.130719680 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 26 08:36:37 compute-1 sudo[181596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwsyrwzwstlskcjucoocdjoddvmiyfvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416597.5730553-2271-54103098263516/AnsiballZ_stat.py'
Jan 26 08:36:37 compute-1 sudo[181596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:36:38 compute-1 python3.9[181598]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:36:38 compute-1 sudo[181596]: pam_unix(sudo:session): session closed for user root
Jan 26 08:36:38 compute-1 sudo[181750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hchnpjcmwonykcttfmmycnavtwwfhjxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416598.4320502-2289-256650676729150/AnsiballZ_file.py'
Jan 26 08:36:38 compute-1 sudo[181750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:36:39 compute-1 python3.9[181752]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:36:39 compute-1 sudo[181750]: pam_unix(sudo:session): session closed for user root
Jan 26 08:36:39 compute-1 sudo[181901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whjzkkxeoyfznwkdvpyodffdgeayzzzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416599.1059268-2289-64667986372364/AnsiballZ_copy.py'
Jan 26 08:36:39 compute-1 sudo[181901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:36:39 compute-1 python3.9[181903]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769416599.1059268-2289-64667986372364/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:36:39 compute-1 sudo[181901]: pam_unix(sudo:session): session closed for user root
Jan 26 08:36:40 compute-1 sudo[181977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zoyewwkeytomegbjllxveuavsvyrnkff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416599.1059268-2289-64667986372364/AnsiballZ_systemd.py'
Jan 26 08:36:40 compute-1 sudo[181977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:36:40 compute-1 python3.9[181979]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 08:36:40 compute-1 systemd[1]: Reloading.
Jan 26 08:36:40 compute-1 systemd-sysv-generator[182005]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:36:40 compute-1 systemd-rc-local-generator[182002]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:36:40 compute-1 sudo[181977]: pam_unix(sudo:session): session closed for user root
Jan 26 08:36:41 compute-1 sudo[182101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnnxocnwrofvmxtqqkzbdkhtigixzxna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416599.1059268-2289-64667986372364/AnsiballZ_systemd.py'
Jan 26 08:36:41 compute-1 sudo[182101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:36:41 compute-1 podman[182062]: 2026-01-26 08:36:41.309534583 +0000 UTC m=+0.086494760 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 26 08:36:41 compute-1 python3.9[182109]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:36:41 compute-1 systemd[1]: Reloading.
Jan 26 08:36:41 compute-1 systemd-sysv-generator[182143]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:36:41 compute-1 systemd-rc-local-generator[182140]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:36:41 compute-1 systemd[1]: Starting nova_compute container...
Jan 26 08:36:42 compute-1 systemd[1]: Started libcrun container.
Jan 26 08:36:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/853db5c305a65b2c72404f25a0ad80e9e27f1f1e3499bb8f9f0d18250b4276f1/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 26 08:36:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/853db5c305a65b2c72404f25a0ad80e9e27f1f1e3499bb8f9f0d18250b4276f1/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 26 08:36:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/853db5c305a65b2c72404f25a0ad80e9e27f1f1e3499bb8f9f0d18250b4276f1/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 26 08:36:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/853db5c305a65b2c72404f25a0ad80e9e27f1f1e3499bb8f9f0d18250b4276f1/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 26 08:36:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/853db5c305a65b2c72404f25a0ad80e9e27f1f1e3499bb8f9f0d18250b4276f1/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 26 08:36:42 compute-1 podman[182149]: 2026-01-26 08:36:42.121919154 +0000 UTC m=+0.129220050 container init 8f96135535b280f182dc1f45e2c8ecacc1c65e2eb6ca70622f64f1c31d5768dd (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, managed_by=edpm_ansible)
Jan 26 08:36:42 compute-1 podman[182149]: 2026-01-26 08:36:42.13552581 +0000 UTC m=+0.142826656 container start 8f96135535b280f182dc1f45e2c8ecacc1c65e2eb6ca70622f64f1c31d5768dd (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 26 08:36:42 compute-1 podman[182149]: nova_compute
Jan 26 08:36:42 compute-1 nova_compute[182165]: + sudo -E kolla_set_configs
Jan 26 08:36:42 compute-1 systemd[1]: Started nova_compute container.
Jan 26 08:36:42 compute-1 sudo[182101]: pam_unix(sudo:session): session closed for user root
Jan 26 08:36:42 compute-1 nova_compute[182165]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 26 08:36:42 compute-1 nova_compute[182165]: INFO:__main__:Validating config file
Jan 26 08:36:42 compute-1 nova_compute[182165]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 26 08:36:42 compute-1 nova_compute[182165]: INFO:__main__:Copying service configuration files
Jan 26 08:36:42 compute-1 nova_compute[182165]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 26 08:36:42 compute-1 nova_compute[182165]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 26 08:36:42 compute-1 nova_compute[182165]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 26 08:36:42 compute-1 nova_compute[182165]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 26 08:36:42 compute-1 nova_compute[182165]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 26 08:36:42 compute-1 nova_compute[182165]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 26 08:36:42 compute-1 nova_compute[182165]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 26 08:36:42 compute-1 nova_compute[182165]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 26 08:36:42 compute-1 nova_compute[182165]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 26 08:36:42 compute-1 nova_compute[182165]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 26 08:36:42 compute-1 nova_compute[182165]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 26 08:36:42 compute-1 nova_compute[182165]: INFO:__main__:Deleting /etc/ceph
Jan 26 08:36:42 compute-1 nova_compute[182165]: INFO:__main__:Creating directory /etc/ceph
Jan 26 08:36:42 compute-1 nova_compute[182165]: INFO:__main__:Setting permission for /etc/ceph
Jan 26 08:36:42 compute-1 nova_compute[182165]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 26 08:36:42 compute-1 nova_compute[182165]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 26 08:36:42 compute-1 nova_compute[182165]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 26 08:36:42 compute-1 nova_compute[182165]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 26 08:36:42 compute-1 nova_compute[182165]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 26 08:36:42 compute-1 nova_compute[182165]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 26 08:36:42 compute-1 nova_compute[182165]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 26 08:36:42 compute-1 nova_compute[182165]: INFO:__main__:Writing out command to execute
Jan 26 08:36:42 compute-1 nova_compute[182165]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 26 08:36:42 compute-1 nova_compute[182165]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 26 08:36:42 compute-1 nova_compute[182165]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 26 08:36:42 compute-1 nova_compute[182165]: ++ cat /run_command
Jan 26 08:36:42 compute-1 nova_compute[182165]: + CMD=nova-compute
Jan 26 08:36:42 compute-1 nova_compute[182165]: + ARGS=
Jan 26 08:36:42 compute-1 nova_compute[182165]: + sudo kolla_copy_cacerts
Jan 26 08:36:42 compute-1 nova_compute[182165]: + [[ ! -n '' ]]
Jan 26 08:36:42 compute-1 nova_compute[182165]: + . kolla_extend_start
Jan 26 08:36:42 compute-1 nova_compute[182165]: Running command: 'nova-compute'
Jan 26 08:36:42 compute-1 nova_compute[182165]: + echo 'Running command: '\''nova-compute'\'''
Jan 26 08:36:42 compute-1 nova_compute[182165]: + umask 0022
Jan 26 08:36:42 compute-1 nova_compute[182165]: + exec nova-compute
Jan 26 08:36:43 compute-1 python3.9[182327]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:36:44 compute-1 python3.9[182477]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:36:44 compute-1 nova_compute[182165]: 2026-01-26 08:36:44.196 182169 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 26 08:36:44 compute-1 nova_compute[182165]: 2026-01-26 08:36:44.196 182169 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 26 08:36:44 compute-1 nova_compute[182165]: 2026-01-26 08:36:44.196 182169 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 26 08:36:44 compute-1 nova_compute[182165]: 2026-01-26 08:36:44.196 182169 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Jan 26 08:36:44 compute-1 nova_compute[182165]: 2026-01-26 08:36:44.326 182169 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:36:44 compute-1 nova_compute[182165]: 2026-01-26 08:36:44.356 182169 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:36:44 compute-1 nova_compute[182165]: 2026-01-26 08:36:44.356 182169 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 26 08:36:45 compute-1 python3.9[182631]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:36:45 compute-1 nova_compute[182165]: 2026-01-26 08:36:45.086 182169 INFO nova.virt.driver [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Jan 26 08:36:45 compute-1 nova_compute[182165]: 2026-01-26 08:36:45.340 182169 INFO nova.compute.provider_config [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.169 182169 DEBUG oslo_concurrency.lockutils [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.170 182169 DEBUG oslo_concurrency.lockutils [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.170 182169 DEBUG oslo_concurrency.lockutils [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.171 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.171 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.171 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.171 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.171 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.172 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.172 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.172 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.172 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.172 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.173 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.173 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.173 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.173 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.174 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.174 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.174 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.174 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.174 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.175 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.175 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.175 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.175 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.175 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.176 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.176 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.176 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.176 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.177 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.177 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.177 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.177 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.177 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.178 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.178 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.178 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.178 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.179 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.179 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.179 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.179 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.180 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.180 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.180 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.180 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.181 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.181 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.181 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.181 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.181 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.182 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.182 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.182 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.182 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.183 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.183 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.183 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.183 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.183 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.183 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.184 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.184 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.184 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.184 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.184 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.185 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.185 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.185 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.185 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.186 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.186 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.186 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.186 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.186 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.187 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.187 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.187 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.187 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.187 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.187 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.187 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.188 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.188 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.188 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.188 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.188 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.188 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.188 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.189 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.189 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.189 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.189 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.189 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.189 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.189 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.189 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.190 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.190 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.190 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.190 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.190 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.190 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.190 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.190 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.191 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.191 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.191 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.191 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.191 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.191 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.191 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.192 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.192 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.192 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.192 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.192 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.192 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.192 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.193 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.193 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.193 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.193 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.193 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.193 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.193 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.193 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.194 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.194 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.194 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.194 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.194 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.194 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.194 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.194 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.195 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.195 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.195 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.195 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.195 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.195 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.195 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.195 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.196 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.196 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.196 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.196 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.196 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.196 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.196 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.197 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.197 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.197 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.197 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.197 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.197 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.197 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.198 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.198 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.198 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.198 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.198 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.198 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.198 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.198 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.199 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.199 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.199 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.199 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.199 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.199 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.200 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.200 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.200 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.200 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.200 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.200 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.200 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.201 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.201 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.201 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.201 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.201 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.201 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.201 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.202 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.202 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.202 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.202 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.202 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.202 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.202 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.203 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.203 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.203 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.203 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.203 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.203 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.203 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.204 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.204 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.204 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.204 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.204 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.204 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.204 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.204 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.205 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.205 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.205 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.205 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.205 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.205 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.205 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.206 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.206 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.206 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.206 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.206 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.206 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.206 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.206 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.207 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.207 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.207 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.207 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.207 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.207 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.207 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.208 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.208 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.208 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.208 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.208 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.208 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.208 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.208 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.209 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.209 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.209 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.209 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.209 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.209 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.209 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.210 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.210 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.210 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.210 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.210 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.210 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.210 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.210 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.211 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.211 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.211 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.211 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.211 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.211 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.211 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.212 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.212 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.212 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.212 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.212 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.212 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.212 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.213 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.213 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.213 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.213 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.213 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.213 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.213 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.213 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.214 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.214 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.214 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.214 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.214 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.214 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.214 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.214 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.215 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.215 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.215 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.215 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.215 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.215 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.215 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.216 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.216 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.216 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.216 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.216 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.216 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.216 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.217 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.217 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.217 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.217 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.217 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.217 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.217 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.218 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.218 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.218 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.218 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.218 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.218 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.218 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.219 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.219 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.219 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.219 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.219 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.219 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.219 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.219 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.220 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.220 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.220 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.220 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.220 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.220 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.220 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.221 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.221 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.221 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.221 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.221 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.221 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.221 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.221 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.222 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.222 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.222 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.222 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.222 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.222 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.222 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.223 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.223 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.223 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.223 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.223 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.223 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.223 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.223 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.224 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.224 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.224 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.224 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.224 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.224 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.224 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.225 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.225 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.225 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.225 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.225 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.225 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.226 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.226 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.226 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.226 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.226 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.226 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.226 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.226 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.227 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.227 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.227 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.227 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.227 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.227 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.227 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.228 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.228 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.228 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.228 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.228 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.228 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.228 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.228 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.229 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.229 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.229 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.229 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.229 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.229 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.229 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.230 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.230 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.230 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.230 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.230 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.230 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.230 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.230 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.231 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.231 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.231 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.231 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.231 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.231 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.231 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.232 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.232 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.232 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.232 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.232 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.232 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.232 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.232 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.233 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.233 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.233 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.233 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.233 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.233 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.234 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.234 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.234 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.234 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.234 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.234 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.234 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.235 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.235 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.235 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.235 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.235 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.235 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.235 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.235 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.236 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.236 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.236 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.236 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.236 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.236 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.236 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.237 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.237 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.237 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.237 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.237 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.237 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.237 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.237 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.238 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.238 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.238 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.238 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.238 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.238 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.239 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.239 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.239 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.239 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.239 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.239 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.239 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.239 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.240 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.240 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.240 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.240 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.240 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.240 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.240 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.241 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.241 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.241 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.241 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.241 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.241 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.241 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.241 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.242 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.242 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.242 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.242 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.242 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.242 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.242 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.243 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.243 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.243 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.243 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.243 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.243 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.244 182169 WARNING oslo_config.cfg [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 26 08:36:46 compute-1 nova_compute[182165]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 26 08:36:46 compute-1 nova_compute[182165]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 26 08:36:46 compute-1 nova_compute[182165]: and ``live_migration_inbound_addr`` respectively.
Jan 26 08:36:46 compute-1 nova_compute[182165]: ).  Its value may be silently ignored in the future.
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.244 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.244 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.244 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.244 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.244 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.245 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.245 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.245 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.245 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.245 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.245 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.245 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.246 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.246 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.246 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.246 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.246 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.246 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.246 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.247 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.247 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.247 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.247 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.247 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.247 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.247 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.248 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.248 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.248 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.248 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.248 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.248 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.249 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.249 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.249 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.249 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.249 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.249 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.249 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.250 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.250 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.250 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.250 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.250 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.250 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.250 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.251 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.251 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.251 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.251 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.251 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.252 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.252 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.252 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.252 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.252 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.252 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.252 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.253 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.253 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.253 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.253 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.253 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.253 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.253 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.254 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.254 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.254 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.254 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.254 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.254 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.254 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.255 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.255 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.255 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.255 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.255 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.255 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.255 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.255 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.256 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.256 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.256 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.256 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.256 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.257 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.257 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.257 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.257 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.257 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.257 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.257 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.258 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.258 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.258 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.258 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.258 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.258 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.259 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.259 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.259 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.259 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.259 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.259 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.259 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.260 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.260 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.260 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.260 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.260 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.260 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.260 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.261 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.261 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.261 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.261 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.261 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.261 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.261 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.262 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.262 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.262 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.262 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.262 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.262 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.263 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.263 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.263 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.263 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.263 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.263 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.263 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.264 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.264 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.264 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.264 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.264 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.264 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.265 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.265 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.265 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.265 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.265 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.265 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.266 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.266 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.266 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.266 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.266 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.266 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.266 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.267 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.267 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.267 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.267 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.267 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.267 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.267 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.268 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.268 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.268 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.268 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.268 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.268 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.269 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.269 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.269 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.269 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.269 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.269 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.269 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.270 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.270 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.270 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.270 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.270 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.270 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.271 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.271 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.271 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.271 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.271 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.271 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.271 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.272 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.272 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.272 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.272 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.272 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.273 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.273 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.273 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.273 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.273 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.273 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.273 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.274 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.274 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.274 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.274 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.274 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.274 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.275 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.275 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.275 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.275 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.275 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.275 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.275 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.276 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.276 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.276 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.276 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.276 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.276 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.276 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.277 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.277 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.277 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.277 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.277 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.277 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.278 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.278 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.278 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.278 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.278 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.278 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.279 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.279 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.279 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.279 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.279 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.279 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.279 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.280 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.280 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.280 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.280 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.280 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.280 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.280 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.281 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.281 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.281 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.281 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.281 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.281 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.282 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.282 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.282 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.282 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.282 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.282 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.283 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.283 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.283 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.283 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.283 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.283 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.283 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.284 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.284 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.284 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.284 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.284 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.284 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.285 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.285 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.285 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.285 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.285 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.285 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.285 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.285 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.286 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.286 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.286 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.286 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.286 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.286 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.286 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.287 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.287 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.287 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.287 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.287 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.288 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.288 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.288 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.288 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.288 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.288 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.288 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.289 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.289 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.289 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.289 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.289 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.289 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.289 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.290 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.290 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.290 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.290 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.290 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.290 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.290 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.291 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.291 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.291 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.291 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.291 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.291 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.291 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.292 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.292 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.292 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.292 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.292 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.292 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.293 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.293 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.293 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.293 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.293 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.293 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.293 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.293 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.294 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.294 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.294 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.294 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.294 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.294 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.294 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.295 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.295 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.295 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.295 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.295 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.295 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.296 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.296 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.296 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.296 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.296 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.296 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.297 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.297 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.297 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.297 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.297 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.297 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.297 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.298 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.298 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.298 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.298 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.298 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.299 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.299 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.299 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.299 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.299 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.299 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.299 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.300 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.300 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.300 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.300 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.300 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.300 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.300 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.301 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.301 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.301 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.301 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.301 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.301 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.301 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.302 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.302 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.302 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.302 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.302 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.302 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.302 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.303 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.303 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.303 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.303 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.303 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.303 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.303 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.303 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.304 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.304 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.304 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.304 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.304 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.304 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.304 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.305 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.305 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.305 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.305 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.305 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.306 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.306 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.306 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.306 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.306 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.306 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.306 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.307 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.307 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.307 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.307 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.307 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.307 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.307 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.307 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.308 182169 DEBUG oslo_service.service [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.309 182169 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.354 182169 DEBUG nova.virt.libvirt.host [None req-257acea2-b228-4e06-a436-2e23740637a8 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.355 182169 DEBUG nova.virt.libvirt.host [None req-257acea2-b228-4e06-a436-2e23740637a8 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.355 182169 DEBUG nova.virt.libvirt.host [None req-257acea2-b228-4e06-a436-2e23740637a8 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.356 182169 DEBUG nova.virt.libvirt.host [None req-257acea2-b228-4e06-a436-2e23740637a8 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Jan 26 08:36:46 compute-1 systemd[1]: Starting libvirt QEMU daemon...
Jan 26 08:36:46 compute-1 systemd[1]: Started libvirt QEMU daemon.
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.431 182169 DEBUG nova.virt.libvirt.host [None req-257acea2-b228-4e06-a436-2e23740637a8 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fcd9ad6cdc0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.435 182169 DEBUG nova.virt.libvirt.host [None req-257acea2-b228-4e06-a436-2e23740637a8 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fcd9ad6cdc0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.437 182169 INFO nova.virt.libvirt.driver [None req-257acea2-b228-4e06-a436-2e23740637a8 - - - - - -] Connection event '1' reason 'None'
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.483 182169 WARNING nova.virt.libvirt.driver [None req-257acea2-b228-4e06-a436-2e23740637a8 - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Jan 26 08:36:46 compute-1 nova_compute[182165]: 2026-01-26 08:36:46.483 182169 DEBUG nova.virt.libvirt.volume.mount [None req-257acea2-b228-4e06-a436-2e23740637a8 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 26 08:36:46 compute-1 sudo[182832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psbomwjpooxbjsdrrznkaovccrthpjpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416605.5541272-2409-23178438486661/AnsiballZ_podman_container.py'
Jan 26 08:36:46 compute-1 sudo[182832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:36:46 compute-1 python3.9[182835]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 26 08:36:46 compute-1 sudo[182832]: pam_unix(sudo:session): session closed for user root
Jan 26 08:36:46 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 08:36:46 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 08:36:47 compute-1 nova_compute[182165]: 2026-01-26 08:36:47.394 182169 INFO nova.virt.libvirt.host [None req-257acea2-b228-4e06-a436-2e23740637a8 - - - - - -] Libvirt host capabilities <capabilities>
Jan 26 08:36:47 compute-1 nova_compute[182165]: 
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <host>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <uuid>99f84307-5f5c-4de6-9e22-fda82fab04a3</uuid>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <cpu>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <arch>x86_64</arch>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model>EPYC-Rome-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <vendor>AMD</vendor>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <microcode version='16777317'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <signature family='23' model='49' stepping='0'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <maxphysaddr mode='emulate' bits='40'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature name='x2apic'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature name='tsc-deadline'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature name='osxsave'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature name='hypervisor'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature name='tsc_adjust'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature name='spec-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature name='stibp'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature name='arch-capabilities'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature name='ssbd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature name='cmp_legacy'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature name='topoext'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature name='virt-ssbd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature name='lbrv'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature name='tsc-scale'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature name='vmcb-clean'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature name='pause-filter'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature name='pfthreshold'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature name='svme-addr-chk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature name='rdctl-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature name='skip-l1dfl-vmentry'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature name='mds-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature name='pschange-mc-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <pages unit='KiB' size='4'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <pages unit='KiB' size='2048'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <pages unit='KiB' size='1048576'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </cpu>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <power_management>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <suspend_mem/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <suspend_disk/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <suspend_hybrid/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </power_management>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <iommu support='no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <migration_features>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <live/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <uri_transports>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <uri_transport>tcp</uri_transport>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <uri_transport>rdma</uri_transport>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </uri_transports>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </migration_features>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <topology>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <cells num='1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <cell id='0'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:           <memory unit='KiB'>16109544</memory>
Jan 26 08:36:47 compute-1 nova_compute[182165]:           <pages unit='KiB' size='4'>4027386</pages>
Jan 26 08:36:47 compute-1 nova_compute[182165]:           <pages unit='KiB' size='2048'>0</pages>
Jan 26 08:36:47 compute-1 nova_compute[182165]:           <pages unit='KiB' size='1048576'>0</pages>
Jan 26 08:36:47 compute-1 nova_compute[182165]:           <distances>
Jan 26 08:36:47 compute-1 nova_compute[182165]:             <sibling id='0' value='10'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:           </distances>
Jan 26 08:36:47 compute-1 nova_compute[182165]:           <cpus num='8'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:           </cpus>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         </cell>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </cells>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </topology>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <cache>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </cache>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <secmodel>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model>selinux</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <doi>0</doi>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </secmodel>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <secmodel>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model>dac</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <doi>0</doi>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <baselabel type='kvm'>+107:+107</baselabel>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <baselabel type='qemu'>+107:+107</baselabel>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </secmodel>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   </host>
Jan 26 08:36:47 compute-1 nova_compute[182165]: 
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <guest>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <os_type>hvm</os_type>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <arch name='i686'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <wordsize>32</wordsize>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <domain type='qemu'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <domain type='kvm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </arch>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <features>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <pae/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <nonpae/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <acpi default='on' toggle='yes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <apic default='on' toggle='no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <cpuselection/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <deviceboot/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <disksnapshot default='on' toggle='no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <externalSnapshot/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </features>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   </guest>
Jan 26 08:36:47 compute-1 nova_compute[182165]: 
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <guest>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <os_type>hvm</os_type>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <arch name='x86_64'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <wordsize>64</wordsize>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <domain type='qemu'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <domain type='kvm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </arch>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <features>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <acpi default='on' toggle='yes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <apic default='on' toggle='no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <cpuselection/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <deviceboot/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <disksnapshot default='on' toggle='no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <externalSnapshot/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </features>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   </guest>
Jan 26 08:36:47 compute-1 nova_compute[182165]: 
Jan 26 08:36:47 compute-1 nova_compute[182165]: </capabilities>
Jan 26 08:36:47 compute-1 nova_compute[182165]: 
Jan 26 08:36:47 compute-1 nova_compute[182165]: 2026-01-26 08:36:47.405 182169 DEBUG nova.virt.libvirt.host [None req-257acea2-b228-4e06-a436-2e23740637a8 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 26 08:36:47 compute-1 nova_compute[182165]: 2026-01-26 08:36:47.438 182169 DEBUG nova.virt.libvirt.host [None req-257acea2-b228-4e06-a436-2e23740637a8 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 26 08:36:47 compute-1 nova_compute[182165]: <domainCapabilities>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <path>/usr/libexec/qemu-kvm</path>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <domain>kvm</domain>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <arch>i686</arch>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <vcpu max='240'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <iothreads supported='yes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <os supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <enum name='firmware'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <loader supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='type'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>rom</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>pflash</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='readonly'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>yes</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>no</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='secure'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>no</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </loader>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   </os>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <cpu>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <mode name='host-passthrough' supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='hostPassthroughMigratable'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>on</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>off</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </mode>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <mode name='maximum' supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='maximumMigratable'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>on</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>off</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </mode>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <mode name='host-model' supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <vendor>AMD</vendor>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='x2apic'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='tsc-deadline'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='hypervisor'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='tsc_adjust'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='spec-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='stibp'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='ssbd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='cmp_legacy'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='overflow-recov'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='succor'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='ibrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='amd-ssbd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='virt-ssbd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='lbrv'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='tsc-scale'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='vmcb-clean'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='flushbyasid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='pause-filter'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='pfthreshold'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='svme-addr-chk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='disable' name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </mode>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <mode name='custom' supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Broadwell'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Broadwell-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Broadwell-noTSX'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Broadwell-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Broadwell-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Broadwell-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Broadwell-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cascadelake-Server'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cascadelake-Server-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cascadelake-Server-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cascadelake-Server-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cascadelake-Server-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cascadelake-Server-v5'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='ClearwaterForest'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni-int16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bhi-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bhi-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cmpccxadd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ddpd-u'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='intel-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ipred-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='lam'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='prefetchiti'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rrsba-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sha512'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sm3'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sm4'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='ClearwaterForest-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni-int16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bhi-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bhi-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cmpccxadd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ddpd-u'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='intel-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ipred-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='lam'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='prefetchiti'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rrsba-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sha512'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sm3'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sm4'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cooperlake'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cooperlake-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cooperlake-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Denverton'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mpx'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Denverton-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mpx'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Denverton-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Denverton-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Dhyana-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Genoa'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amd-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='auto-ibrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='stibp-always-on'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Genoa-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amd-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='auto-ibrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='stibp-always-on'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Genoa-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amd-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='auto-ibrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fs-gs-base-ns'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='perfmon-v2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='stibp-always-on'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Milan'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Milan-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Milan-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amd-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='stibp-always-on'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Milan-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amd-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='stibp-always-on'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Rome'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Rome-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Rome-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Rome-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Turin'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amd-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='auto-ibrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vp2intersect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fs-gs-base-ns'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibpb-brtype'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='perfmon-v2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='prefetchi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbpb'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='srso-user-kernel-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='stibp-always-on'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Turin-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amd-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='auto-ibrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vp2intersect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fs-gs-base-ns'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibpb-brtype'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='perfmon-v2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='prefetchi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbpb'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='srso-user-kernel-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='stibp-always-on'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-v5'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='GraniteRapids'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='prefetchiti'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='GraniteRapids-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='prefetchiti'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='GraniteRapids-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx10'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx10-128'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx10-256'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx10-512'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='prefetchiti'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='GraniteRapids-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx10'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx10-128'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx10-256'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx10-512'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='prefetchiti'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Haswell'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Haswell-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Haswell-noTSX'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Haswell-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Haswell-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Haswell-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Haswell-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server-noTSX'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server-v5'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server-v6'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server-v7'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='IvyBridge'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='IvyBridge-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='IvyBridge-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='IvyBridge-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='KnightsMill'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-4fmaps'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-4vnniw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512er'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512pf'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='KnightsMill-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-4fmaps'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-4vnniw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512er'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512pf'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Opteron_G4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fma4'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xop'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Opteron_G4-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fma4'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xop'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Opteron_G5'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fma4'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tbm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xop'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Opteron_G5-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fma4'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tbm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xop'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SapphireRapids'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SapphireRapids-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SapphireRapids-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SapphireRapids-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SapphireRapids-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SierraForest'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cmpccxadd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SierraForest-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cmpccxadd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SierraForest-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bhi-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cmpccxadd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='intel-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ipred-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='lam'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rrsba-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SierraForest-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bhi-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cmpccxadd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='intel-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ipred-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='lam'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rrsba-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Client'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Client-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Client-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Client-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Client-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Client-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Server'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Server-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Server-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Server-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Server-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Server-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Server-v5'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Snowridge'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='core-capability'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mpx'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='split-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Snowridge-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='core-capability'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mpx'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='split-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Snowridge-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='core-capability'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='split-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Snowridge-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='core-capability'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='split-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Snowridge-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='athlon'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='3dnow'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='3dnowext'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='athlon-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='3dnow'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='3dnowext'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='core2duo'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='core2duo-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='coreduo'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='coreduo-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='n270'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='n270-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='phenom'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='3dnow'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='3dnowext'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='phenom-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='3dnow'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='3dnowext'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </mode>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   </cpu>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <memoryBacking supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <enum name='sourceType'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <value>file</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <value>anonymous</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <value>memfd</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   </memoryBacking>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <devices>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <disk supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='diskDevice'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>disk</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>cdrom</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>floppy</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>lun</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='bus'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>ide</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>fdc</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>scsi</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtio</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>usb</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>sata</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='model'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtio</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtio-transitional</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtio-non-transitional</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </disk>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <graphics supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='type'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>vnc</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>egl-headless</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>dbus</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </graphics>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <video supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='modelType'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>vga</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>cirrus</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtio</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>none</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>bochs</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>ramfb</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </video>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <hostdev supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='mode'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>subsystem</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='startupPolicy'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>default</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>mandatory</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>requisite</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>optional</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='subsysType'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>usb</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>pci</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>scsi</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='capsType'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='pciBackend'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </hostdev>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <rng supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='model'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtio</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtio-transitional</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtio-non-transitional</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='backendModel'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>random</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>egd</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>builtin</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </rng>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <filesystem supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='driverType'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>path</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>handle</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtiofs</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </filesystem>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <tpm supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='model'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>tpm-tis</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>tpm-crb</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='backendModel'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>emulator</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>external</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='backendVersion'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>2.0</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </tpm>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <redirdev supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='bus'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>usb</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </redirdev>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <channel supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='type'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>pty</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>unix</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </channel>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <crypto supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='model'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='type'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>qemu</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='backendModel'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>builtin</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </crypto>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <interface supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='backendType'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>default</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>passt</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </interface>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <panic supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='model'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>isa</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>hyperv</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </panic>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <console supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='type'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>null</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>vc</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>pty</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>dev</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>file</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>pipe</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>stdio</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>udp</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>tcp</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>unix</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>qemu-vdagent</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>dbus</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </console>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   </devices>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <features>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <gic supported='no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <vmcoreinfo supported='yes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <genid supported='yes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <backingStoreInput supported='yes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <backup supported='yes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <async-teardown supported='yes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <s390-pv supported='no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <ps2 supported='yes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <tdx supported='no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <sev supported='no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <sgx supported='no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <hyperv supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='features'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>relaxed</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>vapic</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>spinlocks</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>vpindex</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>runtime</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>synic</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>stimer</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>reset</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>vendor_id</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>frequencies</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>reenlightenment</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>tlbflush</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>ipi</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>avic</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>emsr_bitmap</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>xmm_input</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <defaults>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <spinlocks>4095</spinlocks>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <stimer_direct>on</stimer_direct>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <tlbflush_direct>on</tlbflush_direct>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <tlbflush_extended>on</tlbflush_extended>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </defaults>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </hyperv>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <launchSecurity supported='no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   </features>
Jan 26 08:36:47 compute-1 nova_compute[182165]: </domainCapabilities>
Jan 26 08:36:47 compute-1 nova_compute[182165]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 26 08:36:47 compute-1 nova_compute[182165]: 2026-01-26 08:36:47.448 182169 DEBUG nova.virt.libvirt.host [None req-257acea2-b228-4e06-a436-2e23740637a8 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 26 08:36:47 compute-1 nova_compute[182165]: <domainCapabilities>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <path>/usr/libexec/qemu-kvm</path>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <domain>kvm</domain>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <arch>i686</arch>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <vcpu max='4096'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <iothreads supported='yes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <os supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <enum name='firmware'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <loader supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='type'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>rom</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>pflash</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='readonly'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>yes</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>no</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='secure'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>no</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </loader>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   </os>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <cpu>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <mode name='host-passthrough' supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='hostPassthroughMigratable'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>on</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>off</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </mode>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <mode name='maximum' supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='maximumMigratable'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>on</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>off</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </mode>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <mode name='host-model' supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <vendor>AMD</vendor>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='x2apic'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='tsc-deadline'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='hypervisor'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='tsc_adjust'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='spec-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='stibp'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='ssbd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='cmp_legacy'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='overflow-recov'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='succor'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='ibrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='amd-ssbd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='virt-ssbd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='lbrv'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='tsc-scale'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='vmcb-clean'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='flushbyasid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='pause-filter'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='pfthreshold'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='svme-addr-chk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='disable' name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </mode>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <mode name='custom' supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Broadwell'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Broadwell-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Broadwell-noTSX'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Broadwell-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Broadwell-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Broadwell-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Broadwell-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cascadelake-Server'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cascadelake-Server-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cascadelake-Server-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cascadelake-Server-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cascadelake-Server-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cascadelake-Server-v5'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='ClearwaterForest'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni-int16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bhi-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bhi-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cmpccxadd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ddpd-u'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='intel-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ipred-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='lam'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='prefetchiti'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rrsba-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sha512'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sm3'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sm4'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='ClearwaterForest-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni-int16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bhi-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bhi-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cmpccxadd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ddpd-u'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='intel-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ipred-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='lam'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='prefetchiti'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rrsba-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sha512'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sm3'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sm4'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cooperlake'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cooperlake-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cooperlake-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Denverton'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mpx'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Denverton-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mpx'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Denverton-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Denverton-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Dhyana-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Genoa'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amd-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='auto-ibrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='stibp-always-on'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Genoa-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amd-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='auto-ibrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='stibp-always-on'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Genoa-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amd-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='auto-ibrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fs-gs-base-ns'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='perfmon-v2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='stibp-always-on'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Milan'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Milan-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Milan-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amd-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='stibp-always-on'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Milan-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amd-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='stibp-always-on'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Rome'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Rome-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Rome-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Rome-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Turin'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amd-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='auto-ibrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vp2intersect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fs-gs-base-ns'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibpb-brtype'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='perfmon-v2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='prefetchi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbpb'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='srso-user-kernel-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='stibp-always-on'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Turin-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amd-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='auto-ibrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vp2intersect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fs-gs-base-ns'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibpb-brtype'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='perfmon-v2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='prefetchi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbpb'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='srso-user-kernel-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='stibp-always-on'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-v5'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='GraniteRapids'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='prefetchiti'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='GraniteRapids-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='prefetchiti'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='GraniteRapids-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx10'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx10-128'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx10-256'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx10-512'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='prefetchiti'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='GraniteRapids-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx10'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx10-128'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx10-256'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx10-512'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='prefetchiti'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Haswell'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Haswell-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Haswell-noTSX'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Haswell-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Haswell-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Haswell-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Haswell-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server-noTSX'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server-v5'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server-v6'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server-v7'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='IvyBridge'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='IvyBridge-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='IvyBridge-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='IvyBridge-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='KnightsMill'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-4fmaps'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-4vnniw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512er'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512pf'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='KnightsMill-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-4fmaps'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-4vnniw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512er'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512pf'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Opteron_G4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fma4'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xop'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Opteron_G4-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fma4'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xop'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Opteron_G5'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fma4'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tbm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xop'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Opteron_G5-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fma4'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tbm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xop'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SapphireRapids'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SapphireRapids-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SapphireRapids-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SapphireRapids-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SapphireRapids-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SierraForest'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cmpccxadd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SierraForest-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cmpccxadd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SierraForest-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bhi-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cmpccxadd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='intel-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ipred-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='lam'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rrsba-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SierraForest-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bhi-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cmpccxadd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='intel-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ipred-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='lam'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rrsba-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Client'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Client-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Client-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Client-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Client-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Client-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Server'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Server-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Server-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Server-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Server-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Server-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Server-v5'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Snowridge'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='core-capability'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mpx'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='split-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Snowridge-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='core-capability'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mpx'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='split-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Snowridge-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='core-capability'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='split-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Snowridge-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='core-capability'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='split-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Snowridge-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='athlon'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='3dnow'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='3dnowext'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='athlon-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='3dnow'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='3dnowext'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='core2duo'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='core2duo-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='coreduo'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='coreduo-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='n270'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='n270-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='phenom'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='3dnow'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='3dnowext'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='phenom-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='3dnow'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='3dnowext'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </mode>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   </cpu>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <memoryBacking supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <enum name='sourceType'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <value>file</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <value>anonymous</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <value>memfd</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   </memoryBacking>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <devices>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <disk supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='diskDevice'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>disk</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>cdrom</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>floppy</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>lun</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='bus'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>fdc</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>scsi</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtio</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>usb</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>sata</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='model'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtio</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtio-transitional</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtio-non-transitional</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </disk>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <graphics supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='type'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>vnc</value>
Jan 26 08:36:47 compute-1 sudo[183021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruukplddiawwyfqhyddblhqucrtxkajq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416607.2313116-2425-272883236032962/AnsiballZ_systemd.py'
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>egl-headless</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>dbus</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </graphics>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <video supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='modelType'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>vga</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>cirrus</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtio</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>none</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>bochs</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>ramfb</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </video>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <hostdev supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='mode'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>subsystem</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='startupPolicy'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>default</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>mandatory</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>requisite</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>optional</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='subsysType'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>usb</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>pci</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>scsi</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='capsType'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='pciBackend'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </hostdev>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <rng supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='model'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtio</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtio-transitional</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtio-non-transitional</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='backendModel'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>random</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>egd</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>builtin</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </rng>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <filesystem supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='driverType'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>path</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>handle</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtiofs</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </filesystem>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <tpm supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='model'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>tpm-tis</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>tpm-crb</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='backendModel'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>emulator</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>external</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='backendVersion'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>2.0</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </tpm>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <redirdev supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='bus'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>usb</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </redirdev>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <channel supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='type'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>pty</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>unix</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </channel>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <crypto supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='model'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='type'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>qemu</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='backendModel'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>builtin</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </crypto>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <interface supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='backendType'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>default</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>passt</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </interface>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <panic supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='model'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>isa</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>hyperv</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </panic>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <console supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='type'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>null</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>vc</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>pty</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>dev</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>file</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>pipe</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>stdio</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>udp</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>tcp</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>unix</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>qemu-vdagent</value>
Jan 26 08:36:47 compute-1 sudo[183021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>dbus</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </console>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   </devices>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <features>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <gic supported='no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <vmcoreinfo supported='yes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <genid supported='yes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <backingStoreInput supported='yes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <backup supported='yes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <async-teardown supported='yes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <s390-pv supported='no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <ps2 supported='yes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <tdx supported='no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <sev supported='no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <sgx supported='no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <hyperv supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='features'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>relaxed</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>vapic</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>spinlocks</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>vpindex</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>runtime</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>synic</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>stimer</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>reset</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>vendor_id</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>frequencies</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>reenlightenment</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>tlbflush</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>ipi</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>avic</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>emsr_bitmap</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>xmm_input</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <defaults>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <spinlocks>4095</spinlocks>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <stimer_direct>on</stimer_direct>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <tlbflush_direct>on</tlbflush_direct>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <tlbflush_extended>on</tlbflush_extended>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </defaults>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </hyperv>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <launchSecurity supported='no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   </features>
Jan 26 08:36:47 compute-1 nova_compute[182165]: </domainCapabilities>
Jan 26 08:36:47 compute-1 nova_compute[182165]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 26 08:36:47 compute-1 nova_compute[182165]: 2026-01-26 08:36:47.519 182169 DEBUG nova.virt.libvirt.host [None req-257acea2-b228-4e06-a436-2e23740637a8 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 26 08:36:47 compute-1 nova_compute[182165]: 2026-01-26 08:36:47.526 182169 DEBUG nova.virt.libvirt.host [None req-257acea2-b228-4e06-a436-2e23740637a8 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 26 08:36:47 compute-1 nova_compute[182165]: <domainCapabilities>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <path>/usr/libexec/qemu-kvm</path>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <domain>kvm</domain>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <arch>x86_64</arch>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <vcpu max='240'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <iothreads supported='yes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <os supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <enum name='firmware'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <loader supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='type'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>rom</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>pflash</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='readonly'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>yes</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>no</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='secure'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>no</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </loader>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   </os>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <cpu>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <mode name='host-passthrough' supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='hostPassthroughMigratable'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>on</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>off</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </mode>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <mode name='maximum' supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='maximumMigratable'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>on</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>off</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </mode>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <mode name='host-model' supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <vendor>AMD</vendor>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='x2apic'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='tsc-deadline'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='hypervisor'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='tsc_adjust'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='spec-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='stibp'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='ssbd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='cmp_legacy'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='overflow-recov'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='succor'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='ibrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='amd-ssbd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='virt-ssbd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='lbrv'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='tsc-scale'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='vmcb-clean'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='flushbyasid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='pause-filter'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='pfthreshold'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='svme-addr-chk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='disable' name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </mode>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <mode name='custom' supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Broadwell'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Broadwell-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Broadwell-noTSX'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Broadwell-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Broadwell-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Broadwell-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Broadwell-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cascadelake-Server'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cascadelake-Server-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cascadelake-Server-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cascadelake-Server-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cascadelake-Server-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cascadelake-Server-v5'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='ClearwaterForest'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni-int16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bhi-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bhi-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cmpccxadd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ddpd-u'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='intel-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ipred-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='lam'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='prefetchiti'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rrsba-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sha512'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sm3'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sm4'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='ClearwaterForest-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni-int16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bhi-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bhi-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cmpccxadd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ddpd-u'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='intel-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ipred-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='lam'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='prefetchiti'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rrsba-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sha512'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sm3'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sm4'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cooperlake'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cooperlake-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cooperlake-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Denverton'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mpx'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Denverton-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mpx'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Denverton-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Denverton-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Dhyana-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Genoa'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amd-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='auto-ibrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='stibp-always-on'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Genoa-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amd-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='auto-ibrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='stibp-always-on'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Genoa-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amd-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='auto-ibrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fs-gs-base-ns'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='perfmon-v2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='stibp-always-on'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Milan'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Milan-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Milan-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amd-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='stibp-always-on'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Milan-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amd-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='stibp-always-on'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Rome'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Rome-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Rome-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Rome-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Turin'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amd-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='auto-ibrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vp2intersect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fs-gs-base-ns'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibpb-brtype'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='perfmon-v2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='prefetchi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbpb'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='srso-user-kernel-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='stibp-always-on'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Turin-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amd-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='auto-ibrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vp2intersect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fs-gs-base-ns'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibpb-brtype'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='perfmon-v2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='prefetchi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbpb'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='srso-user-kernel-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='stibp-always-on'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-v5'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='GraniteRapids'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='prefetchiti'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='GraniteRapids-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='prefetchiti'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='GraniteRapids-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx10'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx10-128'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx10-256'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx10-512'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='prefetchiti'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='GraniteRapids-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx10'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx10-128'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx10-256'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx10-512'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='prefetchiti'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Haswell'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Haswell-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Haswell-noTSX'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Haswell-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Haswell-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Haswell-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Haswell-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server-noTSX'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server-v5'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server-v6'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server-v7'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='IvyBridge'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='IvyBridge-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='IvyBridge-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='IvyBridge-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='KnightsMill'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-4fmaps'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-4vnniw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512er'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512pf'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='KnightsMill-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-4fmaps'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-4vnniw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512er'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512pf'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Opteron_G4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fma4'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xop'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Opteron_G4-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fma4'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xop'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Opteron_G5'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fma4'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tbm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xop'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Opteron_G5-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fma4'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tbm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xop'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SapphireRapids'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SapphireRapids-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SapphireRapids-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SapphireRapids-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SapphireRapids-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SierraForest'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cmpccxadd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SierraForest-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cmpccxadd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SierraForest-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bhi-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cmpccxadd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='intel-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ipred-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='lam'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rrsba-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SierraForest-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bhi-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cmpccxadd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='intel-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ipred-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='lam'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rrsba-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Client'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Client-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Client-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Client-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Client-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Client-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Server'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Server-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Server-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Server-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Server-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Server-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Server-v5'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Snowridge'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='core-capability'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mpx'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='split-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Snowridge-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='core-capability'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mpx'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='split-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Snowridge-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='core-capability'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='split-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Snowridge-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='core-capability'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='split-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Snowridge-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='athlon'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='3dnow'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='3dnowext'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='athlon-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='3dnow'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='3dnowext'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='core2duo'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='core2duo-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='coreduo'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='coreduo-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='n270'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='n270-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='phenom'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='3dnow'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='3dnowext'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='phenom-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='3dnow'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='3dnowext'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </mode>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   </cpu>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <memoryBacking supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <enum name='sourceType'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <value>file</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <value>anonymous</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <value>memfd</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   </memoryBacking>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <devices>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <disk supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='diskDevice'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>disk</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>cdrom</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>floppy</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>lun</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='bus'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>ide</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>fdc</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>scsi</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtio</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>usb</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>sata</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='model'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtio</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtio-transitional</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtio-non-transitional</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </disk>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <graphics supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='type'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>vnc</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>egl-headless</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>dbus</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </graphics>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <video supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='modelType'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>vga</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>cirrus</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtio</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>none</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>bochs</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>ramfb</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </video>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <hostdev supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='mode'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>subsystem</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='startupPolicy'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>default</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>mandatory</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>requisite</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>optional</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='subsysType'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>usb</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>pci</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>scsi</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='capsType'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='pciBackend'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </hostdev>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <rng supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='model'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtio</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtio-transitional</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtio-non-transitional</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='backendModel'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>random</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>egd</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>builtin</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </rng>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <filesystem supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='driverType'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>path</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>handle</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtiofs</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </filesystem>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <tpm supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='model'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>tpm-tis</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>tpm-crb</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='backendModel'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>emulator</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>external</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='backendVersion'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>2.0</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </tpm>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <redirdev supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='bus'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>usb</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </redirdev>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <channel supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='type'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>pty</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>unix</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </channel>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <crypto supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='model'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='type'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>qemu</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='backendModel'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>builtin</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </crypto>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <interface supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='backendType'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>default</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>passt</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </interface>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <panic supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='model'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>isa</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>hyperv</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </panic>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <console supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='type'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>null</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>vc</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>pty</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>dev</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>file</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>pipe</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>stdio</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>udp</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>tcp</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>unix</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>qemu-vdagent</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>dbus</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </console>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   </devices>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <features>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <gic supported='no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <vmcoreinfo supported='yes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <genid supported='yes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <backingStoreInput supported='yes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <backup supported='yes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <async-teardown supported='yes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <s390-pv supported='no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <ps2 supported='yes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <tdx supported='no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <sev supported='no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <sgx supported='no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <hyperv supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='features'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>relaxed</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>vapic</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>spinlocks</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>vpindex</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>runtime</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>synic</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>stimer</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>reset</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>vendor_id</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>frequencies</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>reenlightenment</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>tlbflush</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>ipi</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>avic</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>emsr_bitmap</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>xmm_input</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <defaults>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <spinlocks>4095</spinlocks>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <stimer_direct>on</stimer_direct>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <tlbflush_direct>on</tlbflush_direct>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <tlbflush_extended>on</tlbflush_extended>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </defaults>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </hyperv>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <launchSecurity supported='no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   </features>
Jan 26 08:36:47 compute-1 nova_compute[182165]: </domainCapabilities>
Jan 26 08:36:47 compute-1 nova_compute[182165]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 26 08:36:47 compute-1 nova_compute[182165]: 2026-01-26 08:36:47.600 182169 DEBUG nova.virt.libvirt.host [None req-257acea2-b228-4e06-a436-2e23740637a8 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 26 08:36:47 compute-1 nova_compute[182165]: <domainCapabilities>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <path>/usr/libexec/qemu-kvm</path>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <domain>kvm</domain>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <arch>x86_64</arch>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <vcpu max='4096'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <iothreads supported='yes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <os supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <enum name='firmware'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <value>efi</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <loader supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='type'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>rom</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>pflash</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='readonly'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>yes</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>no</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='secure'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>yes</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>no</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </loader>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   </os>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <cpu>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <mode name='host-passthrough' supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='hostPassthroughMigratable'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>on</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>off</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </mode>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <mode name='maximum' supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='maximumMigratable'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>on</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>off</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </mode>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <mode name='host-model' supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <vendor>AMD</vendor>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='x2apic'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='tsc-deadline'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='hypervisor'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='tsc_adjust'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='spec-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='stibp'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='ssbd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='cmp_legacy'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='overflow-recov'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='succor'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='ibrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='amd-ssbd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='virt-ssbd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='lbrv'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='tsc-scale'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='vmcb-clean'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='flushbyasid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='pause-filter'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='pfthreshold'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='svme-addr-chk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <feature policy='disable' name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </mode>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <mode name='custom' supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Broadwell'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Broadwell-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Broadwell-noTSX'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Broadwell-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Broadwell-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Broadwell-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Broadwell-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cascadelake-Server'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cascadelake-Server-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cascadelake-Server-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cascadelake-Server-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cascadelake-Server-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cascadelake-Server-v5'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='ClearwaterForest'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni-int16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bhi-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bhi-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cmpccxadd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ddpd-u'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='intel-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ipred-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='lam'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='prefetchiti'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rrsba-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sha512'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sm3'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sm4'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='ClearwaterForest-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni-int16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bhi-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bhi-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cmpccxadd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ddpd-u'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='intel-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ipred-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='lam'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='prefetchiti'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rrsba-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sha512'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sm3'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sm4'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cooperlake'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cooperlake-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Cooperlake-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Denverton'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mpx'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Denverton-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mpx'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Denverton-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Denverton-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Dhyana-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Genoa'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amd-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='auto-ibrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='stibp-always-on'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Genoa-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amd-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='auto-ibrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='stibp-always-on'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Genoa-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amd-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='auto-ibrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fs-gs-base-ns'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='perfmon-v2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='stibp-always-on'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Milan'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Milan-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Milan-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amd-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='stibp-always-on'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Milan-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amd-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='stibp-always-on'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Rome'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Rome-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Rome-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Rome-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Turin'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amd-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='auto-ibrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vp2intersect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fs-gs-base-ns'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibpb-brtype'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='perfmon-v2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='prefetchi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbpb'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='srso-user-kernel-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='stibp-always-on'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-Turin-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amd-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='auto-ibrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vp2intersect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fs-gs-base-ns'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibpb-brtype'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='perfmon-v2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='prefetchi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbpb'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='srso-user-kernel-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='stibp-always-on'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='EPYC-v5'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='GraniteRapids'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='prefetchiti'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='GraniteRapids-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='prefetchiti'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='GraniteRapids-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx10'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx10-128'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx10-256'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx10-512'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='prefetchiti'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='GraniteRapids-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx10'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx10-128'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx10-256'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx10-512'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='prefetchiti'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Haswell'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Haswell-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Haswell-noTSX'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Haswell-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Haswell-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Haswell-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Haswell-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server-noTSX'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server-v5'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server-v6'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Icelake-Server-v7'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='IvyBridge'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='IvyBridge-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='IvyBridge-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='IvyBridge-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='KnightsMill'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-4fmaps'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-4vnniw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512er'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512pf'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='KnightsMill-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-4fmaps'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-4vnniw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512er'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512pf'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Opteron_G4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fma4'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xop'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Opteron_G4-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fma4'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xop'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Opteron_G5'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fma4'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tbm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xop'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Opteron_G5-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fma4'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tbm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xop'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SapphireRapids'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SapphireRapids-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SapphireRapids-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SapphireRapids-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SapphireRapids-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='amx-tile'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-bf16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-fp16'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bitalg'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrc'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fzrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='la57'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='taa-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SierraForest'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cmpccxadd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SierraForest-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cmpccxadd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SierraForest-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bhi-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cmpccxadd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='intel-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ipred-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='lam'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rrsba-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='SierraForest-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ifma'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bhi-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cmpccxadd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fbsdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='fsrs'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ibrs-all'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='intel-psfd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ipred-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='lam'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mcdt-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pbrsb-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='psdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rrsba-ctrl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='serialize'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vaes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Client'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Client-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Client-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Client-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Client-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Client-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Server'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Server-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Server-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Server-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='hle'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='rtm'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Server-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Server-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Skylake-Server-v5'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512bw'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512cd'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512dq'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512f'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='avx512vl'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='invpcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pcid'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='pku'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Snowridge'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='core-capability'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mpx'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='split-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Snowridge-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='core-capability'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='mpx'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='split-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Snowridge-v2'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='core-capability'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='split-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Snowridge-v3'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='core-capability'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='split-lock-detect'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='Snowridge-v4'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='cldemote'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='erms'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='gfni'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdir64b'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='movdiri'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='xsaves'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='athlon'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='3dnow'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='3dnowext'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='athlon-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='3dnow'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='3dnowext'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='core2duo'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='core2duo-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='coreduo'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='coreduo-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='n270'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='n270-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='ss'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='phenom'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='3dnow'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='3dnowext'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <blockers model='phenom-v1'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='3dnow'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <feature name='3dnowext'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </blockers>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </mode>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   </cpu>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <memoryBacking supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <enum name='sourceType'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <value>file</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <value>anonymous</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <value>memfd</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   </memoryBacking>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <devices>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <disk supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='diskDevice'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>disk</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>cdrom</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>floppy</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>lun</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='bus'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>fdc</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>scsi</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtio</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>usb</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>sata</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='model'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtio</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtio-transitional</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtio-non-transitional</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </disk>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <graphics supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='type'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>vnc</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>egl-headless</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>dbus</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </graphics>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <video supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='modelType'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>vga</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>cirrus</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtio</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>none</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>bochs</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>ramfb</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </video>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <hostdev supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='mode'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>subsystem</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='startupPolicy'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>default</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>mandatory</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>requisite</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>optional</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='subsysType'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>usb</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>pci</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>scsi</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='capsType'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='pciBackend'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </hostdev>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <rng supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='model'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtio</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtio-transitional</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtio-non-transitional</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='backendModel'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>random</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>egd</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>builtin</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </rng>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <filesystem supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='driverType'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>path</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>handle</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>virtiofs</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </filesystem>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <tpm supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='model'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>tpm-tis</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>tpm-crb</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='backendModel'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>emulator</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>external</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='backendVersion'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>2.0</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </tpm>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <redirdev supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='bus'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>usb</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </redirdev>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <channel supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='type'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>pty</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>unix</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </channel>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <crypto supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='model'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='type'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>qemu</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='backendModel'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>builtin</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </crypto>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <interface supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='backendType'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>default</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>passt</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </interface>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <panic supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='model'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>isa</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>hyperv</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </panic>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <console supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='type'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>null</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>vc</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>pty</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>dev</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>file</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>pipe</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>stdio</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>udp</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>tcp</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>unix</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>qemu-vdagent</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>dbus</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </console>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   </devices>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   <features>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <gic supported='no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <vmcoreinfo supported='yes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <genid supported='yes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <backingStoreInput supported='yes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <backup supported='yes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <async-teardown supported='yes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <s390-pv supported='no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <ps2 supported='yes'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <tdx supported='no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <sev supported='no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <sgx supported='no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <hyperv supported='yes'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <enum name='features'>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>relaxed</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>vapic</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>spinlocks</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>vpindex</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>runtime</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>synic</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>stimer</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>reset</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>vendor_id</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>frequencies</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>reenlightenment</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>tlbflush</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>ipi</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>avic</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>emsr_bitmap</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <value>xmm_input</value>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </enum>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       <defaults>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <spinlocks>4095</spinlocks>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <stimer_direct>on</stimer_direct>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <tlbflush_direct>on</tlbflush_direct>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <tlbflush_extended>on</tlbflush_extended>
Jan 26 08:36:47 compute-1 nova_compute[182165]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 26 08:36:47 compute-1 nova_compute[182165]:       </defaults>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     </hyperv>
Jan 26 08:36:47 compute-1 nova_compute[182165]:     <launchSecurity supported='no'/>
Jan 26 08:36:47 compute-1 nova_compute[182165]:   </features>
Jan 26 08:36:47 compute-1 nova_compute[182165]: </domainCapabilities>
Jan 26 08:36:47 compute-1 nova_compute[182165]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 26 08:36:47 compute-1 nova_compute[182165]: 2026-01-26 08:36:47.676 182169 DEBUG nova.virt.libvirt.host [None req-257acea2-b228-4e06-a436-2e23740637a8 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 26 08:36:47 compute-1 nova_compute[182165]: 2026-01-26 08:36:47.676 182169 DEBUG nova.virt.libvirt.host [None req-257acea2-b228-4e06-a436-2e23740637a8 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 26 08:36:47 compute-1 nova_compute[182165]: 2026-01-26 08:36:47.677 182169 DEBUG nova.virt.libvirt.host [None req-257acea2-b228-4e06-a436-2e23740637a8 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 26 08:36:47 compute-1 nova_compute[182165]: 2026-01-26 08:36:47.680 182169 INFO nova.virt.libvirt.host [None req-257acea2-b228-4e06-a436-2e23740637a8 - - - - - -] Secure Boot support detected
Jan 26 08:36:47 compute-1 nova_compute[182165]: 2026-01-26 08:36:47.682 182169 INFO nova.virt.libvirt.driver [None req-257acea2-b228-4e06-a436-2e23740637a8 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 26 08:36:47 compute-1 nova_compute[182165]: 2026-01-26 08:36:47.683 182169 INFO nova.virt.libvirt.driver [None req-257acea2-b228-4e06-a436-2e23740637a8 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 26 08:36:47 compute-1 nova_compute[182165]: 2026-01-26 08:36:47.698 182169 DEBUG nova.virt.libvirt.driver [None req-257acea2-b228-4e06-a436-2e23740637a8 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Jan 26 08:36:47 compute-1 nova_compute[182165]: 2026-01-26 08:36:47.786 182169 INFO nova.virt.node [None req-257acea2-b228-4e06-a436-2e23740637a8 - - - - - -] Determined node identity 5203935e-446c-4e03-93fa-4c60d651e045 from /var/lib/nova/compute_id
Jan 26 08:36:47 compute-1 nova_compute[182165]: 2026-01-26 08:36:47.810 182169 WARNING nova.compute.manager [None req-257acea2-b228-4e06-a436-2e23740637a8 - - - - - -] Compute nodes ['5203935e-446c-4e03-93fa-4c60d651e045'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Jan 26 08:36:47 compute-1 python3.9[183023]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 08:36:47 compute-1 nova_compute[182165]: 2026-01-26 08:36:47.877 182169 INFO nova.compute.manager [None req-257acea2-b228-4e06-a436-2e23740637a8 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Jan 26 08:36:47 compute-1 systemd[1]: Stopping nova_compute container...
Jan 26 08:36:47 compute-1 nova_compute[182165]: 2026-01-26 08:36:47.968 182169 INFO oslo_messaging._drivers.amqpdriver [-] No calling threads waiting for msg_id : 8ad2258415e24299bcb8fd701543a39a
Jan 26 08:36:47 compute-1 nova_compute[182165]: 2026-01-26 08:36:47.969 182169 DEBUG oslo_concurrency.lockutils [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:36:47 compute-1 nova_compute[182165]: 2026-01-26 08:36:47.969 182169 DEBUG oslo_concurrency.lockutils [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:36:47 compute-1 nova_compute[182165]: 2026-01-26 08:36:47.969 182169 DEBUG oslo_concurrency.lockutils [None req-c90ab83f-a6a4-4bb3-8710-049cf352d3ba - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:36:48 compute-1 virtqemud[182752]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Jan 26 08:36:48 compute-1 virtqemud[182752]: hostname: compute-1
Jan 26 08:36:48 compute-1 virtqemud[182752]: End of file while reading data: Input/output error
Jan 26 08:36:48 compute-1 systemd[1]: libpod-8f96135535b280f182dc1f45e2c8ecacc1c65e2eb6ca70622f64f1c31d5768dd.scope: Deactivated successfully.
Jan 26 08:36:48 compute-1 systemd[1]: libpod-8f96135535b280f182dc1f45e2c8ecacc1c65e2eb6ca70622f64f1c31d5768dd.scope: Consumed 3.037s CPU time.
Jan 26 08:36:48 compute-1 podman[183027]: 2026-01-26 08:36:48.33473916 +0000 UTC m=+0.406970147 container died 8f96135535b280f182dc1f45e2c8ecacc1c65e2eb6ca70622f64f1c31d5768dd (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251202, container_name=nova_compute, org.label-schema.schema-version=1.0)
Jan 26 08:36:48 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8f96135535b280f182dc1f45e2c8ecacc1c65e2eb6ca70622f64f1c31d5768dd-userdata-shm.mount: Deactivated successfully.
Jan 26 08:36:48 compute-1 systemd[1]: var-lib-containers-storage-overlay-853db5c305a65b2c72404f25a0ad80e9e27f1f1e3499bb8f9f0d18250b4276f1-merged.mount: Deactivated successfully.
Jan 26 08:36:48 compute-1 podman[183027]: 2026-01-26 08:36:48.44951189 +0000 UTC m=+0.521742897 container cleanup 8f96135535b280f182dc1f45e2c8ecacc1c65e2eb6ca70622f64f1c31d5768dd (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 08:36:48 compute-1 podman[183027]: nova_compute
Jan 26 08:36:48 compute-1 podman[183054]: nova_compute
Jan 26 08:36:48 compute-1 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Jan 26 08:36:48 compute-1 systemd[1]: Stopped nova_compute container.
Jan 26 08:36:48 compute-1 systemd[1]: Starting nova_compute container...
Jan 26 08:36:48 compute-1 systemd[1]: Started libcrun container.
Jan 26 08:36:48 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/853db5c305a65b2c72404f25a0ad80e9e27f1f1e3499bb8f9f0d18250b4276f1/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 26 08:36:48 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/853db5c305a65b2c72404f25a0ad80e9e27f1f1e3499bb8f9f0d18250b4276f1/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 26 08:36:48 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/853db5c305a65b2c72404f25a0ad80e9e27f1f1e3499bb8f9f0d18250b4276f1/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 26 08:36:48 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/853db5c305a65b2c72404f25a0ad80e9e27f1f1e3499bb8f9f0d18250b4276f1/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 26 08:36:48 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/853db5c305a65b2c72404f25a0ad80e9e27f1f1e3499bb8f9f0d18250b4276f1/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 26 08:36:48 compute-1 podman[183067]: 2026-01-26 08:36:48.678474294 +0000 UTC m=+0.135256102 container init 8f96135535b280f182dc1f45e2c8ecacc1c65e2eb6ca70622f64f1c31d5768dd (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 26 08:36:48 compute-1 podman[183067]: 2026-01-26 08:36:48.684958139 +0000 UTC m=+0.141739927 container start 8f96135535b280f182dc1f45e2c8ecacc1c65e2eb6ca70622f64f1c31d5768dd (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 08:36:48 compute-1 podman[183067]: nova_compute
Jan 26 08:36:48 compute-1 nova_compute[183083]: + sudo -E kolla_set_configs
Jan 26 08:36:48 compute-1 systemd[1]: Started nova_compute container.
Jan 26 08:36:48 compute-1 sudo[183021]: pam_unix(sudo:session): session closed for user root
Jan 26 08:36:48 compute-1 nova_compute[183083]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 26 08:36:48 compute-1 nova_compute[183083]: INFO:__main__:Validating config file
Jan 26 08:36:48 compute-1 nova_compute[183083]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 26 08:36:48 compute-1 nova_compute[183083]: INFO:__main__:Copying service configuration files
Jan 26 08:36:48 compute-1 nova_compute[183083]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 26 08:36:48 compute-1 nova_compute[183083]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 26 08:36:48 compute-1 nova_compute[183083]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 26 08:36:48 compute-1 nova_compute[183083]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Jan 26 08:36:48 compute-1 nova_compute[183083]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 26 08:36:48 compute-1 nova_compute[183083]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 26 08:36:48 compute-1 nova_compute[183083]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 26 08:36:48 compute-1 nova_compute[183083]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 26 08:36:48 compute-1 nova_compute[183083]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 26 08:36:48 compute-1 nova_compute[183083]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Jan 26 08:36:48 compute-1 nova_compute[183083]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 26 08:36:48 compute-1 nova_compute[183083]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 26 08:36:48 compute-1 nova_compute[183083]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 26 08:36:48 compute-1 nova_compute[183083]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 26 08:36:48 compute-1 nova_compute[183083]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 26 08:36:48 compute-1 nova_compute[183083]: INFO:__main__:Deleting /etc/ceph
Jan 26 08:36:48 compute-1 nova_compute[183083]: INFO:__main__:Creating directory /etc/ceph
Jan 26 08:36:48 compute-1 nova_compute[183083]: INFO:__main__:Setting permission for /etc/ceph
Jan 26 08:36:48 compute-1 nova_compute[183083]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Jan 26 08:36:48 compute-1 nova_compute[183083]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 26 08:36:48 compute-1 nova_compute[183083]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 26 08:36:48 compute-1 nova_compute[183083]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Jan 26 08:36:48 compute-1 nova_compute[183083]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 26 08:36:48 compute-1 nova_compute[183083]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 26 08:36:48 compute-1 nova_compute[183083]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 26 08:36:48 compute-1 nova_compute[183083]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 26 08:36:48 compute-1 nova_compute[183083]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 26 08:36:48 compute-1 nova_compute[183083]: INFO:__main__:Writing out command to execute
Jan 26 08:36:48 compute-1 nova_compute[183083]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 26 08:36:48 compute-1 nova_compute[183083]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 26 08:36:48 compute-1 nova_compute[183083]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 26 08:36:48 compute-1 nova_compute[183083]: ++ cat /run_command
Jan 26 08:36:48 compute-1 nova_compute[183083]: + CMD=nova-compute
Jan 26 08:36:48 compute-1 nova_compute[183083]: + ARGS=
Jan 26 08:36:48 compute-1 nova_compute[183083]: + sudo kolla_copy_cacerts
Jan 26 08:36:48 compute-1 nova_compute[183083]: + [[ ! -n '' ]]
Jan 26 08:36:48 compute-1 nova_compute[183083]: + . kolla_extend_start
Jan 26 08:36:48 compute-1 nova_compute[183083]: + echo 'Running command: '\''nova-compute'\'''
Jan 26 08:36:48 compute-1 nova_compute[183083]: Running command: 'nova-compute'
Jan 26 08:36:48 compute-1 nova_compute[183083]: + umask 0022
Jan 26 08:36:48 compute-1 nova_compute[183083]: + exec nova-compute
Jan 26 08:36:49 compute-1 sudo[183244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-koxilvliwchotivcyeymxvyirimamksd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416609.0495505-2443-15457161093724/AnsiballZ_podman_container.py'
Jan 26 08:36:49 compute-1 sudo[183244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:36:49 compute-1 python3.9[183246]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 26 08:36:49 compute-1 systemd[1]: Started libpod-conmon-51e970b76d0c7b5ad916b10ba5ea51ec0d6abd2c52bc170b9cf52ad0431fa110.scope.
Jan 26 08:36:49 compute-1 systemd[1]: Started libcrun container.
Jan 26 08:36:49 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/962007b6dc27f75a69cd5865c526a39a08d62e673b7df03ec66c90aa696457a8/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Jan 26 08:36:49 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/962007b6dc27f75a69cd5865c526a39a08d62e673b7df03ec66c90aa696457a8/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 26 08:36:49 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/962007b6dc27f75a69cd5865c526a39a08d62e673b7df03ec66c90aa696457a8/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Jan 26 08:36:49 compute-1 podman[183272]: 2026-01-26 08:36:49.981259146 +0000 UTC m=+0.212685187 container init 51e970b76d0c7b5ad916b10ba5ea51ec0d6abd2c52bc170b9cf52ad0431fa110 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 26 08:36:49 compute-1 podman[183272]: 2026-01-26 08:36:49.9899432 +0000 UTC m=+0.221369191 container start 51e970b76d0c7b5ad916b10ba5ea51ec0d6abd2c52bc170b9cf52ad0431fa110 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=nova_compute_init, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 26 08:36:50 compute-1 python3.9[183246]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Jan 26 08:36:50 compute-1 nova_compute_init[183294]: INFO:nova_statedir:Applying nova statedir ownership
Jan 26 08:36:50 compute-1 nova_compute_init[183294]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Jan 26 08:36:50 compute-1 nova_compute_init[183294]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Jan 26 08:36:50 compute-1 nova_compute_init[183294]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Jan 26 08:36:50 compute-1 nova_compute_init[183294]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Jan 26 08:36:50 compute-1 nova_compute_init[183294]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Jan 26 08:36:50 compute-1 nova_compute_init[183294]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Jan 26 08:36:50 compute-1 nova_compute_init[183294]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Jan 26 08:36:50 compute-1 nova_compute_init[183294]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Jan 26 08:36:50 compute-1 nova_compute_init[183294]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Jan 26 08:36:50 compute-1 nova_compute_init[183294]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Jan 26 08:36:50 compute-1 nova_compute_init[183294]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Jan 26 08:36:50 compute-1 nova_compute_init[183294]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Jan 26 08:36:50 compute-1 nova_compute_init[183294]: INFO:nova_statedir:Nova statedir ownership complete
Jan 26 08:36:50 compute-1 systemd[1]: libpod-51e970b76d0c7b5ad916b10ba5ea51ec0d6abd2c52bc170b9cf52ad0431fa110.scope: Deactivated successfully.
Jan 26 08:36:50 compute-1 podman[183295]: 2026-01-26 08:36:50.069464421 +0000 UTC m=+0.044935861 container died 51e970b76d0c7b5ad916b10ba5ea51ec0d6abd2c52bc170b9cf52ad0431fa110 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 26 08:36:50 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-51e970b76d0c7b5ad916b10ba5ea51ec0d6abd2c52bc170b9cf52ad0431fa110-userdata-shm.mount: Deactivated successfully.
Jan 26 08:36:50 compute-1 systemd[1]: var-lib-containers-storage-overlay-962007b6dc27f75a69cd5865c526a39a08d62e673b7df03ec66c90aa696457a8-merged.mount: Deactivated successfully.
Jan 26 08:36:50 compute-1 podman[183301]: 2026-01-26 08:36:50.3235139 +0000 UTC m=+0.261710397 container cleanup 51e970b76d0c7b5ad916b10ba5ea51ec0d6abd2c52bc170b9cf52ad0431fa110 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, io.buildah.version=1.41.3)
Jan 26 08:36:50 compute-1 systemd[1]: libpod-conmon-51e970b76d0c7b5ad916b10ba5ea51ec0d6abd2c52bc170b9cf52ad0431fa110.scope: Deactivated successfully.
Jan 26 08:36:50 compute-1 sudo[183244]: pam_unix(sudo:session): session closed for user root
Jan 26 08:36:50 compute-1 nova_compute[183083]: 2026-01-26 08:36:50.783 183087 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 26 08:36:50 compute-1 nova_compute[183083]: 2026-01-26 08:36:50.783 183087 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 26 08:36:50 compute-1 nova_compute[183083]: 2026-01-26 08:36:50.784 183087 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 26 08:36:50 compute-1 nova_compute[183083]: 2026-01-26 08:36:50.784 183087 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Jan 26 08:36:50 compute-1 nova_compute[183083]: 2026-01-26 08:36:50.914 183087 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:36:50 compute-1 nova_compute[183083]: 2026-01-26 08:36:50.940 183087 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:36:50 compute-1 nova_compute[183083]: 2026-01-26 08:36:50.941 183087 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 26 08:36:50 compute-1 sshd-session[159982]: Connection closed by 192.168.122.30 port 52888
Jan 26 08:36:50 compute-1 sshd-session[159979]: pam_unix(sshd:session): session closed for user zuul
Jan 26 08:36:50 compute-1 systemd[1]: session-25.scope: Deactivated successfully.
Jan 26 08:36:50 compute-1 systemd[1]: session-25.scope: Consumed 1min 54.941s CPU time.
Jan 26 08:36:50 compute-1 systemd-logind[788]: Session 25 logged out. Waiting for processes to exit.
Jan 26 08:36:50 compute-1 systemd-logind[788]: Removed session 25.
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.404 183087 INFO nova.virt.driver [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.540 183087 INFO nova.compute.provider_config [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.561 183087 DEBUG oslo_concurrency.lockutils [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.562 183087 DEBUG oslo_concurrency.lockutils [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.562 183087 DEBUG oslo_concurrency.lockutils [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.562 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.562 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.563 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.563 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.563 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.563 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.563 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.563 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.563 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.564 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.564 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.564 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.564 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.564 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.564 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.565 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.565 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.565 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.565 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.565 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.565 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.566 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.566 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.566 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.566 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.566 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.566 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.567 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.567 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.567 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.567 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.567 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.567 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.568 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.568 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.568 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.568 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.568 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.568 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.568 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.569 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.569 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.569 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.569 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.569 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.569 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.570 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.570 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.570 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.570 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.570 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.570 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.570 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.570 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.571 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.571 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.571 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.571 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.571 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.571 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.571 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.572 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.572 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.572 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.572 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.572 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.572 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.572 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.572 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.573 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.573 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.573 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.573 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.573 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.573 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.573 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.574 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.574 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.574 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.574 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.574 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.574 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.574 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.574 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.575 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.575 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.575 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.575 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.575 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.575 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.575 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.576 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.576 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.576 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.576 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.576 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.576 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.576 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.576 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.577 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.577 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.577 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.577 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.577 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.577 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.577 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.578 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.578 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.578 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.578 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.578 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.578 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.578 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.578 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.579 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.579 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.579 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.579 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.579 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.579 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.579 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.579 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.580 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.580 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.580 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.580 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.580 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.580 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.580 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.581 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.581 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.581 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.581 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.581 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.581 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.581 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.581 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.582 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.582 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.582 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.582 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.582 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.582 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.582 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.583 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.583 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.583 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.583 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.583 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.583 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.583 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.584 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.584 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.584 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.584 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.584 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.584 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.584 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.584 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.585 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.585 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.585 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.585 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.585 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.585 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.585 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.586 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.586 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.586 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.586 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.586 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.586 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.586 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.587 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.587 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.587 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.587 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.587 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.587 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.587 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.588 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.588 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.588 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.588 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.588 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.588 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.588 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.589 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.589 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.589 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.589 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.589 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.589 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.589 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.590 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.590 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.590 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.590 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.590 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.590 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.590 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.591 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.591 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.591 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.591 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.591 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.591 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.591 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.591 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.592 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.592 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.592 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.592 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.592 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.592 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.592 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.593 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.593 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.593 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.593 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.593 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.593 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.594 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.594 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.594 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.594 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.594 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.594 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.594 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.595 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.595 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.595 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.595 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.595 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.595 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.596 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.596 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.596 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.596 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.596 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.596 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.597 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.597 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.597 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.597 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.597 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.597 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.598 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.598 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.598 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.598 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.598 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.598 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.598 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.599 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.599 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.599 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.599 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.599 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.600 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.600 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.600 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.600 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.600 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.601 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.601 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.601 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.601 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.601 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.602 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.602 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.602 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.602 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.602 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.602 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.603 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.603 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.603 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.603 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.603 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.604 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.604 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.604 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.604 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.604 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.605 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.605 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.605 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.605 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.605 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.605 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.605 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.606 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.606 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.606 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.606 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.606 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.606 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.606 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.607 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.607 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.607 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.607 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.607 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.607 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.607 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.608 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.608 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.608 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.608 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.608 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.608 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.608 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.608 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.609 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.609 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.609 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.609 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.609 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.609 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.609 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.610 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.610 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.610 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.610 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.610 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.610 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.610 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.611 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.611 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.611 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.611 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.611 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.611 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.611 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.611 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.612 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.612 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.612 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.612 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.612 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.612 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.612 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.613 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.613 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.613 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.613 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.613 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.613 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.613 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.614 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.614 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.614 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.614 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.614 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.614 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.615 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.615 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.615 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.615 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.615 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.615 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.615 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.616 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.616 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.616 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.616 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.616 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.616 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.616 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.616 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.617 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.617 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.617 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.617 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.617 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.617 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.617 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.618 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.618 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.618 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.618 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.618 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.618 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.618 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.618 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.619 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.619 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.619 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.619 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.619 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.619 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.619 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.620 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.620 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.620 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.620 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.620 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.620 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.620 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.621 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.621 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.621 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.621 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.621 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.621 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.621 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.622 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.622 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.622 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.622 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.622 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.622 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.622 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.622 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.623 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.623 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.623 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.623 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.623 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.623 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.623 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.623 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.624 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.624 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.624 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.624 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.624 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.624 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.624 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.625 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.625 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.625 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.625 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.625 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.625 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.625 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.625 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.626 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.626 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.626 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.626 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.626 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.626 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.626 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.627 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.627 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.627 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.627 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.627 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.627 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.627 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.627 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.628 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.628 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.628 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.628 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.628 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.628 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.628 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.629 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.629 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.629 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.629 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.629 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.629 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.629 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.630 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.630 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.630 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.630 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.630 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.630 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.630 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.630 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.631 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.631 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.631 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.631 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.631 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.631 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.631 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.632 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.632 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.632 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.632 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.632 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.632 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.632 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.633 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.633 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.633 183087 WARNING oslo_config.cfg [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 26 08:36:51 compute-1 nova_compute[183083]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 26 08:36:51 compute-1 nova_compute[183083]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 26 08:36:51 compute-1 nova_compute[183083]: and ``live_migration_inbound_addr`` respectively.
Jan 26 08:36:51 compute-1 nova_compute[183083]: ).  Its value may be silently ignored in the future.
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.633 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.633 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.633 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.634 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.634 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.634 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.634 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.634 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.634 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.634 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.635 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.635 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.635 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.635 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.635 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.635 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.635 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.636 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.636 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.636 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.636 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.636 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.636 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.636 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.636 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.637 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.637 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.637 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.637 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.637 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.637 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.638 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.638 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.638 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.638 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.638 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.638 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.638 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.639 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.639 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.639 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.639 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.639 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.639 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.639 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.639 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.640 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.640 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.640 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.640 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.640 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.640 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.640 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.641 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.641 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.641 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.641 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.641 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.641 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.641 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.641 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.642 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.642 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.642 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.642 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.642 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.642 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.642 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.643 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.643 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.643 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.643 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.643 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.643 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.643 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.644 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.644 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.644 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.644 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.644 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.644 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.644 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.644 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.645 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.645 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.645 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.645 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.645 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.645 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.645 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.646 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.646 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.646 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.646 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.646 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.646 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.646 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.646 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.647 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.647 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.647 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.647 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.647 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.647 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.647 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.648 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.648 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.648 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.648 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.648 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.648 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.648 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.648 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.649 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.649 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.649 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.649 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.649 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.649 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.649 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.650 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.650 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.650 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.650 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.650 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.650 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.650 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.651 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.651 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.651 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.651 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.651 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.651 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.651 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.651 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.652 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.652 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.652 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.652 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.652 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.652 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.652 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.653 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.653 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.653 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.653 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.653 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.653 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.653 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.654 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.654 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.654 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.654 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.654 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.654 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.654 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.655 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.657 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.657 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.657 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.658 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.658 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.658 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.658 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.658 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.659 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.659 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.659 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.659 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.659 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.660 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.660 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.660 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.660 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.660 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.661 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.661 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.661 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.661 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.661 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.662 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.662 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.662 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.662 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.663 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.663 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.663 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.663 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.663 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.664 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.664 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.664 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.664 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.664 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.664 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.665 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.665 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.665 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.665 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.666 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.666 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.666 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.666 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.666 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.667 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.667 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.667 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.667 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.667 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.668 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.668 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.668 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.668 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.668 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.669 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.669 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.669 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.669 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.669 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.669 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.670 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.670 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.670 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.670 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.670 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.671 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.671 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.671 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.671 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.671 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.672 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.672 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.672 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.672 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.672 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.673 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.673 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.673 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.673 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.673 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.674 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.674 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.674 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.674 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.674 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.675 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.675 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.675 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.675 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.676 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.676 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.676 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.676 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.676 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.677 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.677 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.677 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.677 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.677 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.678 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.678 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.678 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.678 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.678 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.679 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.679 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.679 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.679 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.679 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.680 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.680 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.680 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.680 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.680 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.680 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.681 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.681 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.681 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.681 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.682 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.682 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.682 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.682 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.682 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.683 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.683 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.683 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.683 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.683 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.684 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.684 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.684 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.684 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.684 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.685 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.685 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.685 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.685 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.685 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.686 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.686 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.686 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.686 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.686 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.687 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.687 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.687 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.687 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.687 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.687 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.688 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.688 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.688 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.688 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.688 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.689 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.689 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.689 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.689 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.689 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.690 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.690 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.690 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.690 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.690 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.691 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.691 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.691 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.691 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.691 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.691 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.692 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.692 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.692 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.692 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.692 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.693 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.693 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.693 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.693 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.694 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.694 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.694 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.694 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.695 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.695 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.695 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.695 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.695 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.696 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.696 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.696 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.696 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.696 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.696 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.697 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.697 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.697 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.697 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.697 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.698 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.698 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.698 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.698 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.698 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.699 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.699 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.699 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.699 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.699 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.700 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.700 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.700 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.700 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.700 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.700 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.701 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.701 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.701 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.701 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.701 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.702 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.702 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.702 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.702 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.702 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.703 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.703 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.703 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.703 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.703 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.704 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.704 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.704 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.704 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.704 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.705 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.705 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.705 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.705 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.705 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.705 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.706 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.706 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.706 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.706 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.706 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.707 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.707 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.707 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.707 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.707 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.708 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.708 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.708 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.708 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.708 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.709 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.709 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.709 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.709 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.709 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.709 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.710 183087 DEBUG oslo_service.service [None req-0cf4d353-011f-437f-8516-404cbc2e791c - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.711 183087 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.815 183087 INFO nova.virt.node [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Determined node identity 5203935e-446c-4e03-93fa-4c60d651e045 from /var/lib/nova/compute_id
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.816 183087 DEBUG nova.virt.libvirt.host [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.817 183087 DEBUG nova.virt.libvirt.host [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.817 183087 DEBUG nova.virt.libvirt.host [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.817 183087 DEBUG nova.virt.libvirt.host [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.830 183087 DEBUG nova.virt.libvirt.host [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f6cc810ae50> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.832 183087 DEBUG nova.virt.libvirt.host [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f6cc810ae50> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.833 183087 INFO nova.virt.libvirt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Connection event '1' reason 'None'
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.839 183087 INFO nova.virt.libvirt.host [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Libvirt host capabilities <capabilities>
Jan 26 08:36:51 compute-1 nova_compute[183083]: 
Jan 26 08:36:51 compute-1 nova_compute[183083]:   <host>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <uuid>99f84307-5f5c-4de6-9e22-fda82fab04a3</uuid>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <cpu>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <arch>x86_64</arch>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model>EPYC-Rome-v4</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <vendor>AMD</vendor>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <microcode version='16777317'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <signature family='23' model='49' stepping='0'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <maxphysaddr mode='emulate' bits='40'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature name='x2apic'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature name='tsc-deadline'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature name='osxsave'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature name='hypervisor'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature name='tsc_adjust'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature name='spec-ctrl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature name='stibp'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature name='arch-capabilities'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature name='ssbd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature name='cmp_legacy'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature name='topoext'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature name='virt-ssbd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature name='lbrv'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature name='tsc-scale'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature name='vmcb-clean'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature name='pause-filter'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature name='pfthreshold'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature name='svme-addr-chk'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature name='rdctl-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature name='skip-l1dfl-vmentry'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature name='mds-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature name='pschange-mc-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <pages unit='KiB' size='4'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <pages unit='KiB' size='2048'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <pages unit='KiB' size='1048576'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </cpu>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <power_management>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <suspend_mem/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <suspend_disk/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <suspend_hybrid/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </power_management>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <iommu support='no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <migration_features>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <live/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <uri_transports>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <uri_transport>tcp</uri_transport>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <uri_transport>rdma</uri_transport>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </uri_transports>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </migration_features>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <topology>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <cells num='1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <cell id='0'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:           <memory unit='KiB'>16109544</memory>
Jan 26 08:36:51 compute-1 nova_compute[183083]:           <pages unit='KiB' size='4'>4027386</pages>
Jan 26 08:36:51 compute-1 nova_compute[183083]:           <pages unit='KiB' size='2048'>0</pages>
Jan 26 08:36:51 compute-1 nova_compute[183083]:           <pages unit='KiB' size='1048576'>0</pages>
Jan 26 08:36:51 compute-1 nova_compute[183083]:           <distances>
Jan 26 08:36:51 compute-1 nova_compute[183083]:             <sibling id='0' value='10'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:           </distances>
Jan 26 08:36:51 compute-1 nova_compute[183083]:           <cpus num='8'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:           </cpus>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         </cell>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </cells>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </topology>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <cache>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </cache>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <secmodel>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model>selinux</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <doi>0</doi>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </secmodel>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <secmodel>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model>dac</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <doi>0</doi>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <baselabel type='kvm'>+107:+107</baselabel>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <baselabel type='qemu'>+107:+107</baselabel>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </secmodel>
Jan 26 08:36:51 compute-1 nova_compute[183083]:   </host>
Jan 26 08:36:51 compute-1 nova_compute[183083]: 
Jan 26 08:36:51 compute-1 nova_compute[183083]:   <guest>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <os_type>hvm</os_type>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <arch name='i686'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <wordsize>32</wordsize>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <domain type='qemu'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <domain type='kvm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </arch>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <features>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <pae/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <nonpae/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <acpi default='on' toggle='yes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <apic default='on' toggle='no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <cpuselection/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <deviceboot/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <disksnapshot default='on' toggle='no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <externalSnapshot/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </features>
Jan 26 08:36:51 compute-1 nova_compute[183083]:   </guest>
Jan 26 08:36:51 compute-1 nova_compute[183083]: 
Jan 26 08:36:51 compute-1 nova_compute[183083]:   <guest>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <os_type>hvm</os_type>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <arch name='x86_64'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <wordsize>64</wordsize>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <domain type='qemu'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <domain type='kvm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </arch>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <features>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <acpi default='on' toggle='yes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <apic default='on' toggle='no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <cpuselection/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <deviceboot/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <disksnapshot default='on' toggle='no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <externalSnapshot/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </features>
Jan 26 08:36:51 compute-1 nova_compute[183083]:   </guest>
Jan 26 08:36:51 compute-1 nova_compute[183083]: 
Jan 26 08:36:51 compute-1 nova_compute[183083]: </capabilities>
Jan 26 08:36:51 compute-1 nova_compute[183083]: 
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.846 183087 DEBUG nova.virt.libvirt.host [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.850 183087 DEBUG nova.virt.libvirt.host [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 26 08:36:51 compute-1 nova_compute[183083]: <domainCapabilities>
Jan 26 08:36:51 compute-1 nova_compute[183083]:   <path>/usr/libexec/qemu-kvm</path>
Jan 26 08:36:51 compute-1 nova_compute[183083]:   <domain>kvm</domain>
Jan 26 08:36:51 compute-1 nova_compute[183083]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 26 08:36:51 compute-1 nova_compute[183083]:   <arch>i686</arch>
Jan 26 08:36:51 compute-1 nova_compute[183083]:   <vcpu max='4096'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:   <iothreads supported='yes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:   <os supported='yes'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <enum name='firmware'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <loader supported='yes'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='type'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>rom</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>pflash</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='readonly'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>yes</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>no</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='secure'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>no</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </loader>
Jan 26 08:36:51 compute-1 nova_compute[183083]:   </os>
Jan 26 08:36:51 compute-1 nova_compute[183083]:   <cpu>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <mode name='host-passthrough' supported='yes'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='hostPassthroughMigratable'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>on</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>off</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </mode>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <mode name='maximum' supported='yes'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='maximumMigratable'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>on</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>off</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </mode>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <mode name='host-model' supported='yes'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <vendor>AMD</vendor>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='x2apic'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='tsc-deadline'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='hypervisor'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='tsc_adjust'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='spec-ctrl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='stibp'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='ssbd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='cmp_legacy'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='overflow-recov'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='succor'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='ibrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='amd-ssbd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='virt-ssbd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='lbrv'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='tsc-scale'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='vmcb-clean'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='flushbyasid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='pause-filter'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='pfthreshold'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='svme-addr-chk'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='disable' name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </mode>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <mode name='custom' supported='yes'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Broadwell'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Broadwell-IBRS'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Broadwell-noTSX'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Broadwell-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Broadwell-v2'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Broadwell-v3'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Broadwell-v4'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Cascadelake-Server'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Cascadelake-Server-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Cascadelake-Server-v2'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Cascadelake-Server-v3'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Cascadelake-Server-v4'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Cascadelake-Server-v5'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='ClearwaterForest'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni-int16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bhi-ctrl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bhi-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cmpccxadd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ddpd-u'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='intel-psfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ipred-ctrl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='lam'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='prefetchiti'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rrsba-ctrl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sha512'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sm3'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sm4'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='ClearwaterForest-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni-int16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bhi-ctrl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bhi-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cmpccxadd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ddpd-u'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='intel-psfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ipred-ctrl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='lam'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='prefetchiti'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rrsba-ctrl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sha512'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sm3'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sm4'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Cooperlake'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Cooperlake-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Cooperlake-v2'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Denverton'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='mpx'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Denverton-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='mpx'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Denverton-v2'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Denverton-v3'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Dhyana-v2'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='EPYC-Genoa'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amd-psfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='auto-ibrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='stibp-always-on'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='EPYC-Genoa-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amd-psfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='auto-ibrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='stibp-always-on'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='EPYC-Genoa-v2'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amd-psfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='auto-ibrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fs-gs-base-ns'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='perfmon-v2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='stibp-always-on'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='EPYC-Milan'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='EPYC-Milan-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='EPYC-Milan-v2'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amd-psfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='stibp-always-on'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='EPYC-Milan-v3'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amd-psfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='stibp-always-on'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='EPYC-Rome'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='EPYC-Rome-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='EPYC-Rome-v2'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='EPYC-Rome-v3'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='EPYC-Turin'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amd-psfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='auto-ibrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vp2intersect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fs-gs-base-ns'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibpb-brtype'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='perfmon-v2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='prefetchi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sbpb'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='srso-user-kernel-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='stibp-always-on'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='EPYC-Turin-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amd-psfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='auto-ibrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vp2intersect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fs-gs-base-ns'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibpb-brtype'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='perfmon-v2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='prefetchi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sbpb'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='srso-user-kernel-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='stibp-always-on'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='EPYC-v3'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='EPYC-v4'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='EPYC-v5'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='GraniteRapids'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-fp16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='prefetchiti'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='GraniteRapids-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-fp16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='prefetchiti'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='GraniteRapids-v2'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-fp16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx10'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx10-128'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx10-256'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx10-512'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='prefetchiti'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='GraniteRapids-v3'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-fp16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx10'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx10-128'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx10-256'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx10-512'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='prefetchiti'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Haswell'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Haswell-IBRS'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Haswell-noTSX'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Haswell-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Haswell-v2'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Haswell-v3'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Haswell-v4'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server-noTSX'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server-v2'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server-v3'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server-v4'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server-v5'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server-v6'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server-v7'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='IvyBridge'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='IvyBridge-IBRS'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='IvyBridge-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='IvyBridge-v2'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='KnightsMill'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-4fmaps'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-4vnniw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512er'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512pf'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='KnightsMill-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-4fmaps'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-4vnniw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512er'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512pf'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Opteron_G4'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fma4'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xop'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Opteron_G4-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fma4'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xop'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Opteron_G5'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fma4'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='tbm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xop'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Opteron_G5-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fma4'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='tbm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xop'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='SapphireRapids'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='SapphireRapids-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='SapphireRapids-v2'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='SapphireRapids-v3'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='SapphireRapids-v4'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='SierraForest'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cmpccxadd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='SierraForest-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cmpccxadd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='SierraForest-v2'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bhi-ctrl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cmpccxadd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='intel-psfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ipred-ctrl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='lam'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rrsba-ctrl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='SierraForest-v3'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bhi-ctrl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cmpccxadd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='intel-psfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ipred-ctrl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='lam'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rrsba-ctrl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Skylake-Client'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Skylake-Client-IBRS'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Skylake-Client-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Skylake-Client-v2'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Skylake-Client-v3'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Skylake-Client-v4'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Skylake-Server'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Skylake-Server-IBRS'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Skylake-Server-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Skylake-Server-v2'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Skylake-Server-v3'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Skylake-Server-v4'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Skylake-Server-v5'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Snowridge'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='core-capability'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='mpx'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='split-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Snowridge-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='core-capability'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='mpx'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='split-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Snowridge-v2'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='core-capability'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='split-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Snowridge-v3'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='core-capability'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='split-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Snowridge-v4'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='athlon'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='3dnow'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='3dnowext'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='athlon-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='3dnow'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='3dnowext'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='core2duo'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='core2duo-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='coreduo'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='coreduo-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='n270'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='n270-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='phenom'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='3dnow'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='3dnowext'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='phenom-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='3dnow'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='3dnowext'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </mode>
Jan 26 08:36:51 compute-1 nova_compute[183083]:   </cpu>
Jan 26 08:36:51 compute-1 nova_compute[183083]:   <memoryBacking supported='yes'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <enum name='sourceType'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <value>file</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <value>anonymous</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <value>memfd</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </enum>
Jan 26 08:36:51 compute-1 nova_compute[183083]:   </memoryBacking>
Jan 26 08:36:51 compute-1 nova_compute[183083]:   <devices>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <disk supported='yes'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='diskDevice'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>disk</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>cdrom</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>floppy</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>lun</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='bus'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>fdc</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>scsi</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>virtio</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>usb</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>sata</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='model'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>virtio</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>virtio-transitional</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>virtio-non-transitional</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </disk>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <graphics supported='yes'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='type'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>vnc</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>egl-headless</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>dbus</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </graphics>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <video supported='yes'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='modelType'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>vga</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>cirrus</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>virtio</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>none</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>bochs</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>ramfb</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </video>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <hostdev supported='yes'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='mode'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>subsystem</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='startupPolicy'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>default</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>mandatory</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>requisite</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>optional</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='subsysType'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>usb</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>pci</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>scsi</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='capsType'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='pciBackend'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </hostdev>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <rng supported='yes'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='model'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>virtio</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>virtio-transitional</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>virtio-non-transitional</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='backendModel'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>random</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>egd</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>builtin</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </rng>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <filesystem supported='yes'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='driverType'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>path</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>handle</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>virtiofs</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </filesystem>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <tpm supported='yes'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='model'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>tpm-tis</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>tpm-crb</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='backendModel'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>emulator</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>external</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='backendVersion'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>2.0</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </tpm>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <redirdev supported='yes'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='bus'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>usb</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </redirdev>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <channel supported='yes'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='type'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>pty</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>unix</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </channel>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <crypto supported='yes'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='model'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='type'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>qemu</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='backendModel'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>builtin</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </crypto>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <interface supported='yes'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='backendType'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>default</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>passt</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </interface>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <panic supported='yes'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='model'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>isa</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>hyperv</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </panic>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <console supported='yes'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='type'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>null</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>vc</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>pty</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>dev</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>file</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>pipe</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>stdio</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>udp</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>tcp</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>unix</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>qemu-vdagent</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>dbus</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </console>
Jan 26 08:36:51 compute-1 nova_compute[183083]:   </devices>
Jan 26 08:36:51 compute-1 nova_compute[183083]:   <features>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <gic supported='no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <vmcoreinfo supported='yes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <genid supported='yes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <backingStoreInput supported='yes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <backup supported='yes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <async-teardown supported='yes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <s390-pv supported='no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <ps2 supported='yes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <tdx supported='no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <sev supported='no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <sgx supported='no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <hyperv supported='yes'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='features'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>relaxed</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>vapic</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>spinlocks</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>vpindex</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>runtime</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>synic</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>stimer</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>reset</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>vendor_id</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>frequencies</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>reenlightenment</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>tlbflush</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>ipi</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>avic</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>emsr_bitmap</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>xmm_input</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <defaults>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <spinlocks>4095</spinlocks>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <stimer_direct>on</stimer_direct>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <tlbflush_direct>on</tlbflush_direct>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <tlbflush_extended>on</tlbflush_extended>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </defaults>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </hyperv>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <launchSecurity supported='no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:   </features>
Jan 26 08:36:51 compute-1 nova_compute[183083]: </domainCapabilities>
Jan 26 08:36:51 compute-1 nova_compute[183083]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 26 08:36:51 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.856 183087 DEBUG nova.virt.libvirt.host [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 26 08:36:51 compute-1 nova_compute[183083]: <domainCapabilities>
Jan 26 08:36:51 compute-1 nova_compute[183083]:   <path>/usr/libexec/qemu-kvm</path>
Jan 26 08:36:51 compute-1 nova_compute[183083]:   <domain>kvm</domain>
Jan 26 08:36:51 compute-1 nova_compute[183083]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 26 08:36:51 compute-1 nova_compute[183083]:   <arch>i686</arch>
Jan 26 08:36:51 compute-1 nova_compute[183083]:   <vcpu max='240'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:   <iothreads supported='yes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:   <os supported='yes'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <enum name='firmware'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <loader supported='yes'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='type'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>rom</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>pflash</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='readonly'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>yes</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>no</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='secure'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>no</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </loader>
Jan 26 08:36:51 compute-1 nova_compute[183083]:   </os>
Jan 26 08:36:51 compute-1 nova_compute[183083]:   <cpu>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <mode name='host-passthrough' supported='yes'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='hostPassthroughMigratable'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>on</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>off</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </mode>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <mode name='maximum' supported='yes'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='maximumMigratable'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>on</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>off</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </mode>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <mode name='host-model' supported='yes'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <vendor>AMD</vendor>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='x2apic'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='tsc-deadline'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='hypervisor'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='tsc_adjust'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='spec-ctrl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='stibp'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='ssbd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='cmp_legacy'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='overflow-recov'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='succor'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='ibrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='amd-ssbd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='virt-ssbd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='lbrv'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='tsc-scale'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='vmcb-clean'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='flushbyasid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='pause-filter'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='pfthreshold'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='svme-addr-chk'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <feature policy='disable' name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </mode>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <mode name='custom' supported='yes'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Broadwell'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Broadwell-IBRS'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Broadwell-noTSX'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Broadwell-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Broadwell-v2'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Broadwell-v3'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Broadwell-v4'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Cascadelake-Server'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Cascadelake-Server-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Cascadelake-Server-v2'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Cascadelake-Server-v3'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Cascadelake-Server-v4'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Cascadelake-Server-v5'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='ClearwaterForest'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni-int16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bhi-ctrl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bhi-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cmpccxadd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ddpd-u'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='intel-psfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ipred-ctrl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='lam'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='prefetchiti'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rrsba-ctrl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sha512'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sm3'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sm4'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='ClearwaterForest-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni-int16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bhi-ctrl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bhi-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cmpccxadd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ddpd-u'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='intel-psfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ipred-ctrl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='lam'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='prefetchiti'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rrsba-ctrl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sha512'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sm3'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sm4'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Cooperlake'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Cooperlake-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Cooperlake-v2'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Denverton'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='mpx'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Denverton-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='mpx'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Denverton-v2'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Denverton-v3'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Dhyana-v2'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='EPYC-Genoa'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amd-psfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='auto-ibrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='stibp-always-on'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='EPYC-Genoa-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amd-psfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='auto-ibrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='stibp-always-on'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='EPYC-Genoa-v2'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amd-psfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='auto-ibrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fs-gs-base-ns'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='perfmon-v2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='stibp-always-on'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='EPYC-Milan'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='EPYC-Milan-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='EPYC-Milan-v2'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amd-psfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='stibp-always-on'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='EPYC-Milan-v3'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amd-psfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='stibp-always-on'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='EPYC-Rome'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='EPYC-Rome-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='EPYC-Rome-v2'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='EPYC-Rome-v3'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='EPYC-Turin'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amd-psfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='auto-ibrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vp2intersect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fs-gs-base-ns'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibpb-brtype'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='perfmon-v2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='prefetchi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sbpb'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='srso-user-kernel-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='stibp-always-on'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='EPYC-Turin-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amd-psfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='auto-ibrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vp2intersect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fs-gs-base-ns'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibpb-brtype'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='perfmon-v2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='prefetchi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sbpb'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='srso-user-kernel-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='stibp-always-on'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='EPYC-v3'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='EPYC-v4'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='EPYC-v5'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='GraniteRapids'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-fp16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='prefetchiti'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='GraniteRapids-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-fp16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='prefetchiti'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='GraniteRapids-v2'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-fp16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx10'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx10-128'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx10-256'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx10-512'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='prefetchiti'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='GraniteRapids-v3'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-fp16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx10'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx10-128'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx10-256'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx10-512'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='prefetchiti'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Haswell'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Haswell-IBRS'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Haswell-noTSX'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Haswell-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Haswell-v2'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Haswell-v3'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Haswell-v4'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server-noTSX'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server-v2'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server-v3'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server-v4'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server-v5'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server-v6'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server-v7'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='IvyBridge'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='IvyBridge-IBRS'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='IvyBridge-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='IvyBridge-v2'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='KnightsMill'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-4fmaps'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-4vnniw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512er'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512pf'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='KnightsMill-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-4fmaps'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-4vnniw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512er'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512pf'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Opteron_G4'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fma4'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xop'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Opteron_G4-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fma4'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xop'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Opteron_G5'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fma4'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='tbm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xop'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Opteron_G5-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fma4'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='tbm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xop'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='SapphireRapids'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='SapphireRapids-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='SapphireRapids-v2'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='SapphireRapids-v3'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='SapphireRapids-v4'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='SierraForest'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cmpccxadd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='SierraForest-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cmpccxadd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='SierraForest-v2'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bhi-ctrl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cmpccxadd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='intel-psfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ipred-ctrl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='lam'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rrsba-ctrl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='SierraForest-v3'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-ifma'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bhi-ctrl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cmpccxadd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='intel-psfd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ipred-ctrl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='lam'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rrsba-ctrl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Skylake-Client'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Skylake-Client-IBRS'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Skylake-Client-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Skylake-Client-v2'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Skylake-Client-v3'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Skylake-Client-v4'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Skylake-Server'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Skylake-Server-IBRS'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Skylake-Server-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Skylake-Server-v2'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Skylake-Server-v3'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Skylake-Server-v4'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Skylake-Server-v5'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Snowridge'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='core-capability'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='mpx'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='split-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Snowridge-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='core-capability'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='mpx'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='split-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Snowridge-v2'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='core-capability'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='split-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Snowridge-v3'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='core-capability'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='split-lock-detect'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='Snowridge-v4'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='athlon'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='3dnow'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='3dnowext'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='athlon-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='3dnow'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='3dnowext'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='core2duo'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='core2duo-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='coreduo'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='coreduo-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='n270'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='n270-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='phenom'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='3dnow'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='3dnowext'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <blockers model='phenom-v1'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='3dnow'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <feature name='3dnowext'/>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </mode>
Jan 26 08:36:51 compute-1 nova_compute[183083]:   </cpu>
Jan 26 08:36:51 compute-1 nova_compute[183083]:   <memoryBacking supported='yes'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <enum name='sourceType'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <value>file</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <value>anonymous</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <value>memfd</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     </enum>
Jan 26 08:36:51 compute-1 nova_compute[183083]:   </memoryBacking>
Jan 26 08:36:51 compute-1 nova_compute[183083]:   <devices>
Jan 26 08:36:51 compute-1 nova_compute[183083]:     <disk supported='yes'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:       <enum name='diskDevice'>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>disk</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>cdrom</value>
Jan 26 08:36:51 compute-1 nova_compute[183083]:         <value>floppy</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>lun</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='bus'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>ide</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>fdc</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>scsi</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>virtio</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>usb</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>sata</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='model'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>virtio</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>virtio-transitional</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>virtio-non-transitional</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </disk>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <graphics supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='type'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>vnc</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>egl-headless</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>dbus</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </graphics>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <video supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='modelType'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>vga</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>cirrus</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>virtio</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>none</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>bochs</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>ramfb</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </video>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <hostdev supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='mode'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>subsystem</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='startupPolicy'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>default</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>mandatory</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>requisite</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>optional</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='subsysType'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>usb</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>pci</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>scsi</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='capsType'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='pciBackend'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </hostdev>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <rng supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='model'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>virtio</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>virtio-transitional</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>virtio-non-transitional</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='backendModel'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>random</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>egd</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>builtin</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </rng>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <filesystem supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='driverType'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>path</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>handle</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>virtiofs</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </filesystem>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <tpm supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='model'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>tpm-tis</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>tpm-crb</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='backendModel'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>emulator</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>external</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='backendVersion'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>2.0</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </tpm>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <redirdev supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='bus'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>usb</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </redirdev>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <channel supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='type'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>pty</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>unix</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </channel>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <crypto supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='model'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='type'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>qemu</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='backendModel'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>builtin</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </crypto>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <interface supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='backendType'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>default</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>passt</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </interface>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <panic supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='model'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>isa</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>hyperv</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </panic>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <console supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='type'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>null</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>vc</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>pty</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>dev</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>file</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>pipe</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>stdio</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>udp</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>tcp</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>unix</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>qemu-vdagent</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>dbus</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </console>
Jan 26 08:36:52 compute-1 nova_compute[183083]:   </devices>
Jan 26 08:36:52 compute-1 nova_compute[183083]:   <features>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <gic supported='no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <vmcoreinfo supported='yes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <genid supported='yes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <backingStoreInput supported='yes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <backup supported='yes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <async-teardown supported='yes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <s390-pv supported='no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <ps2 supported='yes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <tdx supported='no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <sev supported='no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <sgx supported='no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <hyperv supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='features'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>relaxed</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>vapic</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>spinlocks</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>vpindex</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>runtime</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>synic</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>stimer</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>reset</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>vendor_id</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>frequencies</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>reenlightenment</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>tlbflush</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>ipi</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>avic</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>emsr_bitmap</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>xmm_input</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <defaults>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <spinlocks>4095</spinlocks>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <stimer_direct>on</stimer_direct>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <tlbflush_direct>on</tlbflush_direct>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <tlbflush_extended>on</tlbflush_extended>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </defaults>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </hyperv>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <launchSecurity supported='no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:   </features>
Jan 26 08:36:52 compute-1 nova_compute[183083]: </domainCapabilities>
Jan 26 08:36:52 compute-1 nova_compute[183083]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 26 08:36:52 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.935 183087 DEBUG nova.virt.libvirt.host [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 26 08:36:52 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.937 183087 DEBUG nova.virt.libvirt.volume.mount [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 26 08:36:52 compute-1 nova_compute[183083]: 2026-01-26 08:36:51.940 183087 DEBUG nova.virt.libvirt.host [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 26 08:36:52 compute-1 nova_compute[183083]: <domainCapabilities>
Jan 26 08:36:52 compute-1 nova_compute[183083]:   <path>/usr/libexec/qemu-kvm</path>
Jan 26 08:36:52 compute-1 nova_compute[183083]:   <domain>kvm</domain>
Jan 26 08:36:52 compute-1 nova_compute[183083]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 26 08:36:52 compute-1 nova_compute[183083]:   <arch>x86_64</arch>
Jan 26 08:36:52 compute-1 nova_compute[183083]:   <vcpu max='4096'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:   <iothreads supported='yes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:   <os supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <enum name='firmware'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <value>efi</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <loader supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='type'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>rom</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>pflash</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='readonly'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>yes</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>no</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='secure'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>yes</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>no</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </loader>
Jan 26 08:36:52 compute-1 nova_compute[183083]:   </os>
Jan 26 08:36:52 compute-1 nova_compute[183083]:   <cpu>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <mode name='host-passthrough' supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='hostPassthroughMigratable'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>on</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>off</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </mode>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <mode name='maximum' supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='maximumMigratable'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>on</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>off</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </mode>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <mode name='host-model' supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <vendor>AMD</vendor>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='x2apic'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='tsc-deadline'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='hypervisor'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='tsc_adjust'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='spec-ctrl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='stibp'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='ssbd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='cmp_legacy'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='overflow-recov'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='succor'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='ibrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='amd-ssbd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='virt-ssbd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='lbrv'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='tsc-scale'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='vmcb-clean'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='flushbyasid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='pause-filter'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='pfthreshold'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='svme-addr-chk'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='disable' name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </mode>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <mode name='custom' supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Broadwell'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Broadwell-IBRS'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Broadwell-noTSX'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Broadwell-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Broadwell-v2'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Broadwell-v3'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Broadwell-v4'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Cascadelake-Server'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Cascadelake-Server-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Cascadelake-Server-v2'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Cascadelake-Server-v3'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Cascadelake-Server-v4'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Cascadelake-Server-v5'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='ClearwaterForest'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni-int16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bhi-ctrl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bhi-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cmpccxadd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ddpd-u'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='intel-psfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ipred-ctrl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='lam'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='prefetchiti'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rrsba-ctrl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sha512'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sm3'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sm4'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='ClearwaterForest-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni-int16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bhi-ctrl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bhi-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cmpccxadd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ddpd-u'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='intel-psfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ipred-ctrl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='lam'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='prefetchiti'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rrsba-ctrl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sha512'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sm3'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sm4'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Cooperlake'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Cooperlake-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Cooperlake-v2'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Denverton'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='mpx'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Denverton-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='mpx'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Denverton-v2'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Denverton-v3'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Dhyana-v2'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='EPYC-Genoa'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amd-psfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='auto-ibrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='stibp-always-on'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='EPYC-Genoa-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amd-psfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='auto-ibrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='stibp-always-on'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='EPYC-Genoa-v2'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amd-psfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='auto-ibrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fs-gs-base-ns'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='perfmon-v2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='stibp-always-on'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='EPYC-Milan'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='EPYC-Milan-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='EPYC-Milan-v2'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amd-psfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='stibp-always-on'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='EPYC-Milan-v3'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amd-psfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='stibp-always-on'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='EPYC-Rome'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='EPYC-Rome-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='EPYC-Rome-v2'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='EPYC-Rome-v3'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='EPYC-Turin'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amd-psfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='auto-ibrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vp2intersect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fs-gs-base-ns'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibpb-brtype'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='perfmon-v2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='prefetchi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sbpb'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='srso-user-kernel-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='stibp-always-on'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='EPYC-Turin-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amd-psfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='auto-ibrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vp2intersect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fs-gs-base-ns'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibpb-brtype'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='perfmon-v2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='prefetchi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sbpb'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='srso-user-kernel-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='stibp-always-on'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='EPYC-v3'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='EPYC-v4'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='EPYC-v5'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='GraniteRapids'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-fp16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='prefetchiti'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='GraniteRapids-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-fp16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='prefetchiti'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='GraniteRapids-v2'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-fp16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx10'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx10-128'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx10-256'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx10-512'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='prefetchiti'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='GraniteRapids-v3'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-fp16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx10'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx10-128'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx10-256'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx10-512'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='prefetchiti'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Haswell'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Haswell-IBRS'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Haswell-noTSX'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Haswell-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Haswell-v2'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Haswell-v3'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Haswell-v4'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server-noTSX'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server-v2'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server-v3'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server-v4'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server-v5'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server-v6'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server-v7'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='IvyBridge'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='IvyBridge-IBRS'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='IvyBridge-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='IvyBridge-v2'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='KnightsMill'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-4fmaps'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-4vnniw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512er'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512pf'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='KnightsMill-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-4fmaps'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-4vnniw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512er'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512pf'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Opteron_G4'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fma4'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xop'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Opteron_G4-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fma4'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xop'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Opteron_G5'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fma4'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='tbm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xop'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Opteron_G5-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fma4'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='tbm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xop'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='SapphireRapids'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='SapphireRapids-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='SapphireRapids-v2'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='SapphireRapids-v3'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='SapphireRapids-v4'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='SierraForest'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cmpccxadd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='SierraForest-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cmpccxadd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='SierraForest-v2'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bhi-ctrl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cmpccxadd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='intel-psfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ipred-ctrl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='lam'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rrsba-ctrl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='SierraForest-v3'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bhi-ctrl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cmpccxadd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='intel-psfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ipred-ctrl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='lam'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rrsba-ctrl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Skylake-Client'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Skylake-Client-IBRS'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Skylake-Client-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Skylake-Client-v2'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Skylake-Client-v3'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Skylake-Client-v4'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Skylake-Server'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Skylake-Server-IBRS'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Skylake-Server-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Skylake-Server-v2'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Skylake-Server-v3'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Skylake-Server-v4'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Skylake-Server-v5'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Snowridge'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='core-capability'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='mpx'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='split-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Snowridge-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='core-capability'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='mpx'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='split-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Snowridge-v2'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='core-capability'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='split-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Snowridge-v3'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='core-capability'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='split-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Snowridge-v4'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='athlon'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='3dnow'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='3dnowext'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='athlon-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='3dnow'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='3dnowext'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='core2duo'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='core2duo-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='coreduo'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='coreduo-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='n270'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='n270-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='phenom'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='3dnow'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='3dnowext'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='phenom-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='3dnow'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='3dnowext'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </mode>
Jan 26 08:36:52 compute-1 nova_compute[183083]:   </cpu>
Jan 26 08:36:52 compute-1 nova_compute[183083]:   <memoryBacking supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <enum name='sourceType'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <value>file</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <value>anonymous</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <value>memfd</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:   </memoryBacking>
Jan 26 08:36:52 compute-1 nova_compute[183083]:   <devices>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <disk supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='diskDevice'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>disk</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>cdrom</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>floppy</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>lun</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='bus'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>fdc</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>scsi</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>virtio</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>usb</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>sata</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='model'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>virtio</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>virtio-transitional</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>virtio-non-transitional</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </disk>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <graphics supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='type'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>vnc</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>egl-headless</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>dbus</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </graphics>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <video supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='modelType'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>vga</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>cirrus</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>virtio</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>none</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>bochs</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>ramfb</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </video>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <hostdev supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='mode'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>subsystem</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='startupPolicy'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>default</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>mandatory</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>requisite</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>optional</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='subsysType'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>usb</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>pci</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>scsi</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='capsType'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='pciBackend'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </hostdev>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <rng supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='model'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>virtio</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>virtio-transitional</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>virtio-non-transitional</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='backendModel'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>random</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>egd</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>builtin</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </rng>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <filesystem supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='driverType'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>path</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>handle</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>virtiofs</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </filesystem>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <tpm supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='model'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>tpm-tis</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>tpm-crb</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='backendModel'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>emulator</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>external</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='backendVersion'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>2.0</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </tpm>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <redirdev supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='bus'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>usb</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </redirdev>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <channel supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='type'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>pty</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>unix</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </channel>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <crypto supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='model'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='type'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>qemu</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='backendModel'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>builtin</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </crypto>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <interface supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='backendType'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>default</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>passt</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </interface>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <panic supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='model'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>isa</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>hyperv</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </panic>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <console supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='type'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>null</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>vc</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>pty</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>dev</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>file</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>pipe</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>stdio</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>udp</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>tcp</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>unix</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>qemu-vdagent</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>dbus</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </console>
Jan 26 08:36:52 compute-1 nova_compute[183083]:   </devices>
Jan 26 08:36:52 compute-1 nova_compute[183083]:   <features>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <gic supported='no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <vmcoreinfo supported='yes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <genid supported='yes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <backingStoreInput supported='yes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <backup supported='yes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <async-teardown supported='yes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <s390-pv supported='no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <ps2 supported='yes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <tdx supported='no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <sev supported='no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <sgx supported='no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <hyperv supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='features'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>relaxed</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>vapic</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>spinlocks</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>vpindex</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>runtime</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>synic</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>stimer</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>reset</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>vendor_id</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>frequencies</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>reenlightenment</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>tlbflush</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>ipi</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>avic</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>emsr_bitmap</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>xmm_input</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <defaults>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <spinlocks>4095</spinlocks>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <stimer_direct>on</stimer_direct>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <tlbflush_direct>on</tlbflush_direct>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <tlbflush_extended>on</tlbflush_extended>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </defaults>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </hyperv>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <launchSecurity supported='no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:   </features>
Jan 26 08:36:52 compute-1 nova_compute[183083]: </domainCapabilities>
Jan 26 08:36:52 compute-1 nova_compute[183083]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 26 08:36:52 compute-1 nova_compute[183083]: 2026-01-26 08:36:52.017 183087 DEBUG nova.virt.libvirt.host [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 26 08:36:52 compute-1 nova_compute[183083]: <domainCapabilities>
Jan 26 08:36:52 compute-1 nova_compute[183083]:   <path>/usr/libexec/qemu-kvm</path>
Jan 26 08:36:52 compute-1 nova_compute[183083]:   <domain>kvm</domain>
Jan 26 08:36:52 compute-1 nova_compute[183083]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 26 08:36:52 compute-1 nova_compute[183083]:   <arch>x86_64</arch>
Jan 26 08:36:52 compute-1 nova_compute[183083]:   <vcpu max='240'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:   <iothreads supported='yes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:   <os supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <enum name='firmware'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <loader supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='type'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>rom</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>pflash</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='readonly'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>yes</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>no</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='secure'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>no</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </loader>
Jan 26 08:36:52 compute-1 nova_compute[183083]:   </os>
Jan 26 08:36:52 compute-1 nova_compute[183083]:   <cpu>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <mode name='host-passthrough' supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='hostPassthroughMigratable'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>on</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>off</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </mode>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <mode name='maximum' supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='maximumMigratable'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>on</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>off</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </mode>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <mode name='host-model' supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <vendor>AMD</vendor>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='x2apic'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='tsc-deadline'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='hypervisor'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='tsc_adjust'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='spec-ctrl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='stibp'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='ssbd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='cmp_legacy'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='overflow-recov'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='succor'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='ibrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='amd-ssbd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='virt-ssbd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='lbrv'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='tsc-scale'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='vmcb-clean'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='flushbyasid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='pause-filter'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='pfthreshold'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='svme-addr-chk'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <feature policy='disable' name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </mode>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <mode name='custom' supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Broadwell'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Broadwell-IBRS'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Broadwell-noTSX'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Broadwell-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Broadwell-v2'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Broadwell-v3'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Broadwell-v4'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Cascadelake-Server'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Cascadelake-Server-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Cascadelake-Server-v2'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Cascadelake-Server-v3'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Cascadelake-Server-v4'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Cascadelake-Server-v5'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='ClearwaterForest'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni-int16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bhi-ctrl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bhi-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cmpccxadd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ddpd-u'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='intel-psfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ipred-ctrl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='lam'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='prefetchiti'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rrsba-ctrl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sha512'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sm3'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sm4'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='ClearwaterForest-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni-int16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bhi-ctrl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bhi-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cmpccxadd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ddpd-u'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='intel-psfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ipred-ctrl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='lam'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='prefetchiti'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rrsba-ctrl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sha512'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sm3'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sm4'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Cooperlake'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Cooperlake-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Cooperlake-v2'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Denverton'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='mpx'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Denverton-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='mpx'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Denverton-v2'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Denverton-v3'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Dhyana-v2'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='EPYC-Genoa'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amd-psfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='auto-ibrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='stibp-always-on'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='EPYC-Genoa-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amd-psfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='auto-ibrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='stibp-always-on'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='EPYC-Genoa-v2'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amd-psfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='auto-ibrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fs-gs-base-ns'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='perfmon-v2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='stibp-always-on'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='EPYC-Milan'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='EPYC-Milan-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='EPYC-Milan-v2'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amd-psfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='stibp-always-on'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='EPYC-Milan-v3'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amd-psfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='stibp-always-on'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='EPYC-Rome'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='EPYC-Rome-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='EPYC-Rome-v2'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='EPYC-Rome-v3'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='EPYC-Turin'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amd-psfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='auto-ibrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vp2intersect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fs-gs-base-ns'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibpb-brtype'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='perfmon-v2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='prefetchi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sbpb'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='srso-user-kernel-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='stibp-always-on'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='EPYC-Turin-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amd-psfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='auto-ibrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vp2intersect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fs-gs-base-ns'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibpb-brtype'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='no-nested-data-bp'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='null-sel-clr-base'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='perfmon-v2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='prefetchi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sbpb'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='srso-user-kernel-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='stibp-always-on'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='EPYC-v3'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='EPYC-v4'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='EPYC-v5'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='GraniteRapids'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-fp16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='prefetchiti'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='GraniteRapids-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-fp16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='prefetchiti'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='GraniteRapids-v2'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-fp16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx10'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx10-128'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx10-256'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx10-512'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='prefetchiti'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='GraniteRapids-v3'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-fp16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx10'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx10-128'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx10-256'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx10-512'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='prefetchiti'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Haswell'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Haswell-IBRS'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Haswell-noTSX'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Haswell-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Haswell-v2'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Haswell-v3'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Haswell-v4'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server-noTSX'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server-v2'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server-v3'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server-v4'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server-v5'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server-v6'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Icelake-Server-v7'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='IvyBridge'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='IvyBridge-IBRS'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='IvyBridge-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='IvyBridge-v2'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='KnightsMill'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-4fmaps'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-4vnniw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512er'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512pf'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='KnightsMill-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-4fmaps'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-4vnniw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512er'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512pf'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Opteron_G4'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fma4'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xop'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Opteron_G4-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fma4'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xop'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Opteron_G5'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fma4'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='tbm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xop'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Opteron_G5-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fma4'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='tbm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xop'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='SapphireRapids'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='SapphireRapids-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='SapphireRapids-v2'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='SapphireRapids-v3'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='SapphireRapids-v4'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-int8'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='amx-tile'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-bf16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-fp16'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512-vpopcntdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bitalg'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vbmi2'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrc'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fzrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='la57'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='taa-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='tsx-ldtrk'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='SierraForest'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cmpccxadd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='SierraForest-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cmpccxadd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='SierraForest-v2'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bhi-ctrl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cmpccxadd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='intel-psfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ipred-ctrl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='lam'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rrsba-ctrl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='SierraForest-v3'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-ifma'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-ne-convert'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx-vnni-int8'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bhi-ctrl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='bus-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cmpccxadd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fbsdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='fsrs'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ibrs-all'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='intel-psfd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ipred-ctrl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='lam'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='mcdt-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pbrsb-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='psdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rrsba-ctrl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='sbdr-ssdp-no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='serialize'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vaes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='vpclmulqdq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Skylake-Client'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Skylake-Client-IBRS'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Skylake-Client-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Skylake-Client-v2'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Skylake-Client-v3'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Skylake-Client-v4'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Skylake-Server'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Skylake-Server-IBRS'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Skylake-Server-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Skylake-Server-v2'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='hle'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='rtm'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Skylake-Server-v3'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Skylake-Server-v4'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Skylake-Server-v5'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512bw'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512cd'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512dq'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512f'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='avx512vl'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='invpcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pcid'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='pku'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Snowridge'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='core-capability'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='mpx'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='split-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Snowridge-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='core-capability'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='mpx'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='split-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Snowridge-v2'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='core-capability'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='split-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Snowridge-v3'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='core-capability'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='split-lock-detect'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='Snowridge-v4'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='cldemote'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='erms'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='gfni'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdir64b'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='movdiri'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='xsaves'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='athlon'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='3dnow'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='3dnowext'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='athlon-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='3dnow'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='3dnowext'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='core2duo'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='core2duo-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='coreduo'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='coreduo-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='n270'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='n270-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='ss'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='phenom'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='3dnow'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='3dnowext'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <blockers model='phenom-v1'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='3dnow'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <feature name='3dnowext'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </blockers>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </mode>
Jan 26 08:36:52 compute-1 nova_compute[183083]:   </cpu>
Jan 26 08:36:52 compute-1 nova_compute[183083]:   <memoryBacking supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <enum name='sourceType'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <value>file</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <value>anonymous</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <value>memfd</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:   </memoryBacking>
Jan 26 08:36:52 compute-1 nova_compute[183083]:   <devices>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <disk supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='diskDevice'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>disk</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>cdrom</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>floppy</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>lun</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='bus'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>ide</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>fdc</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>scsi</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>virtio</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>usb</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>sata</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='model'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>virtio</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>virtio-transitional</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>virtio-non-transitional</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </disk>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <graphics supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='type'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>vnc</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>egl-headless</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>dbus</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </graphics>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <video supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='modelType'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>vga</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>cirrus</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>virtio</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>none</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>bochs</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>ramfb</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </video>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <hostdev supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='mode'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>subsystem</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='startupPolicy'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>default</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>mandatory</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>requisite</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>optional</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='subsysType'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>usb</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>pci</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>scsi</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='capsType'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='pciBackend'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </hostdev>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <rng supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='model'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>virtio</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>virtio-transitional</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>virtio-non-transitional</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='backendModel'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>random</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>egd</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>builtin</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </rng>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <filesystem supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='driverType'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>path</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>handle</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>virtiofs</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </filesystem>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <tpm supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='model'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>tpm-tis</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>tpm-crb</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='backendModel'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>emulator</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>external</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='backendVersion'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>2.0</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </tpm>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <redirdev supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='bus'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>usb</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </redirdev>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <channel supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='type'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>pty</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>unix</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </channel>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <crypto supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='model'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='type'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>qemu</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='backendModel'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>builtin</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </crypto>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <interface supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='backendType'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>default</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>passt</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </interface>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <panic supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='model'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>isa</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>hyperv</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </panic>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <console supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='type'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>null</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>vc</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>pty</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>dev</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>file</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>pipe</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>stdio</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>udp</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>tcp</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>unix</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>qemu-vdagent</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>dbus</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </console>
Jan 26 08:36:52 compute-1 nova_compute[183083]:   </devices>
Jan 26 08:36:52 compute-1 nova_compute[183083]:   <features>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <gic supported='no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <vmcoreinfo supported='yes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <genid supported='yes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <backingStoreInput supported='yes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <backup supported='yes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <async-teardown supported='yes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <s390-pv supported='no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <ps2 supported='yes'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <tdx supported='no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <sev supported='no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <sgx supported='no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <hyperv supported='yes'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <enum name='features'>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>relaxed</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>vapic</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>spinlocks</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>vpindex</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>runtime</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>synic</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>stimer</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>reset</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>vendor_id</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>frequencies</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>reenlightenment</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>tlbflush</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>ipi</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>avic</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>emsr_bitmap</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <value>xmm_input</value>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </enum>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       <defaults>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <spinlocks>4095</spinlocks>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <stimer_direct>on</stimer_direct>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <tlbflush_direct>on</tlbflush_direct>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <tlbflush_extended>on</tlbflush_extended>
Jan 26 08:36:52 compute-1 nova_compute[183083]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 26 08:36:52 compute-1 nova_compute[183083]:       </defaults>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     </hyperv>
Jan 26 08:36:52 compute-1 nova_compute[183083]:     <launchSecurity supported='no'/>
Jan 26 08:36:52 compute-1 nova_compute[183083]:   </features>
Jan 26 08:36:52 compute-1 nova_compute[183083]: </domainCapabilities>
Jan 26 08:36:52 compute-1 nova_compute[183083]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 26 08:36:52 compute-1 nova_compute[183083]: 2026-01-26 08:36:52.080 183087 DEBUG nova.virt.libvirt.host [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 26 08:36:52 compute-1 nova_compute[183083]: 2026-01-26 08:36:52.080 183087 INFO nova.virt.libvirt.host [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Secure Boot support detected
Jan 26 08:36:52 compute-1 nova_compute[183083]: 2026-01-26 08:36:52.082 183087 INFO nova.virt.libvirt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 26 08:36:52 compute-1 nova_compute[183083]: 2026-01-26 08:36:52.082 183087 INFO nova.virt.libvirt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 26 08:36:52 compute-1 nova_compute[183083]: 2026-01-26 08:36:52.094 183087 DEBUG nova.virt.libvirt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Jan 26 08:36:52 compute-1 nova_compute[183083]: 2026-01-26 08:36:52.181 183087 INFO nova.virt.node [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Determined node identity 5203935e-446c-4e03-93fa-4c60d651e045 from /var/lib/nova/compute_id
Jan 26 08:36:52 compute-1 nova_compute[183083]: 2026-01-26 08:36:52.222 183087 WARNING nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Compute nodes ['5203935e-446c-4e03-93fa-4c60d651e045'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Jan 26 08:36:52 compute-1 nova_compute[183083]: 2026-01-26 08:36:52.245 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Jan 26 08:36:52 compute-1 nova_compute[183083]: 2026-01-26 08:36:52.291 183087 WARNING nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Jan 26 08:36:52 compute-1 nova_compute[183083]: 2026-01-26 08:36:52.291 183087 DEBUG oslo_concurrency.lockutils [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:36:52 compute-1 nova_compute[183083]: 2026-01-26 08:36:52.292 183087 DEBUG oslo_concurrency.lockutils [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:36:52 compute-1 nova_compute[183083]: 2026-01-26 08:36:52.292 183087 DEBUG oslo_concurrency.lockutils [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:36:52 compute-1 nova_compute[183083]: 2026-01-26 08:36:52.292 183087 DEBUG nova.compute.resource_tracker [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 08:36:52 compute-1 systemd[1]: Starting libvirt nodedev daemon...
Jan 26 08:36:52 compute-1 systemd[1]: Started libvirt nodedev daemon.
Jan 26 08:36:52 compute-1 rsyslogd[1006]: imjournal from <np0005595389:nova_compute>: begin to drop messages due to rate-limiting
Jan 26 08:36:52 compute-1 nova_compute[183083]: 2026-01-26 08:36:52.583 183087 WARNING nova.virt.libvirt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:36:52 compute-1 nova_compute[183083]: 2026-01-26 08:36:52.584 183087 DEBUG nova.compute.resource_tracker [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=14240MB free_disk=113.30183792114258GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 08:36:52 compute-1 nova_compute[183083]: 2026-01-26 08:36:52.584 183087 DEBUG oslo_concurrency.lockutils [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:36:52 compute-1 nova_compute[183083]: 2026-01-26 08:36:52.584 183087 DEBUG oslo_concurrency.lockutils [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:36:52 compute-1 nova_compute[183083]: 2026-01-26 08:36:52.624 183087 WARNING nova.compute.resource_tracker [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] No compute node record for compute-1.ctlplane.example.com:5203935e-446c-4e03-93fa-4c60d651e045: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 5203935e-446c-4e03-93fa-4c60d651e045 could not be found.
Jan 26 08:36:52 compute-1 nova_compute[183083]: 2026-01-26 08:36:52.646 183087 INFO nova.compute.resource_tracker [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Compute node record created for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com with uuid: 5203935e-446c-4e03-93fa-4c60d651e045
Jan 26 08:36:52 compute-1 nova_compute[183083]: 2026-01-26 08:36:52.864 183087 DEBUG nova.compute.resource_tracker [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 08:36:52 compute-1 nova_compute[183083]: 2026-01-26 08:36:52.864 183087 DEBUG nova.compute.resource_tracker [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 08:36:53 compute-1 nova_compute[183083]: 2026-01-26 08:36:53.926 183087 INFO nova.scheduler.client.report [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [req-3c85ec0b-6491-41e9-8309-c94935ae77f1] Created resource provider record via placement API for resource provider with UUID 5203935e-446c-4e03-93fa-4c60d651e045 and name compute-1.ctlplane.example.com.
Jan 26 08:36:53 compute-1 nova_compute[183083]: 2026-01-26 08:36:53.997 183087 DEBUG nova.virt.libvirt.host [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Jan 26 08:36:53 compute-1 nova_compute[183083]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Jan 26 08:36:53 compute-1 nova_compute[183083]: 2026-01-26 08:36:53.997 183087 INFO nova.virt.libvirt.host [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] kernel doesn't support AMD SEV
Jan 26 08:36:53 compute-1 nova_compute[183083]: 2026-01-26 08:36:53.999 183087 DEBUG nova.compute.provider_tree [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Updating inventory in ProviderTree for provider 5203935e-446c-4e03-93fa-4c60d651e045 with inventory: {'MEMORY_MB': {'total': 15731, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 119, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 26 08:36:54 compute-1 nova_compute[183083]: 2026-01-26 08:36:53.999 183087 DEBUG nova.virt.libvirt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 26 08:36:54 compute-1 nova_compute[183083]: 2026-01-26 08:36:54.075 183087 DEBUG nova.scheduler.client.report [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Updated inventory for provider 5203935e-446c-4e03-93fa-4c60d651e045 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 15731, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 119, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Jan 26 08:36:54 compute-1 nova_compute[183083]: 2026-01-26 08:36:54.076 183087 DEBUG nova.compute.provider_tree [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Updating resource provider 5203935e-446c-4e03-93fa-4c60d651e045 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 26 08:36:54 compute-1 nova_compute[183083]: 2026-01-26 08:36:54.076 183087 DEBUG nova.compute.provider_tree [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Updating inventory in ProviderTree for provider 5203935e-446c-4e03-93fa-4c60d651e045 with inventory: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 0, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 26 08:36:54 compute-1 nova_compute[183083]: 2026-01-26 08:36:54.231 183087 DEBUG nova.compute.provider_tree [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Updating resource provider 5203935e-446c-4e03-93fa-4c60d651e045 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 26 08:36:54 compute-1 nova_compute[183083]: 2026-01-26 08:36:54.301 183087 DEBUG nova.compute.resource_tracker [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 08:36:54 compute-1 nova_compute[183083]: 2026-01-26 08:36:54.302 183087 DEBUG oslo_concurrency.lockutils [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.717s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:36:54 compute-1 nova_compute[183083]: 2026-01-26 08:36:54.302 183087 DEBUG nova.service [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Jan 26 08:36:54 compute-1 nova_compute[183083]: 2026-01-26 08:36:54.523 183087 DEBUG nova.service [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Jan 26 08:36:54 compute-1 nova_compute[183083]: 2026-01-26 08:36:54.523 183087 DEBUG nova.servicegroup.drivers.db [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] DB_Driver: join new ServiceGroup member compute-1.ctlplane.example.com to the compute group, service = <Service: host=compute-1.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Jan 26 08:36:56 compute-1 sshd-session[183407]: Accepted publickey for zuul from 192.168.122.30 port 55050 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 08:36:56 compute-1 systemd-logind[788]: New session 27 of user zuul.
Jan 26 08:36:56 compute-1 systemd[1]: Started Session 27 of User zuul.
Jan 26 08:36:56 compute-1 sshd-session[183407]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 08:36:57 compute-1 python3.9[183560]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 08:36:58 compute-1 nova_compute[183083]: 2026-01-26 08:36:58.525 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:36:58 compute-1 nova_compute[183083]: 2026-01-26 08:36:58.557 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:36:58 compute-1 sudo[183714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-doifawkoxjmezzitkzscildpbbgumyhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416618.1934955-48-181611248930307/AnsiballZ_systemd_service.py'
Jan 26 08:36:58 compute-1 sudo[183714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:36:59 compute-1 python3.9[183716]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 08:36:59 compute-1 systemd[1]: Reloading.
Jan 26 08:36:59 compute-1 systemd-rc-local-generator[183739]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:36:59 compute-1 systemd-sysv-generator[183745]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:36:59 compute-1 sudo[183714]: pam_unix(sudo:session): session closed for user root
Jan 26 08:37:00 compute-1 python3.9[183901]: ansible-ansible.builtin.service_facts Invoked
Jan 26 08:37:00 compute-1 network[183918]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 08:37:00 compute-1 network[183919]: 'network-scripts' will be removed from distribution in near future.
Jan 26 08:37:00 compute-1 network[183920]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 08:37:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:37:05.287 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:37:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:37:05.289 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:37:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:37:05.289 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:37:07 compute-1 sudo[184190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvxkrsllwszwyrsfcytultbcfnqrhosw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416626.922579-86-9947082203063/AnsiballZ_systemd_service.py'
Jan 26 08:37:07 compute-1 sudo[184190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:37:07 compute-1 python3.9[184192]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:37:07 compute-1 sudo[184190]: pam_unix(sudo:session): session closed for user root
Jan 26 08:37:08 compute-1 sudo[184357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlenviztcgbxygxuctivwuaymjialtvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416627.9822292-106-252617200469127/AnsiballZ_file.py'
Jan 26 08:37:08 compute-1 sudo[184357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:37:08 compute-1 podman[184317]: 2026-01-26 08:37:08.626976757 +0000 UTC m=+0.114526177 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 26 08:37:08 compute-1 python3.9[184363]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:37:08 compute-1 sudo[184357]: pam_unix(sudo:session): session closed for user root
Jan 26 08:37:08 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 08:37:09 compute-1 sudo[184520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwjdlqoskmscvmuvnrvipqqgsgzjatft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416629.0340378-122-6128909821355/AnsiballZ_file.py'
Jan 26 08:37:09 compute-1 sudo[184520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:37:09 compute-1 python3.9[184522]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:37:09 compute-1 sudo[184520]: pam_unix(sudo:session): session closed for user root
Jan 26 08:37:10 compute-1 sudo[184672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjnwupmsewivqrhcakpstregrkrgffmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416629.9928887-140-127871088356235/AnsiballZ_command.py'
Jan 26 08:37:10 compute-1 sudo[184672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:37:10 compute-1 python3.9[184674]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:37:10 compute-1 sudo[184672]: pam_unix(sudo:session): session closed for user root
Jan 26 08:37:11 compute-1 podman[184800]: 2026-01-26 08:37:11.532884969 +0000 UTC m=+0.079825174 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 08:37:11 compute-1 python3.9[184841]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 08:37:12 compute-1 sudo[184995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtsngortbtldclawzefxddzaprsbdotd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416632.0049682-176-276555830151493/AnsiballZ_systemd_service.py'
Jan 26 08:37:12 compute-1 sudo[184995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:37:12 compute-1 python3.9[184997]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 08:37:12 compute-1 systemd[1]: Reloading.
Jan 26 08:37:12 compute-1 systemd-rc-local-generator[185025]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:37:12 compute-1 systemd-sysv-generator[185028]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:37:13 compute-1 sudo[184995]: pam_unix(sudo:session): session closed for user root
Jan 26 08:37:13 compute-1 sudo[185183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qupgkcgextayrgvmsjfrlddesisnpuzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416633.2737632-192-232030255482049/AnsiballZ_command.py'
Jan 26 08:37:13 compute-1 sudo[185183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:37:13 compute-1 python3.9[185185]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:37:13 compute-1 sudo[185183]: pam_unix(sudo:session): session closed for user root
Jan 26 08:37:14 compute-1 sudo[185336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cimdfqibvvebdvcrsgrwrtgalzpsebhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416634.2414742-210-43625537787699/AnsiballZ_file.py'
Jan 26 08:37:14 compute-1 sudo[185336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:37:14 compute-1 python3.9[185338]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:37:14 compute-1 sudo[185336]: pam_unix(sudo:session): session closed for user root
Jan 26 08:37:15 compute-1 python3.9[185488]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:37:16 compute-1 sudo[185640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvlxmfnwdfjfvwlqcfcjrhcoacnthqjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416636.0513203-242-112154163268003/AnsiballZ_group.py'
Jan 26 08:37:16 compute-1 sudo[185640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:37:16 compute-1 python3.9[185642]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Jan 26 08:37:16 compute-1 sudo[185640]: pam_unix(sudo:session): session closed for user root
Jan 26 08:37:17 compute-1 sudo[185792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcqsotvqxdjjnyungualmtjshrxjevcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416637.2364256-264-93910786962408/AnsiballZ_getent.py'
Jan 26 08:37:17 compute-1 sudo[185792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:37:17 compute-1 python3.9[185794]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Jan 26 08:37:17 compute-1 sudo[185792]: pam_unix(sudo:session): session closed for user root
Jan 26 08:37:18 compute-1 sudo[185945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsxelxgllgvghxgizkdfczfnkagiftrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416638.1521819-280-239238436139547/AnsiballZ_group.py'
Jan 26 08:37:18 compute-1 sudo[185945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:37:18 compute-1 python3.9[185947]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 08:37:18 compute-1 groupadd[185948]: group added to /etc/group: name=ceilometer, GID=42405
Jan 26 08:37:18 compute-1 groupadd[185948]: group added to /etc/gshadow: name=ceilometer
Jan 26 08:37:18 compute-1 groupadd[185948]: new group: name=ceilometer, GID=42405
Jan 26 08:37:18 compute-1 sudo[185945]: pam_unix(sudo:session): session closed for user root
Jan 26 08:37:19 compute-1 sudo[186103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrorbrjmsdvxdsblrqbimpbhnthlavrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416639.0476627-296-277599321063720/AnsiballZ_user.py'
Jan 26 08:37:19 compute-1 sudo[186103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:37:19 compute-1 python3.9[186105]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 26 08:37:19 compute-1 useradd[186107]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Jan 26 08:37:19 compute-1 useradd[186107]: add 'ceilometer' to group 'libvirt'
Jan 26 08:37:19 compute-1 useradd[186107]: add 'ceilometer' to shadow group 'libvirt'
Jan 26 08:37:20 compute-1 sudo[186103]: pam_unix(sudo:session): session closed for user root
Jan 26 08:37:21 compute-1 python3.9[186263]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:37:22 compute-1 python3.9[186384]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769416640.8830996-348-256704311623928/.source.conf _original_basename=ceilometer.conf follow=False checksum=806b21daa538a66a80669be8bf74c414d178dfbc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:37:23 compute-1 python3.9[186534]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:37:23 compute-1 python3.9[186655]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769416642.6122332-348-242422513372718/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:37:24 compute-1 python3.9[186805]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:37:25 compute-1 python3.9[186926]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769416643.9512029-348-118293350651416/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:37:26 compute-1 python3.9[187076]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:37:27 compute-1 python3.9[187230]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:37:28 compute-1 python3.9[187382]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:37:28 compute-1 sshd-session[187128]: Connection closed by authenticating user root 159.223.236.81 port 48668 [preauth]
Jan 26 08:37:28 compute-1 python3.9[187503]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769416647.4437234-467-107296566500141/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=a9bdb897f3979025d9a372b4beff53a09cbe0d55 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:37:29 compute-1 python3.9[187653]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:37:30 compute-1 python3.9[187774]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/openstack_network_exporter.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769416648.9211159-467-193735870067425/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=87dede51a10e22722618c1900db75cb764463d91 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:37:31 compute-1 python3.9[187924]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:37:31 compute-1 python3.9[188045]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/firewall.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769416650.5730581-524-204325656736598/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:37:32 compute-1 python3.9[188195]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:37:33 compute-1 python3.9[188316]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769416652.1361415-556-183722210261492/.source.yaml _original_basename=node_exporter.yaml follow=False checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:37:33 compute-1 python3.9[188466]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:37:34 compute-1 python3.9[188587]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769416653.4356608-586-55301102579865/.source.yaml _original_basename=podman_exporter.yaml follow=False checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:37:35 compute-1 python3.9[188737]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:37:35 compute-1 python3.9[188858]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769416654.8213983-616-130646509699007/.source.yaml _original_basename=ceilometer_prom_exporter.yaml follow=False checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:37:36 compute-1 sudo[189008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccwcdzrxpoupcxyjxnbojnwtupxgkyrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416656.2236466-646-14098381491707/AnsiballZ_file.py'
Jan 26 08:37:36 compute-1 sudo[189008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:37:36 compute-1 python3.9[189010]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:37:36 compute-1 sudo[189008]: pam_unix(sudo:session): session closed for user root
Jan 26 08:37:37 compute-1 sudo[189160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yracqttnqsdcladynarvhfpxvbmgyiki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416657.0546012-662-65132282040182/AnsiballZ_file.py'
Jan 26 08:37:37 compute-1 sudo[189160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:37:37 compute-1 python3.9[189162]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:37:37 compute-1 sudo[189160]: pam_unix(sudo:session): session closed for user root
Jan 26 08:37:38 compute-1 python3.9[189312]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:37:38 compute-1 podman[189359]: 2026-01-26 08:37:38.928811749 +0000 UTC m=+0.177476095 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 26 08:37:39 compute-1 python3.9[189490]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:37:40 compute-1 python3.9[189642]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:37:40 compute-1 sudo[189794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egyfgdbepscvkowbxcmfziekzemlscog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416660.436109-726-84385181971258/AnsiballZ_file.py'
Jan 26 08:37:40 compute-1 sudo[189794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:37:41 compute-1 python3.9[189796]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:37:41 compute-1 sudo[189794]: pam_unix(sudo:session): session closed for user root
Jan 26 08:37:41 compute-1 sudo[189959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsnualyfvxrmbqhidmoshwdzibfogbbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416661.3161938-742-6735527991982/AnsiballZ_systemd_service.py'
Jan 26 08:37:41 compute-1 sudo[189959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:37:41 compute-1 podman[189920]: 2026-01-26 08:37:41.762407466 +0000 UTC m=+0.109909306 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 26 08:37:42 compute-1 python3.9[189967]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:37:42 compute-1 systemd[1]: Reloading.
Jan 26 08:37:42 compute-1 systemd-rc-local-generator[189997]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:37:42 compute-1 systemd-sysv-generator[190000]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:37:42 compute-1 systemd[1]: Listening on Podman API Socket.
Jan 26 08:37:42 compute-1 sudo[189959]: pam_unix(sudo:session): session closed for user root
Jan 26 08:37:43 compute-1 sudo[190156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onnchchukhaafogckageajyddbslfocv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416662.8564215-760-123685206612886/AnsiballZ_stat.py'
Jan 26 08:37:43 compute-1 sudo[190156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:37:43 compute-1 python3.9[190158]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:37:43 compute-1 sudo[190156]: pam_unix(sudo:session): session closed for user root
Jan 26 08:37:43 compute-1 sudo[190279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fntyrncvyqyqusyzmutnbfklwpxktiwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416662.8564215-760-123685206612886/AnsiballZ_copy.py'
Jan 26 08:37:43 compute-1 sudo[190279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:37:44 compute-1 python3.9[190281]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769416662.8564215-760-123685206612886/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:37:44 compute-1 sudo[190279]: pam_unix(sudo:session): session closed for user root
Jan 26 08:37:44 compute-1 sudo[190355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhncezttrsnntngalmqwacapkbxangen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416662.8564215-760-123685206612886/AnsiballZ_stat.py'
Jan 26 08:37:44 compute-1 sudo[190355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:37:44 compute-1 python3.9[190357]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:37:44 compute-1 sudo[190355]: pam_unix(sudo:session): session closed for user root
Jan 26 08:37:45 compute-1 sudo[190478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oicurxcypgvcjwbsqqfonndzgeeygxtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416662.8564215-760-123685206612886/AnsiballZ_copy.py'
Jan 26 08:37:45 compute-1 sudo[190478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:37:45 compute-1 python3.9[190480]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769416662.8564215-760-123685206612886/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:37:45 compute-1 sudo[190478]: pam_unix(sudo:session): session closed for user root
Jan 26 08:37:48 compute-1 sudo[190630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udsytkfqqtknjphlfxenemooxmexuvqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416667.85515-824-102887901551187/AnsiballZ_file.py'
Jan 26 08:37:48 compute-1 sudo[190630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:37:48 compute-1 python3.9[190632]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:37:48 compute-1 sudo[190630]: pam_unix(sudo:session): session closed for user root
Jan 26 08:37:49 compute-1 sudo[190782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgwzdcovijdsbxiabogugymjuogozupc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416668.6774933-840-239315048529975/AnsiballZ_file.py'
Jan 26 08:37:49 compute-1 sudo[190782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:37:49 compute-1 python3.9[190784]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:37:49 compute-1 sudo[190782]: pam_unix(sudo:session): session closed for user root
Jan 26 08:37:50 compute-1 sudo[190934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqzxnmqwuizqjggeiphpivkwemyhyyqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416669.558475-856-202999309179199/AnsiballZ_stat.py'
Jan 26 08:37:50 compute-1 sudo[190934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:37:50 compute-1 python3.9[190936]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:37:50 compute-1 sudo[190934]: pam_unix(sudo:session): session closed for user root
Jan 26 08:37:50 compute-1 sudo[191057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puzrhfjbinnfjxcbzxztwbajfeztmtrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416669.558475-856-202999309179199/AnsiballZ_copy.py'
Jan 26 08:37:50 compute-1 sudo[191057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:37:50 compute-1 nova_compute[183083]: 2026-01-26 08:37:50.953 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:37:50 compute-1 nova_compute[183083]: 2026-01-26 08:37:50.955 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:37:50 compute-1 nova_compute[183083]: 2026-01-26 08:37:50.956 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 08:37:50 compute-1 nova_compute[183083]: 2026-01-26 08:37:50.956 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 08:37:50 compute-1 python3.9[191059]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769416669.558475-856-202999309179199/.source.json _original_basename=.ldfx4307 follow=False checksum=ce2b0c83293a970bafffa087afa083dd7c93a79c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:37:50 compute-1 nova_compute[183083]: 2026-01-26 08:37:50.970 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 08:37:50 compute-1 nova_compute[183083]: 2026-01-26 08:37:50.970 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:37:50 compute-1 nova_compute[183083]: 2026-01-26 08:37:50.971 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:37:50 compute-1 nova_compute[183083]: 2026-01-26 08:37:50.972 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:37:50 compute-1 nova_compute[183083]: 2026-01-26 08:37:50.972 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:37:50 compute-1 nova_compute[183083]: 2026-01-26 08:37:50.973 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:37:50 compute-1 nova_compute[183083]: 2026-01-26 08:37:50.973 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:37:50 compute-1 nova_compute[183083]: 2026-01-26 08:37:50.973 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 08:37:50 compute-1 nova_compute[183083]: 2026-01-26 08:37:50.974 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:37:50 compute-1 sudo[191057]: pam_unix(sudo:session): session closed for user root
Jan 26 08:37:51 compute-1 nova_compute[183083]: 2026-01-26 08:37:51.002 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:37:51 compute-1 nova_compute[183083]: 2026-01-26 08:37:51.002 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:37:51 compute-1 nova_compute[183083]: 2026-01-26 08:37:51.003 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:37:51 compute-1 nova_compute[183083]: 2026-01-26 08:37:51.003 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 08:37:51 compute-1 nova_compute[183083]: 2026-01-26 08:37:51.202 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:37:51 compute-1 nova_compute[183083]: 2026-01-26 08:37:51.203 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=14221MB free_disk=113.30184173583984GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 08:37:51 compute-1 nova_compute[183083]: 2026-01-26 08:37:51.203 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:37:51 compute-1 nova_compute[183083]: 2026-01-26 08:37:51.203 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:37:51 compute-1 nova_compute[183083]: 2026-01-26 08:37:51.267 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 08:37:51 compute-1 nova_compute[183083]: 2026-01-26 08:37:51.267 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 08:37:51 compute-1 nova_compute[183083]: 2026-01-26 08:37:51.291 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:37:51 compute-1 nova_compute[183083]: 2026-01-26 08:37:51.308 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 0, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:37:51 compute-1 nova_compute[183083]: 2026-01-26 08:37:51.309 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 08:37:51 compute-1 nova_compute[183083]: 2026-01-26 08:37:51.310 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:37:51 compute-1 python3.9[191209]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:37:54 compute-1 sudo[191630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwwqluypeqamrmsjzcxnlgdmbkbghneh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416673.6056614-936-194962835005942/AnsiballZ_container_config_data.py'
Jan 26 08:37:54 compute-1 sudo[191630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:37:54 compute-1 python3.9[191632]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_pattern=*.json debug=False
Jan 26 08:37:54 compute-1 sudo[191630]: pam_unix(sudo:session): session closed for user root
Jan 26 08:37:55 compute-1 sudo[191782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljhkbfsatiashaqkhelwlmslfqmxotas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416674.721882-958-9233120990915/AnsiballZ_container_config_hash.py'
Jan 26 08:37:55 compute-1 sudo[191782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:37:55 compute-1 python3.9[191784]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 26 08:37:55 compute-1 sudo[191782]: pam_unix(sudo:session): session closed for user root
Jan 26 08:37:56 compute-1 sudo[191934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkktcdxvcqgxsxjqkqaktmbjphfvuuvc ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769416675.817488-978-53078139245976/AnsiballZ_edpm_container_manage.py'
Jan 26 08:37:56 compute-1 sudo[191934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:37:56 compute-1 python3[191936]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_id=ceilometer_agent_compute config_overrides={} config_patterns=*.json containers=['ceilometer_agent_compute'] log_base_path=/var/log/containers/stdouts debug=False
Jan 26 08:37:56 compute-1 podman[191970]: 2026-01-26 08:37:56.924734403 +0000 UTC m=+0.069521832 container create 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 08:37:56 compute-1 podman[191970]: 2026-01-26 08:37:56.883892417 +0000 UTC m=+0.028679866 image pull 806262ad9f61127734555408f71447afe6ceede79cc666e6f523dacd5edec739 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Jan 26 08:37:56 compute-1 python3[191936]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8 --healthcheck-command /openstack/healthcheck compute --label config_id=ceilometer_agent_compute --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Jan 26 08:37:57 compute-1 sudo[191934]: pam_unix(sudo:session): session closed for user root
Jan 26 08:37:57 compute-1 sudo[192158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfnnlrpoqkgtvazsnxsxpbwpyebswxra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416677.411155-994-106337279205333/AnsiballZ_stat.py'
Jan 26 08:37:57 compute-1 sudo[192158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:37:57 compute-1 python3.9[192160]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:37:58 compute-1 sudo[192158]: pam_unix(sudo:session): session closed for user root
Jan 26 08:37:58 compute-1 sudo[192312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqqamzmwkjkcxnwczovgfcmvrhrxkyce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416678.3294983-1012-132721306580930/AnsiballZ_file.py'
Jan 26 08:37:58 compute-1 sudo[192312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:37:58 compute-1 python3.9[192314]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:37:58 compute-1 sudo[192312]: pam_unix(sudo:session): session closed for user root
Jan 26 08:37:59 compute-1 sudo[192388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfwjjbqicnwazadsebegmlkmocdzupue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416678.3294983-1012-132721306580930/AnsiballZ_stat.py'
Jan 26 08:37:59 compute-1 sudo[192388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:37:59 compute-1 python3.9[192390]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:37:59 compute-1 sudo[192388]: pam_unix(sudo:session): session closed for user root
Jan 26 08:37:59 compute-1 sudo[192539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjypoqpcprdtktdoxtjfevziijccprvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416679.4903145-1012-255927551611965/AnsiballZ_copy.py'
Jan 26 08:37:59 compute-1 sudo[192539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:00 compute-1 python3.9[192541]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769416679.4903145-1012-255927551611965/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:38:00 compute-1 sudo[192539]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:00 compute-1 sudo[192615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdisqvjcnzsyrdklyfoxjpoavaowuzct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416679.4903145-1012-255927551611965/AnsiballZ_systemd.py'
Jan 26 08:38:00 compute-1 sudo[192615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:00 compute-1 python3.9[192617]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 08:38:00 compute-1 systemd[1]: Reloading.
Jan 26 08:38:01 compute-1 systemd-sysv-generator[192647]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:38:01 compute-1 systemd-rc-local-generator[192644]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:38:01 compute-1 sudo[192615]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:01 compute-1 sudo[192725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpzayjzlcubmbujmjmgpdusnqibvpoaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416679.4903145-1012-255927551611965/AnsiballZ_systemd.py'
Jan 26 08:38:01 compute-1 sudo[192725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:01 compute-1 python3.9[192727]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:38:02 compute-1 systemd[1]: Reloading.
Jan 26 08:38:02 compute-1 systemd-sysv-generator[192756]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:38:02 compute-1 systemd-rc-local-generator[192753]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:38:02 compute-1 systemd[1]: Starting ceilometer_agent_compute container...
Jan 26 08:38:02 compute-1 systemd[1]: Started libcrun container.
Jan 26 08:38:02 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f5324227cbfbce08ef337bb242183f76452da06f30a921cbce53481259eb037e/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 26 08:38:02 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f5324227cbfbce08ef337bb242183f76452da06f30a921cbce53481259eb037e/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Jan 26 08:38:02 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f5324227cbfbce08ef337bb242183f76452da06f30a921cbce53481259eb037e/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Jan 26 08:38:02 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f5324227cbfbce08ef337bb242183f76452da06f30a921cbce53481259eb037e/merged/var/lib/kolla/config_files/src supports timestamps until 2038 (0x7fffffff)
Jan 26 08:38:02 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67.
Jan 26 08:38:02 compute-1 podman[192768]: 2026-01-26 08:38:02.545471653 +0000 UTC m=+0.148404750 container init 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 26 08:38:02 compute-1 ceilometer_agent_compute[192784]: + sudo -E kolla_set_configs
Jan 26 08:38:02 compute-1 ceilometer_agent_compute[192784]: sudo: unable to send audit message: Operation not permitted
Jan 26 08:38:02 compute-1 sudo[192790]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Jan 26 08:38:02 compute-1 sudo[192790]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Jan 26 08:38:02 compute-1 sudo[192790]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Jan 26 08:38:02 compute-1 podman[192768]: 2026-01-26 08:38:02.578947535 +0000 UTC m=+0.181880642 container start 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 26 08:38:02 compute-1 podman[192768]: ceilometer_agent_compute
Jan 26 08:38:02 compute-1 systemd[1]: Started ceilometer_agent_compute container.
Jan 26 08:38:02 compute-1 ceilometer_agent_compute[192784]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 26 08:38:02 compute-1 ceilometer_agent_compute[192784]: INFO:__main__:Validating config file
Jan 26 08:38:02 compute-1 ceilometer_agent_compute[192784]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 26 08:38:02 compute-1 ceilometer_agent_compute[192784]: INFO:__main__:Copying service configuration files
Jan 26 08:38:02 compute-1 ceilometer_agent_compute[192784]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Jan 26 08:38:02 compute-1 ceilometer_agent_compute[192784]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Jan 26 08:38:02 compute-1 ceilometer_agent_compute[192784]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Jan 26 08:38:02 compute-1 ceilometer_agent_compute[192784]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Jan 26 08:38:02 compute-1 ceilometer_agent_compute[192784]: INFO:__main__:Copying /var/lib/kolla/config_files/src/polling.yaml to /etc/ceilometer/polling.yaml
Jan 26 08:38:02 compute-1 ceilometer_agent_compute[192784]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Jan 26 08:38:02 compute-1 ceilometer_agent_compute[192784]: INFO:__main__:Copying /var/lib/kolla/config_files/src/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Jan 26 08:38:02 compute-1 ceilometer_agent_compute[192784]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Jan 26 08:38:02 compute-1 ceilometer_agent_compute[192784]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Jan 26 08:38:02 compute-1 ceilometer_agent_compute[192784]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Jan 26 08:38:02 compute-1 ceilometer_agent_compute[192784]: INFO:__main__:Writing out command to execute
Jan 26 08:38:02 compute-1 sudo[192790]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:02 compute-1 ceilometer_agent_compute[192784]: ++ cat /run_command
Jan 26 08:38:02 compute-1 ceilometer_agent_compute[192784]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Jan 26 08:38:02 compute-1 ceilometer_agent_compute[192784]: + ARGS=
Jan 26 08:38:02 compute-1 ceilometer_agent_compute[192784]: + sudo kolla_copy_cacerts
Jan 26 08:38:02 compute-1 sudo[192725]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:02 compute-1 podman[192791]: 2026-01-26 08:38:02.647227603 +0000 UTC m=+0.053711065 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Jan 26 08:38:02 compute-1 systemd[1]: 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67-e191ebfdaca0ff9.service: Main process exited, code=exited, status=1/FAILURE
Jan 26 08:38:02 compute-1 systemd[1]: 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67-e191ebfdaca0ff9.service: Failed with result 'exit-code'.
Jan 26 08:38:02 compute-1 sudo[192813]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Jan 26 08:38:02 compute-1 ceilometer_agent_compute[192784]: sudo: unable to send audit message: Operation not permitted
Jan 26 08:38:02 compute-1 sudo[192813]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Jan 26 08:38:02 compute-1 sudo[192813]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Jan 26 08:38:02 compute-1 sudo[192813]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:02 compute-1 ceilometer_agent_compute[192784]: + [[ ! -n '' ]]
Jan 26 08:38:02 compute-1 ceilometer_agent_compute[192784]: + . kolla_extend_start
Jan 26 08:38:02 compute-1 ceilometer_agent_compute[192784]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Jan 26 08:38:02 compute-1 ceilometer_agent_compute[192784]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Jan 26 08:38:02 compute-1 ceilometer_agent_compute[192784]: + umask 0022
Jan 26 08:38:02 compute-1 ceilometer_agent_compute[192784]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.481 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.482 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.482 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.482 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.482 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.482 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.482 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.482 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.482 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.483 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.483 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.483 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.483 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.483 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.483 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.483 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.483 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.483 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.484 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.484 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.484 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.484 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.484 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.484 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.484 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.484 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.484 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.484 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.485 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.485 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.485 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.485 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.485 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.485 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.485 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.485 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.485 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.485 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.485 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.486 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.486 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.486 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.486 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.486 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.486 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.486 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.486 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.486 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.486 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.486 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.486 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.487 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.487 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.487 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.487 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.487 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.487 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.487 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.487 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.487 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.487 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.487 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.488 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.488 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.488 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.488 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.488 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.488 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.488 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.488 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.488 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.489 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.489 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.489 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.489 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.489 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.489 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.489 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.489 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.489 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.489 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.490 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.490 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.490 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.490 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.490 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.490 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.490 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.490 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.490 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.491 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.491 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.491 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.491 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.491 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.491 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.491 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.491 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.491 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.491 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.491 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.492 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.492 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.492 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.492 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.492 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.492 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.492 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.492 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.492 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.492 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.493 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.493 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.493 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.493 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.493 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.493 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.493 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.493 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.493 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.493 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.494 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.494 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.494 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.494 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.494 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.494 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.494 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.494 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.494 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.494 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.494 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.495 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.495 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.495 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.495 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.495 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.495 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.495 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.495 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.495 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.495 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.496 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.496 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.496 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.496 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.496 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.496 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.496 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.496 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.496 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.496 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.496 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.497 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.497 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.497 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.497 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.497 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.497 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.497 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.497 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.497 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.497 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.497 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.498 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.519 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.521 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.522 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Jan 26 08:38:03 compute-1 python3.9[192965]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.630 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.704 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.704 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.704 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.704 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.705 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.705 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.705 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.705 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.705 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.705 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.705 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.705 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.705 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.705 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.706 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.706 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.706 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.706 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.706 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.706 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.706 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.706 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.706 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.706 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.707 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.707 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.707 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.707 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.707 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.707 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.707 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.707 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.707 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.707 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.707 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.707 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.707 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.708 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.708 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.708 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.708 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.708 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.708 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.708 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.708 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.708 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.708 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.708 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.709 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.709 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.709 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.709 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.709 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.709 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.709 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.709 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.709 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.709 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.709 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.710 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.710 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.710 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.710 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.710 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.710 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.710 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.710 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.710 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.710 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.710 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.711 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.711 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.711 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.711 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.711 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.711 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.711 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.711 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.711 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.711 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.711 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.711 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.712 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.712 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.712 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.712 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.712 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.712 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.712 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.712 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.712 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.712 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.712 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.713 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.713 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.713 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.713 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.713 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.713 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.713 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.713 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.713 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.713 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.713 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.714 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.714 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.714 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.714 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.714 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.714 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.714 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.714 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.714 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.714 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.715 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.715 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.715 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.715 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.715 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.715 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.715 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.715 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.715 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.715 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.715 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.715 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.716 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.716 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.716 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.716 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.716 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.716 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.716 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.716 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.716 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.716 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.716 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.717 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.717 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.717 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.717 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.717 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.717 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.717 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.717 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.717 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.717 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.717 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.717 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.718 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.718 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.718 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.718 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.718 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.718 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.718 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.718 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.718 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.718 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.718 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.718 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.718 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.719 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.719 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.719 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.719 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.719 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.719 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.719 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.719 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.719 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.719 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.719 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.720 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.720 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.720 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.720 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.720 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.720 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.720 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.720 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.720 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.720 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.720 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.720 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.721 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.721 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.721 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.721 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.721 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.721 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.721 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.721 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.721 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.721 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.721 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.722 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.722 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.722 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.722 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.722 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.722 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.722 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.722 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.722 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.722 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.723 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.723 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.723 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.723 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.723 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.723 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.723 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.723 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.723 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.723 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.723 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.723 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.723 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.727 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.736 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.742 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.742 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.742 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.742 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.742 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.743 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.743 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.743 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.743 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.743 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.743 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.744 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.744 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.744 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.744 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.744 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.745 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.745 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.745 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.745 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:38:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:38:03.746 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:38:04 compute-1 sudo[193121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtmadsltpthonpriebevatdrfinugfbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416684.204585-1102-209835881675115/AnsiballZ_stat.py'
Jan 26 08:38:04 compute-1 sudo[193121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:04 compute-1 python3.9[193123]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:38:04 compute-1 sudo[193121]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:05 compute-1 sudo[193246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmbcmfyyzwvffxmwvomejnlkmvbjnzbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416684.204585-1102-209835881675115/AnsiballZ_copy.py'
Jan 26 08:38:05 compute-1 sudo[193246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:38:05.289 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:38:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:38:05.291 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:38:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:38:05.291 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:38:05 compute-1 python3.9[193248]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769416684.204585-1102-209835881675115/.source.yaml _original_basename=.82ffisr9 follow=False checksum=9d20f78fa248fa64eed2a627e2e5d1605d9af449 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:38:05 compute-1 sudo[193246]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:06 compute-1 sudo[193398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyftcyfcfzkswsihcvtqcrwgxnzdammu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416685.6796134-1132-98649978683550/AnsiballZ_stat.py'
Jan 26 08:38:06 compute-1 sudo[193398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:06 compute-1 python3.9[193400]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:38:06 compute-1 sudo[193398]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:06 compute-1 sudo[193521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tszvxaxznwmtgnbzvswueddmdgvlaigl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416685.6796134-1132-98649978683550/AnsiballZ_copy.py'
Jan 26 08:38:06 compute-1 sudo[193521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:06 compute-1 python3.9[193523]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769416685.6796134-1132-98649978683550/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:38:06 compute-1 sudo[193521]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:07 compute-1 sudo[193673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iybgvebcpidsspvijyozxqmzmncrrtls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416687.4299903-1174-127527137455423/AnsiballZ_file.py'
Jan 26 08:38:07 compute-1 sudo[193673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:08 compute-1 python3.9[193675]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:38:08 compute-1 sudo[193673]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:08 compute-1 sudo[193825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zudbpovgyvfvntimbzdmoftlolibegeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416688.2354162-1190-265510496789651/AnsiballZ_file.py'
Jan 26 08:38:08 compute-1 sudo[193825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:08 compute-1 python3.9[193827]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:38:08 compute-1 sudo[193825]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:09 compute-1 sudo[193990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrevginkiltspzvuhffzbzpsgizyrwzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416689.0986886-1206-73088035336241/AnsiballZ_stat.py'
Jan 26 08:38:09 compute-1 sudo[193990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:09 compute-1 podman[193951]: 2026-01-26 08:38:09.513128182 +0000 UTC m=+0.119458588 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 26 08:38:09 compute-1 python3.9[193998]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:38:09 compute-1 sudo[193990]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:09 compute-1 sudo[194081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enfncslbvixzhstawobxpxlbbmelbvuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416689.0986886-1206-73088035336241/AnsiballZ_file.py'
Jan 26 08:38:09 compute-1 sudo[194081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:10 compute-1 python3.9[194083]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=._c9ru9q7 recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:38:10 compute-1 sudo[194081]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:10 compute-1 python3.9[194233]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/node_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:38:12 compute-1 podman[194478]: 2026-01-26 08:38:12.187203688 +0000 UTC m=+0.109858582 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 26 08:38:13 compute-1 sudo[194673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thrgitakexrtxkasklbsagkogyxegphx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416692.7595582-1280-133036454784601/AnsiballZ_container_config_data.py'
Jan 26 08:38:13 compute-1 sudo[194673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:13 compute-1 python3.9[194675]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/node_exporter config_pattern=*.json debug=False
Jan 26 08:38:13 compute-1 sudo[194673]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:14 compute-1 sudo[194825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpduzxnlpkgafsajnvodgssrfgnddcmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416693.7976024-1302-63069574627538/AnsiballZ_container_config_hash.py'
Jan 26 08:38:14 compute-1 sudo[194825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:14 compute-1 python3.9[194827]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 26 08:38:14 compute-1 sudo[194825]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:15 compute-1 sudo[194977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzgjhtnplsfwfwyzzguwvaorxcnlmfec ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769416694.7207758-1322-219795753734124/AnsiballZ_edpm_container_manage.py'
Jan 26 08:38:15 compute-1 sudo[194977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:15 compute-1 python3[194979]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/node_exporter config_id=node_exporter config_overrides={} config_patterns=*.json containers=['node_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 26 08:38:15 compute-1 podman[195017]: 2026-01-26 08:38:15.683854382 +0000 UTC m=+0.076744353 container create 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=node_exporter, container_name=node_exporter)
Jan 26 08:38:15 compute-1 podman[195017]: 2026-01-26 08:38:15.647853546 +0000 UTC m=+0.040743567 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Jan 26 08:38:15 compute-1 python3[194979]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8 --healthcheck-command /openstack/healthcheck node_exporter --label config_id=node_exporter --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Jan 26 08:38:15 compute-1 sudo[194977]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:16 compute-1 sudo[195206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jenuaqgwykeuwignpypmjnsazfskdady ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416696.1258419-1338-86372643120801/AnsiballZ_stat.py'
Jan 26 08:38:16 compute-1 sudo[195206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:16 compute-1 python3.9[195208]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:38:16 compute-1 sudo[195206]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:17 compute-1 sudo[195360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkgdcqqqzdkmnmsgyxwxbmlcwupahsoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416697.0490031-1356-177552640745811/AnsiballZ_file.py'
Jan 26 08:38:17 compute-1 sudo[195360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:17 compute-1 python3.9[195362]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:38:17 compute-1 sudo[195360]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:17 compute-1 sudo[195436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfmcslptandqyomxbzwbalszryicwpmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416697.0490031-1356-177552640745811/AnsiballZ_stat.py'
Jan 26 08:38:17 compute-1 sudo[195436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:18 compute-1 python3.9[195438]: ansible-stat Invoked with path=/etc/systemd/system/edpm_node_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:38:18 compute-1 sudo[195436]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:18 compute-1 sudo[195587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljiinpoaygtkjsactvkrbhfcnakdjxqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416698.1461232-1356-195333660706853/AnsiballZ_copy.py'
Jan 26 08:38:18 compute-1 sudo[195587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:18 compute-1 python3.9[195589]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769416698.1461232-1356-195333660706853/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:38:18 compute-1 sudo[195587]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:19 compute-1 sudo[195663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghtnrcyzhcwvzunovmiwjrduudzbtcpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416698.1461232-1356-195333660706853/AnsiballZ_systemd.py'
Jan 26 08:38:19 compute-1 sudo[195663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:19 compute-1 python3.9[195665]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 08:38:19 compute-1 systemd[1]: Reloading.
Jan 26 08:38:19 compute-1 systemd-rc-local-generator[195692]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:38:19 compute-1 systemd-sysv-generator[195698]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:38:19 compute-1 sudo[195663]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:20 compute-1 sudo[195774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdsafimmcrbgxvszonducohlprlyafok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416698.1461232-1356-195333660706853/AnsiballZ_systemd.py'
Jan 26 08:38:20 compute-1 sudo[195774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:20 compute-1 python3.9[195776]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:38:20 compute-1 systemd[1]: Reloading.
Jan 26 08:38:20 compute-1 systemd-rc-local-generator[195807]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:38:20 compute-1 systemd-sysv-generator[195811]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:38:20 compute-1 systemd[1]: Starting node_exporter container...
Jan 26 08:38:21 compute-1 systemd[1]: Started libcrun container.
Jan 26 08:38:21 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca76e15d9f27367dbce556785c1162a8e02abc968ea7bf08182f269a32dab1b6/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 26 08:38:21 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca76e15d9f27367dbce556785c1162a8e02abc968ea7bf08182f269a32dab1b6/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 26 08:38:21 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4.
Jan 26 08:38:21 compute-1 podman[195816]: 2026-01-26 08:38:21.075944687 +0000 UTC m=+0.157613834 container init 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.096Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.096Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.096Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.097Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.097Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.097Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.097Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.098Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.098Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.098Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.098Z caller=node_exporter.go:117 level=info collector=arp
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.098Z caller=node_exporter.go:117 level=info collector=bcache
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.098Z caller=node_exporter.go:117 level=info collector=bonding
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.098Z caller=node_exporter.go:117 level=info collector=btrfs
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.098Z caller=node_exporter.go:117 level=info collector=conntrack
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.098Z caller=node_exporter.go:117 level=info collector=cpu
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.098Z caller=node_exporter.go:117 level=info collector=cpufreq
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.098Z caller=node_exporter.go:117 level=info collector=diskstats
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.098Z caller=node_exporter.go:117 level=info collector=edac
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.098Z caller=node_exporter.go:117 level=info collector=fibrechannel
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.098Z caller=node_exporter.go:117 level=info collector=filefd
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.098Z caller=node_exporter.go:117 level=info collector=filesystem
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.098Z caller=node_exporter.go:117 level=info collector=infiniband
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.099Z caller=node_exporter.go:117 level=info collector=ipvs
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.099Z caller=node_exporter.go:117 level=info collector=loadavg
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.099Z caller=node_exporter.go:117 level=info collector=mdadm
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.099Z caller=node_exporter.go:117 level=info collector=meminfo
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.099Z caller=node_exporter.go:117 level=info collector=netclass
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.099Z caller=node_exporter.go:117 level=info collector=netdev
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.099Z caller=node_exporter.go:117 level=info collector=netstat
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.099Z caller=node_exporter.go:117 level=info collector=nfs
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.099Z caller=node_exporter.go:117 level=info collector=nfsd
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.099Z caller=node_exporter.go:117 level=info collector=nvme
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.099Z caller=node_exporter.go:117 level=info collector=schedstat
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.099Z caller=node_exporter.go:117 level=info collector=sockstat
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.099Z caller=node_exporter.go:117 level=info collector=softnet
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.099Z caller=node_exporter.go:117 level=info collector=systemd
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.099Z caller=node_exporter.go:117 level=info collector=tapestats
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.099Z caller=node_exporter.go:117 level=info collector=udp_queues
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.099Z caller=node_exporter.go:117 level=info collector=vmstat
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.099Z caller=node_exporter.go:117 level=info collector=xfs
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.099Z caller=node_exporter.go:117 level=info collector=zfs
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.099Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Jan 26 08:38:21 compute-1 node_exporter[195831]: ts=2026-01-26T08:38:21.100Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Jan 26 08:38:21 compute-1 podman[195816]: 2026-01-26 08:38:21.111649978 +0000 UTC m=+0.193319085 container start 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 26 08:38:21 compute-1 podman[195816]: node_exporter
Jan 26 08:38:21 compute-1 systemd[1]: Started node_exporter container.
Jan 26 08:38:21 compute-1 sudo[195774]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:21 compute-1 podman[195840]: 2026-01-26 08:38:21.200065912 +0000 UTC m=+0.067870883 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 26 08:38:22 compute-1 python3.9[196013]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 26 08:38:22 compute-1 sudo[196163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwieyctvbiiotiahbonawjielwywkord ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416702.6733346-1446-229904901941675/AnsiballZ_stat.py'
Jan 26 08:38:22 compute-1 sudo[196163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:23 compute-1 python3.9[196165]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:38:23 compute-1 sudo[196163]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:23 compute-1 sudo[196288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvzdnwknhmngoyfwgejszmavtoxlfftz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416702.6733346-1446-229904901941675/AnsiballZ_copy.py'
Jan 26 08:38:23 compute-1 sudo[196288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:23 compute-1 python3.9[196290]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769416702.6733346-1446-229904901941675/.source.yaml _original_basename=.6x_wf7l7 follow=False checksum=ef4262f8526abb0a59599edb5c4fc7db8625b7ce backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:38:23 compute-1 sudo[196288]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:24 compute-1 sudo[196440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eddemczqnxhkklzkhndcnspyfrafwayc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416704.192635-1476-7654695451034/AnsiballZ_stat.py'
Jan 26 08:38:24 compute-1 sudo[196440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:24 compute-1 python3.9[196442]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:38:24 compute-1 sudo[196440]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:25 compute-1 sudo[196563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vppsphugzwxfosbefjxmjjuvlhzqmwby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416704.192635-1476-7654695451034/AnsiballZ_copy.py'
Jan 26 08:38:25 compute-1 sudo[196563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:25 compute-1 python3.9[196565]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769416704.192635-1476-7654695451034/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:38:25 compute-1 sudo[196563]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:26 compute-1 sudo[196715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjpqjvxtlotjzopnxapeifbtabskokut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416706.1161437-1518-202675444454634/AnsiballZ_file.py'
Jan 26 08:38:26 compute-1 sudo[196715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:26 compute-1 python3.9[196717]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:38:26 compute-1 sudo[196715]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:27 compute-1 sudo[196867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgamfnlhqlefbvqhedlbelqjdecowwuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416706.9764042-1534-280357449742853/AnsiballZ_file.py'
Jan 26 08:38:27 compute-1 sudo[196867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:27 compute-1 python3.9[196869]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:38:27 compute-1 sudo[196867]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:28 compute-1 sudo[197019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awyedwgrftfplgkzjjnvpfkapsemfmhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416707.856776-1550-81610028771528/AnsiballZ_stat.py'
Jan 26 08:38:28 compute-1 sudo[197019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:28 compute-1 python3.9[197021]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:38:28 compute-1 sudo[197019]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:28 compute-1 sudo[197097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbjyozrexglizvjkrykgtudnegzoztqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416707.856776-1550-81610028771528/AnsiballZ_file.py'
Jan 26 08:38:28 compute-1 sudo[197097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:29 compute-1 python3.9[197099]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.g04egg5t recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:38:29 compute-1 sudo[197097]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:29 compute-1 python3.9[197249]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/podman_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:38:30 compute-1 sshd-session[197379]: Connection closed by authenticating user root 159.223.236.81 port 51964 [preauth]
Jan 26 08:38:32 compute-1 sudo[197672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsvttdgtagxshhzermkqxekehmmxpmff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416711.7909856-1624-78921454998469/AnsiballZ_container_config_data.py'
Jan 26 08:38:32 compute-1 sudo[197672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:32 compute-1 python3.9[197674]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/podman_exporter config_pattern=*.json debug=False
Jan 26 08:38:32 compute-1 sudo[197672]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:32 compute-1 podman[197699]: 2026-01-26 08:38:32.849101381 +0000 UTC m=+0.106678601 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 26 08:38:32 compute-1 systemd[1]: 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67-e191ebfdaca0ff9.service: Main process exited, code=exited, status=1/FAILURE
Jan 26 08:38:32 compute-1 systemd[1]: 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67-e191ebfdaca0ff9.service: Failed with result 'exit-code'.
Jan 26 08:38:33 compute-1 sudo[197843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llcynunsvlgiypafioahqxrcpyjnauin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416712.8256793-1646-184864295026775/AnsiballZ_container_config_hash.py'
Jan 26 08:38:33 compute-1 sudo[197843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:33 compute-1 python3.9[197845]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 26 08:38:33 compute-1 sudo[197843]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:34 compute-1 sudo[197995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awzmpxjjcyginxdqsqlasyspkiyvigdk ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769416713.7681224-1666-16191361866743/AnsiballZ_edpm_container_manage.py'
Jan 26 08:38:34 compute-1 sudo[197995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:34 compute-1 python3[197997]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/podman_exporter config_id=podman_exporter config_overrides={} config_patterns=*.json containers=['podman_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 26 08:38:35 compute-1 podman[198009]: 2026-01-26 08:38:35.785955105 +0000 UTC m=+1.273270602 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Jan 26 08:38:35 compute-1 podman[198106]: 2026-01-26 08:38:35.943897944 +0000 UTC m=+0.063175984 container create d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 08:38:35 compute-1 podman[198106]: 2026-01-26 08:38:35.90358134 +0000 UTC m=+0.022859460 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Jan 26 08:38:35 compute-1 python3[197997]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env CONTAINER_HOST=unix:///run/podman/podman.sock --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8 --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=podman_exporter --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Jan 26 08:38:36 compute-1 sudo[197995]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:36 compute-1 sudo[198293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfbbbuybwsrjiirixieatzfhigbkbcpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416716.3745503-1682-169268965609339/AnsiballZ_stat.py'
Jan 26 08:38:36 compute-1 sudo[198293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:36 compute-1 python3.9[198295]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:38:36 compute-1 sudo[198293]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:37 compute-1 sudo[198447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecppdueqwettelzsqzhbgwshaasxvwqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416717.2818167-1700-110788491805411/AnsiballZ_file.py'
Jan 26 08:38:37 compute-1 sudo[198447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:37 compute-1 python3.9[198449]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:38:37 compute-1 sudo[198447]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:38 compute-1 sudo[198523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qllqmnqkjasurakuypdiksdwwmcekebh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416717.2818167-1700-110788491805411/AnsiballZ_stat.py'
Jan 26 08:38:38 compute-1 sudo[198523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:38 compute-1 python3.9[198525]: ansible-stat Invoked with path=/etc/systemd/system/edpm_podman_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:38:38 compute-1 sudo[198523]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:38 compute-1 sudo[198674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcdidibbcyifterlvswdencnnkhgocwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416718.3670938-1700-79475532107268/AnsiballZ_copy.py'
Jan 26 08:38:38 compute-1 sudo[198674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:39 compute-1 python3.9[198676]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769416718.3670938-1700-79475532107268/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:38:39 compute-1 sudo[198674]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:39 compute-1 sudo[198750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxkzajjuohgjqwrllijmfyooxxhieett ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416718.3670938-1700-79475532107268/AnsiballZ_systemd.py'
Jan 26 08:38:39 compute-1 sudo[198750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:39 compute-1 python3.9[198752]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 08:38:39 compute-1 systemd[1]: Reloading.
Jan 26 08:38:39 compute-1 systemd-rc-local-generator[198795]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:38:39 compute-1 systemd-sysv-generator[198798]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:38:39 compute-1 podman[198754]: 2026-01-26 08:38:39.851924609 +0000 UTC m=+0.127076955 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 26 08:38:40 compute-1 sudo[198750]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:40 compute-1 sudo[198888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbywfkirkmagsvzndtsmqhewhsgoysws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416718.3670938-1700-79475532107268/AnsiballZ_systemd.py'
Jan 26 08:38:40 compute-1 sudo[198888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:40 compute-1 python3.9[198890]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:38:40 compute-1 systemd[1]: Reloading.
Jan 26 08:38:40 compute-1 systemd-rc-local-generator[198919]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:38:40 compute-1 systemd-sysv-generator[198923]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:38:41 compute-1 systemd[1]: Starting podman_exporter container...
Jan 26 08:38:41 compute-1 systemd[1]: Started libcrun container.
Jan 26 08:38:41 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5eb951e6a295ecc17d0e01d8097b594d80c3219785c15f3ecf90112a0addbf6d/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 26 08:38:41 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5eb951e6a295ecc17d0e01d8097b594d80c3219785c15f3ecf90112a0addbf6d/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 26 08:38:41 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364.
Jan 26 08:38:41 compute-1 podman[198930]: 2026-01-26 08:38:41.211052993 +0000 UTC m=+0.136330075 container init d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 08:38:41 compute-1 podman_exporter[198945]: ts=2026-01-26T08:38:41.228Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Jan 26 08:38:41 compute-1 podman_exporter[198945]: ts=2026-01-26T08:38:41.228Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Jan 26 08:38:41 compute-1 podman_exporter[198945]: ts=2026-01-26T08:38:41.228Z caller=handler.go:94 level=info msg="enabled collectors"
Jan 26 08:38:41 compute-1 podman_exporter[198945]: ts=2026-01-26T08:38:41.228Z caller=handler.go:105 level=info collector=container
Jan 26 08:38:41 compute-1 podman[198930]: 2026-01-26 08:38:41.245747812 +0000 UTC m=+0.171024864 container start d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 08:38:41 compute-1 podman[198930]: podman_exporter
Jan 26 08:38:41 compute-1 systemd[1]: Starting Podman API Service...
Jan 26 08:38:41 compute-1 systemd[1]: Started podman_exporter container.
Jan 26 08:38:41 compute-1 systemd[1]: Started Podman API Service.
Jan 26 08:38:41 compute-1 podman[198956]: time="2026-01-26T08:38:41Z" level=info msg="/usr/bin/podman filtering at log level info"
Jan 26 08:38:41 compute-1 podman[198956]: time="2026-01-26T08:38:41Z" level=info msg="Setting parallel job count to 25"
Jan 26 08:38:41 compute-1 podman[198956]: time="2026-01-26T08:38:41Z" level=info msg="Using sqlite as database backend"
Jan 26 08:38:41 compute-1 podman[198956]: time="2026-01-26T08:38:41Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Jan 26 08:38:41 compute-1 podman[198956]: time="2026-01-26T08:38:41Z" level=info msg="Using systemd socket activation to determine API endpoint"
Jan 26 08:38:41 compute-1 podman[198956]: time="2026-01-26T08:38:41Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Jan 26 08:38:41 compute-1 podman[198956]: @ - - [26/Jan/2026:08:38:41 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Jan 26 08:38:41 compute-1 podman[198956]: time="2026-01-26T08:38:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 08:38:41 compute-1 sudo[198888]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:41 compute-1 podman[198956]: @ - - [26/Jan/2026:08:38:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 18073 "" "Go-http-client/1.1"
Jan 26 08:38:41 compute-1 podman_exporter[198945]: ts=2026-01-26T08:38:41.351Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Jan 26 08:38:41 compute-1 podman_exporter[198945]: ts=2026-01-26T08:38:41.352Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Jan 26 08:38:41 compute-1 podman_exporter[198945]: ts=2026-01-26T08:38:41.353Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Jan 26 08:38:41 compute-1 podman[198955]: 2026-01-26 08:38:41.359984916 +0000 UTC m=+0.092856580 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 08:38:42 compute-1 python3.9[199139]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 26 08:38:42 compute-1 podman[199170]: 2026-01-26 08:38:42.869494385 +0000 UTC m=+0.125226589 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 26 08:38:43 compute-1 sudo[199307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfsefuepfcvobnayuklwpcwfdwfmnxfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416722.7523258-1790-94515734489517/AnsiballZ_stat.py'
Jan 26 08:38:43 compute-1 sudo[199307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:43 compute-1 python3.9[199309]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:38:43 compute-1 sudo[199307]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:43 compute-1 sudo[199432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-feobxtfqqmqulbfaxlnjatdlttqlzfvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416722.7523258-1790-94515734489517/AnsiballZ_copy.py'
Jan 26 08:38:43 compute-1 sudo[199432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:43 compute-1 python3.9[199434]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769416722.7523258-1790-94515734489517/.source.yaml _original_basename=.8bai2zpr follow=False checksum=681acba2f40e39935fadf944ca9b45ff68d44028 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:38:43 compute-1 sudo[199432]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:44 compute-1 sudo[199584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpkrhrbdzyutdkoptnvbuirydqjmcxds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416724.3938048-1820-108956073636419/AnsiballZ_stat.py'
Jan 26 08:38:44 compute-1 sudo[199584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:44 compute-1 python3.9[199586]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:38:45 compute-1 sudo[199584]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:45 compute-1 sudo[199707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adufqwbereckrlhafxmsqgkdteothtly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416724.3938048-1820-108956073636419/AnsiballZ_copy.py'
Jan 26 08:38:45 compute-1 sudo[199707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:45 compute-1 python3.9[199709]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769416724.3938048-1820-108956073636419/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:38:45 compute-1 sudo[199707]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:48 compute-1 sudo[199859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzoyumvyghzqeoptywddxvoukihqpmwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416728.3361847-1862-11576156665877/AnsiballZ_file.py'
Jan 26 08:38:48 compute-1 sudo[199859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:48 compute-1 python3.9[199861]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:38:48 compute-1 sudo[199859]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:49 compute-1 sudo[200011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwnraehqqdslisvydwcjqtdyhodovcgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416729.1772377-1878-54364754796696/AnsiballZ_file.py'
Jan 26 08:38:49 compute-1 sudo[200011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:49 compute-1 python3.9[200013]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 08:38:49 compute-1 sudo[200011]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:50 compute-1 sudo[200163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjlmbvzzxfopuefrcnqayaipxrknencu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416729.9960306-1894-177299770292338/AnsiballZ_stat.py'
Jan 26 08:38:50 compute-1 sudo[200163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:50 compute-1 python3.9[200165]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:38:50 compute-1 sudo[200163]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:50 compute-1 sudo[200241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijnbatenenwqjzvxyljymjlkqchxjtpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416729.9960306-1894-177299770292338/AnsiballZ_file.py'
Jan 26 08:38:50 compute-1 sudo[200241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:51 compute-1 python3.9[200243]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.97xaem0f recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:38:51 compute-1 sudo[200241]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:51 compute-1 nova_compute[183083]: 2026-01-26 08:38:51.301 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:38:51 compute-1 nova_compute[183083]: 2026-01-26 08:38:51.302 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:38:51 compute-1 nova_compute[183083]: 2026-01-26 08:38:51.324 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:38:51 compute-1 nova_compute[183083]: 2026-01-26 08:38:51.325 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 08:38:51 compute-1 nova_compute[183083]: 2026-01-26 08:38:51.325 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 08:38:51 compute-1 nova_compute[183083]: 2026-01-26 08:38:51.340 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 08:38:51 compute-1 nova_compute[183083]: 2026-01-26 08:38:51.341 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:38:51 compute-1 nova_compute[183083]: 2026-01-26 08:38:51.341 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:38:51 compute-1 podman[200367]: 2026-01-26 08:38:51.818107504 +0000 UTC m=+0.089744790 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 26 08:38:51 compute-1 nova_compute[183083]: 2026-01-26 08:38:51.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:38:51 compute-1 nova_compute[183083]: 2026-01-26 08:38:51.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:38:51 compute-1 nova_compute[183083]: 2026-01-26 08:38:51.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:38:51 compute-1 nova_compute[183083]: 2026-01-26 08:38:51.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 08:38:51 compute-1 nova_compute[183083]: 2026-01-26 08:38:51.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:38:51 compute-1 python3.9[200411]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:38:51 compute-1 nova_compute[183083]: 2026-01-26 08:38:51.989 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:38:51 compute-1 nova_compute[183083]: 2026-01-26 08:38:51.989 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:38:51 compute-1 nova_compute[183083]: 2026-01-26 08:38:51.990 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:38:51 compute-1 nova_compute[183083]: 2026-01-26 08:38:51.990 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 08:38:52 compute-1 nova_compute[183083]: 2026-01-26 08:38:52.193 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:38:52 compute-1 nova_compute[183083]: 2026-01-26 08:38:52.194 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=14021MB free_disk=113.25496292114258GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 08:38:52 compute-1 nova_compute[183083]: 2026-01-26 08:38:52.194 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:38:52 compute-1 nova_compute[183083]: 2026-01-26 08:38:52.194 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:38:52 compute-1 nova_compute[183083]: 2026-01-26 08:38:52.281 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 08:38:52 compute-1 nova_compute[183083]: 2026-01-26 08:38:52.282 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 08:38:52 compute-1 nova_compute[183083]: 2026-01-26 08:38:52.309 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:38:52 compute-1 nova_compute[183083]: 2026-01-26 08:38:52.320 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 0, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:38:52 compute-1 nova_compute[183083]: 2026-01-26 08:38:52.322 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 08:38:52 compute-1 nova_compute[183083]: 2026-01-26 08:38:52.322 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:38:53 compute-1 nova_compute[183083]: 2026-01-26 08:38:53.321 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:38:54 compute-1 sudo[200839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbwtsnkeuromrsxcytosjlqgngvyvxiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416733.8485622-1968-267124114991299/AnsiballZ_container_config_data.py'
Jan 26 08:38:54 compute-1 sudo[200839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:54 compute-1 python3.9[200841]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_pattern=*.json debug=False
Jan 26 08:38:54 compute-1 sudo[200839]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:55 compute-1 sudo[200991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmzdbyhjclehtertzpdaspkpymeqdkcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416735.0317633-1990-108405854383984/AnsiballZ_container_config_hash.py'
Jan 26 08:38:55 compute-1 sudo[200991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:55 compute-1 python3.9[200993]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 26 08:38:55 compute-1 sudo[200991]: pam_unix(sudo:session): session closed for user root
Jan 26 08:38:56 compute-1 sudo[201143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urvuxhvncjsddaujoovgbprnxbfibxxz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769416736.0195441-2010-150169581187000/AnsiballZ_edpm_container_manage.py'
Jan 26 08:38:56 compute-1 sudo[201143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:38:56 compute-1 python3[201145]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_id=openstack_network_exporter config_overrides={} config_patterns=*.json containers=['openstack_network_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 26 08:38:59 compute-1 podman[201159]: 2026-01-26 08:38:59.556227922 +0000 UTC m=+2.829858469 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 26 08:38:59 compute-1 podman[201255]: 2026-01-26 08:38:59.728405723 +0000 UTC m=+0.066269978 container create a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, container_name=openstack_network_exporter, name=ubi9-minimal, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, version=9.6, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Jan 26 08:38:59 compute-1 podman[201255]: 2026-01-26 08:38:59.691517154 +0000 UTC m=+0.029381489 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 26 08:38:59 compute-1 python3[201145]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8 --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=openstack_network_exporter --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume 
/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 26 08:38:59 compute-1 sudo[201143]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:00 compute-1 sudo[201443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qffiezzoufometubbiceoxxmqnvjnsbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416740.1561155-2026-213266071241573/AnsiballZ_stat.py'
Jan 26 08:39:00 compute-1 sudo[201443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:00 compute-1 python3.9[201445]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:39:00 compute-1 sudo[201443]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:01 compute-1 sudo[201597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oahylnfnlamfsuqjkszwgtmjsupdapyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416741.0087178-2044-145325493640370/AnsiballZ_file.py'
Jan 26 08:39:01 compute-1 sudo[201597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:01 compute-1 python3.9[201599]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:39:01 compute-1 sudo[201597]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:01 compute-1 sudo[201673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwnbbgljtgdqjqyhxocqluoxntvqvldp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416741.0087178-2044-145325493640370/AnsiballZ_stat.py'
Jan 26 08:39:01 compute-1 sudo[201673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:02 compute-1 python3.9[201675]: ansible-stat Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:39:02 compute-1 sudo[201673]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:02 compute-1 sudo[201826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phrwasjuzhlnysiarmwiiflktxnfcxkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416742.0804627-2044-132958465848234/AnsiballZ_copy.py'
Jan 26 08:39:02 compute-1 sudo[201826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:02 compute-1 python3.9[201828]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769416742.0804627-2044-132958465848234/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:39:02 compute-1 sudo[201826]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:03 compute-1 sudo[201915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhpwtyigrxvigjcxderikwftwsmtjrjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416742.0804627-2044-132958465848234/AnsiballZ_systemd.py'
Jan 26 08:39:03 compute-1 sudo[201915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:03 compute-1 podman[201876]: 2026-01-26 08:39:03.068833227 +0000 UTC m=+0.074508451 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=3, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 26 08:39:03 compute-1 systemd[1]: 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67-e191ebfdaca0ff9.service: Main process exited, code=exited, status=1/FAILURE
Jan 26 08:39:03 compute-1 systemd[1]: 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67-e191ebfdaca0ff9.service: Failed with result 'exit-code'.
Jan 26 08:39:03 compute-1 python3.9[201923]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 08:39:03 compute-1 systemd[1]: Reloading.
Jan 26 08:39:03 compute-1 systemd-rc-local-generator[201944]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:39:03 compute-1 systemd-sysv-generator[201948]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:39:03 compute-1 sudo[201915]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:03 compute-1 auditd[705]: Audit daemon rotating log files
Jan 26 08:39:04 compute-1 sudo[202032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypqjlfhjhfdllwgtidgxvszhstldymgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416742.0804627-2044-132958465848234/AnsiballZ_systemd.py'
Jan 26 08:39:04 compute-1 sudo[202032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:04 compute-1 python3.9[202034]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 08:39:04 compute-1 systemd[1]: Reloading.
Jan 26 08:39:04 compute-1 systemd-sysv-generator[202067]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 08:39:04 compute-1 systemd-rc-local-generator[202064]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 08:39:04 compute-1 sshd-session[201781]: Invalid user admin from 103.236.95.173 port 33334
Jan 26 08:39:04 compute-1 systemd[1]: Starting openstack_network_exporter container...
Jan 26 08:39:04 compute-1 systemd[1]: Started libcrun container.
Jan 26 08:39:04 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/23e34df86d41a4600559e6ff5114c4f962aa279f1fa4d96e1adf34135b98ba25/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 26 08:39:04 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/23e34df86d41a4600559e6ff5114c4f962aa279f1fa4d96e1adf34135b98ba25/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 26 08:39:04 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/23e34df86d41a4600559e6ff5114c4f962aa279f1fa4d96e1adf34135b98ba25/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 26 08:39:05 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e.
Jan 26 08:39:05 compute-1 podman[202074]: 2026-01-26 08:39:05.02086615 +0000 UTC m=+0.184378966 container init a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350)
Jan 26 08:39:05 compute-1 openstack_network_exporter[202089]: INFO    08:39:05 main.go:48: registering *bridge.Collector
Jan 26 08:39:05 compute-1 openstack_network_exporter[202089]: INFO    08:39:05 main.go:48: registering *coverage.Collector
Jan 26 08:39:05 compute-1 openstack_network_exporter[202089]: INFO    08:39:05 main.go:48: registering *datapath.Collector
Jan 26 08:39:05 compute-1 openstack_network_exporter[202089]: INFO    08:39:05 main.go:48: registering *iface.Collector
Jan 26 08:39:05 compute-1 openstack_network_exporter[202089]: INFO    08:39:05 main.go:48: registering *memory.Collector
Jan 26 08:39:05 compute-1 openstack_network_exporter[202089]: INFO    08:39:05 main.go:55: *ovnnorthd.Collector not registered, metric set not enabled
Jan 26 08:39:05 compute-1 openstack_network_exporter[202089]: INFO    08:39:05 main.go:48: registering *ovn.Collector
Jan 26 08:39:05 compute-1 openstack_network_exporter[202089]: INFO    08:39:05 main.go:55: *ovsdbserver.Collector not registered, metric set not enabled
Jan 26 08:39:05 compute-1 openstack_network_exporter[202089]: INFO    08:39:05 main.go:48: registering *pmd_perf.Collector
Jan 26 08:39:05 compute-1 openstack_network_exporter[202089]: INFO    08:39:05 main.go:48: registering *pmd_rxq.Collector
Jan 26 08:39:05 compute-1 openstack_network_exporter[202089]: INFO    08:39:05 main.go:48: registering *vswitch.Collector
Jan 26 08:39:05 compute-1 openstack_network_exporter[202089]: NOTICE  08:39:05 main.go:76: listening on https://:9105/metrics
Jan 26 08:39:05 compute-1 podman[202074]: 2026-01-26 08:39:05.062870234 +0000 UTC m=+0.226383040 container start a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, version=9.6, distribution-scope=public, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 26 08:39:05 compute-1 podman[202074]: openstack_network_exporter
Jan 26 08:39:05 compute-1 systemd[1]: Started openstack_network_exporter container.
Jan 26 08:39:05 compute-1 sudo[202032]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:05 compute-1 podman[202100]: 2026-01-26 08:39:05.18019661 +0000 UTC m=+0.098251270 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Jan 26 08:39:05 compute-1 sshd-session[201781]: Connection closed by invalid user admin 103.236.95.173 port 33334 [preauth]
Jan 26 08:39:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:39:05.290 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:39:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:39:05.290 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:39:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:39:05.290 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:39:06 compute-1 python3.9[202273]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 26 08:39:07 compute-1 rsyslogd[1006]: imjournal: 1893 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Jan 26 08:39:07 compute-1 sudo[202423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnnxkxeojisunrtaefdwaivchbmllahl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416747.5051095-2134-172875754280460/AnsiballZ_stat.py'
Jan 26 08:39:07 compute-1 sudo[202423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:08 compute-1 python3.9[202425]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:39:08 compute-1 sudo[202423]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:08 compute-1 sudo[202548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygzhtqegjqmrdrszviltmngvzrtkacsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416747.5051095-2134-172875754280460/AnsiballZ_copy.py'
Jan 26 08:39:08 compute-1 sudo[202548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:08 compute-1 python3.9[202550]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769416747.5051095-2134-172875754280460/.source.yaml _original_basename=.vd_4dw8w follow=False checksum=c2f3d07d417f54a6116d1581474ed40dc8c6ffd6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:39:08 compute-1 sudo[202548]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:09 compute-1 sudo[202700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhipmchichqzfswlrqlodcobcenqjrla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416748.9961333-2164-42589658908284/AnsiballZ_find.py'
Jan 26 08:39:09 compute-1 sudo[202700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:09 compute-1 python3.9[202702]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 08:39:09 compute-1 sudo[202700]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:10 compute-1 podman[202727]: 2026-01-26 08:39:10.849216425 +0000 UTC m=+0.109919208 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 26 08:39:11 compute-1 podman[202754]: 2026-01-26 08:39:11.810099891 +0000 UTC m=+0.056793592 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 08:39:13 compute-1 podman[202778]: 2026-01-26 08:39:13.814799347 +0000 UTC m=+0.068986615 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 26 08:39:22 compute-1 podman[202798]: 2026-01-26 08:39:22.825494497 +0000 UTC m=+0.077468093 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 26 08:39:33 compute-1 podman[202826]: 2026-01-26 08:39:33.831348222 +0000 UTC m=+0.083558272 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=4, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 26 08:39:33 compute-1 systemd[1]: 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67-e191ebfdaca0ff9.service: Main process exited, code=exited, status=1/FAILURE
Jan 26 08:39:33 compute-1 systemd[1]: 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67-e191ebfdaca0ff9.service: Failed with result 'exit-code'.
Jan 26 08:39:34 compute-1 sudo[202970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugzbmtdnhrxlbdbvlqwflojzgvfufgpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416773.8501856-2380-152787949765864/AnsiballZ_podman_container_info.py'
Jan 26 08:39:34 compute-1 sudo[202970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:34 compute-1 python3.9[202972]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Jan 26 08:39:34 compute-1 sudo[202970]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:34 compute-1 sshd-session[202824]: Connection closed by authenticating user root 159.223.236.81 port 55218 [preauth]
Jan 26 08:39:35 compute-1 sudo[203135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbtzuciecqsghbzcdmtrbwndhtehsorq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416774.6696875-2388-61750655698857/AnsiballZ_podman_container_exec.py'
Jan 26 08:39:35 compute-1 sudo[203135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:35 compute-1 python3.9[203137]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 26 08:39:35 compute-1 systemd[1]: Started libpod-conmon-17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0.scope.
Jan 26 08:39:35 compute-1 podman[203138]: 2026-01-26 08:39:35.412043481 +0000 UTC m=+0.114127419 container exec 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 26 08:39:35 compute-1 podman[203138]: 2026-01-26 08:39:35.423760951 +0000 UTC m=+0.125844889 container exec_died 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 08:39:35 compute-1 sudo[203135]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:35 compute-1 systemd[1]: libpod-conmon-17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0.scope: Deactivated successfully.
Jan 26 08:39:35 compute-1 podman[203155]: 2026-01-26 08:39:35.522677667 +0000 UTC m=+0.108193136 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 26 08:39:36 compute-1 sudo[203344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubbskrhjafjrehecwyatnkqjmzjstddw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416775.6758792-2396-142246036150901/AnsiballZ_podman_container_exec.py'
Jan 26 08:39:36 compute-1 sudo[203344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:36 compute-1 python3.9[203346]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 26 08:39:36 compute-1 systemd[1]: Started libpod-conmon-17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0.scope.
Jan 26 08:39:36 compute-1 podman[203347]: 2026-01-26 08:39:36.368555706 +0000 UTC m=+0.116316833 container exec 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 08:39:36 compute-1 podman[203366]: 2026-01-26 08:39:36.489330975 +0000 UTC m=+0.107438933 container exec_died 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 08:39:36 compute-1 podman[203347]: 2026-01-26 08:39:36.495971109 +0000 UTC m=+0.243732276 container exec_died 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 08:39:36 compute-1 systemd[1]: libpod-conmon-17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0.scope: Deactivated successfully.
Jan 26 08:39:36 compute-1 sudo[203344]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:36 compute-1 sshd-session[203187]: Invalid user orangepi from 103.236.95.173 port 40900
Jan 26 08:39:36 compute-1 sshd-session[203187]: Connection closed by invalid user orangepi 103.236.95.173 port 40900 [preauth]
Jan 26 08:39:37 compute-1 sudo[203528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bydqmheqfdrkndwcgqeoqbzylyqscsna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416776.7388465-2404-239152068931209/AnsiballZ_file.py'
Jan 26 08:39:37 compute-1 sudo[203528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:37 compute-1 python3.9[203530]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:39:37 compute-1 sudo[203528]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:37 compute-1 sudo[203680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjchneyaydnkgluvlemkjyfiiatebmnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416777.6001968-2413-53311121182120/AnsiballZ_podman_container_info.py'
Jan 26 08:39:37 compute-1 sudo[203680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:38 compute-1 python3.9[203682]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Jan 26 08:39:38 compute-1 sudo[203680]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:38 compute-1 sudo[203845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eertczrkvfrhgkiutufrlvearuqxfhrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416778.542269-2421-166475888547407/AnsiballZ_podman_container_exec.py'
Jan 26 08:39:38 compute-1 sudo[203845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:39 compute-1 python3.9[203847]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 26 08:39:39 compute-1 systemd[1]: Started libpod-conmon-c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed.scope.
Jan 26 08:39:39 compute-1 podman[203848]: 2026-01-26 08:39:39.205203163 +0000 UTC m=+0.096194747 container exec c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 08:39:39 compute-1 podman[203848]: 2026-01-26 08:39:39.242572139 +0000 UTC m=+0.133563753 container exec_died c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 08:39:39 compute-1 systemd[1]: libpod-conmon-c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed.scope: Deactivated successfully.
Jan 26 08:39:39 compute-1 sudo[203845]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:39 compute-1 sudo[204028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlgurzllbkvurkdjjaptvmocxpaszcih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416779.5003428-2429-35080226607099/AnsiballZ_podman_container_exec.py'
Jan 26 08:39:39 compute-1 sudo[204028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:40 compute-1 python3.9[204030]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 26 08:39:40 compute-1 systemd[1]: Started libpod-conmon-c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed.scope.
Jan 26 08:39:40 compute-1 podman[204031]: 2026-01-26 08:39:40.159594856 +0000 UTC m=+0.096060663 container exec c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 26 08:39:40 compute-1 podman[204031]: 2026-01-26 08:39:40.194638625 +0000 UTC m=+0.131104432 container exec_died c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 26 08:39:40 compute-1 systemd[1]: libpod-conmon-c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed.scope: Deactivated successfully.
Jan 26 08:39:40 compute-1 sudo[204028]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:40 compute-1 sudo[204211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lulgwbdarnstgzpeigbzqgsqrqxgabat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416780.4793708-2437-180652935339493/AnsiballZ_file.py'
Jan 26 08:39:40 compute-1 sudo[204211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:41 compute-1 python3.9[204213]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:39:41 compute-1 sudo[204211]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:41 compute-1 sudo[204376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uolguenmijbosiqacnjdmtrvulydlwmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416781.339506-2446-223019072794911/AnsiballZ_podman_container_info.py'
Jan 26 08:39:41 compute-1 sudo[204376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:41 compute-1 podman[204337]: 2026-01-26 08:39:41.760529332 +0000 UTC m=+0.130699330 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 08:39:41 compute-1 python3.9[204385]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Jan 26 08:39:42 compute-1 sudo[204376]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:42 compute-1 sudo[204565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-genckuwgtfgezjmscsyukoomapuwkvia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416782.2322752-2454-128395207475954/AnsiballZ_podman_container_exec.py'
Jan 26 08:39:42 compute-1 sudo[204565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:42 compute-1 podman[204528]: 2026-01-26 08:39:42.643674724 +0000 UTC m=+0.120321998 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 08:39:42 compute-1 python3.9[204583]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 26 08:39:42 compute-1 systemd[1]: Started libpod-conmon-1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67.scope.
Jan 26 08:39:42 compute-1 podman[204584]: 2026-01-26 08:39:42.948850355 +0000 UTC m=+0.106780915 container exec 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 26 08:39:42 compute-1 podman[204584]: 2026-01-26 08:39:42.979808365 +0000 UTC m=+0.137738955 container exec_died 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 26 08:39:43 compute-1 sudo[204565]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:43 compute-1 systemd[1]: libpod-conmon-1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67.scope: Deactivated successfully.
Jan 26 08:39:43 compute-1 sudo[204766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwuyawydygtquwajtjvocqtvxtrzccrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416783.1865394-2462-100135531599611/AnsiballZ_podman_container_exec.py'
Jan 26 08:39:43 compute-1 sudo[204766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:43 compute-1 python3.9[204768]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 26 08:39:43 compute-1 systemd[1]: Started libpod-conmon-1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67.scope.
Jan 26 08:39:43 compute-1 podman[204769]: 2026-01-26 08:39:43.889235121 +0000 UTC m=+0.102749468 container exec 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 08:39:43 compute-1 podman[204769]: 2026-01-26 08:39:43.925686651 +0000 UTC m=+0.139201008 container exec_died 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 26 08:39:43 compute-1 systemd[1]: libpod-conmon-1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67.scope: Deactivated successfully.
Jan 26 08:39:43 compute-1 sudo[204766]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:44 compute-1 podman[204787]: 2026-01-26 08:39:44.023475424 +0000 UTC m=+0.126733635 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 26 08:39:44 compute-1 sudo[204971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnkvnofpanraymcshhvkrailcsohpnsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416784.177441-2470-163003807153690/AnsiballZ_file.py'
Jan 26 08:39:44 compute-1 sudo[204971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:44 compute-1 python3.9[204973]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:39:44 compute-1 sudo[204971]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:45 compute-1 sudo[205123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzzpbgwqkacnyiamqqzqyabbwzpgdoqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416784.9911482-2479-193313371036565/AnsiballZ_podman_container_info.py'
Jan 26 08:39:45 compute-1 sudo[205123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:45 compute-1 python3.9[205125]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Jan 26 08:39:45 compute-1 sudo[205123]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:46 compute-1 sudo[205289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwradbmajsyxljsoyraftcusugtfujlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416785.898698-2487-35643386292362/AnsiballZ_podman_container_exec.py'
Jan 26 08:39:46 compute-1 sudo[205289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:46 compute-1 python3.9[205291]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 26 08:39:46 compute-1 systemd[1]: Started libpod-conmon-56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4.scope.
Jan 26 08:39:46 compute-1 podman[205292]: 2026-01-26 08:39:46.625942943 +0000 UTC m=+0.105007993 container exec 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 26 08:39:46 compute-1 podman[205292]: 2026-01-26 08:39:46.657067648 +0000 UTC m=+0.136132708 container exec_died 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 26 08:39:46 compute-1 systemd[1]: libpod-conmon-56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4.scope: Deactivated successfully.
Jan 26 08:39:46 compute-1 sudo[205289]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:47 compute-1 sudo[205474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsvoytdytjolnqzygljnpdzxogbejhih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416786.9391568-2495-50973211746285/AnsiballZ_podman_container_exec.py'
Jan 26 08:39:47 compute-1 sudo[205474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:47 compute-1 python3.9[205476]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 26 08:39:47 compute-1 systemd[1]: Started libpod-conmon-56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4.scope.
Jan 26 08:39:47 compute-1 podman[205477]: 2026-01-26 08:39:47.783400509 +0000 UTC m=+0.091899952 container exec 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 08:39:47 compute-1 podman[205477]: 2026-01-26 08:39:47.816364708 +0000 UTC m=+0.124864141 container exec_died 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 26 08:39:47 compute-1 systemd[1]: libpod-conmon-56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4.scope: Deactivated successfully.
Jan 26 08:39:47 compute-1 sudo[205474]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:48 compute-1 sudo[205656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uerxorbikbdhdarqppfvdauqbedutthq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416788.2781525-2503-12777874028699/AnsiballZ_file.py'
Jan 26 08:39:48 compute-1 sudo[205656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:48 compute-1 python3.9[205658]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:39:49 compute-1 sudo[205656]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:49 compute-1 sudo[205808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewkkjpfucmufwurledrkiubxcnkzlrxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416789.231491-2512-91077810600667/AnsiballZ_podman_container_info.py'
Jan 26 08:39:49 compute-1 sudo[205808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:49 compute-1 python3.9[205810]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Jan 26 08:39:49 compute-1 sudo[205808]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:50 compute-1 sudo[205973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udqutgvxictzdiroauvmfalwxawkandl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416790.0582256-2520-176709466116782/AnsiballZ_podman_container_exec.py'
Jan 26 08:39:50 compute-1 sudo[205973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:50 compute-1 python3.9[205975]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 26 08:39:50 compute-1 systemd[1]: Started libpod-conmon-d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364.scope.
Jan 26 08:39:50 compute-1 podman[205976]: 2026-01-26 08:39:50.760335125 +0000 UTC m=+0.106806236 container exec d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 08:39:50 compute-1 podman[205976]: 2026-01-26 08:39:50.796818065 +0000 UTC m=+0.143289176 container exec_died d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 08:39:50 compute-1 systemd[1]: libpod-conmon-d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364.scope: Deactivated successfully.
Jan 26 08:39:50 compute-1 sudo[205973]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:51 compute-1 sudo[206159]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgikzvigxireshxavmhiipmitpjjjclo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416791.1432197-2528-31129529275648/AnsiballZ_podman_container_exec.py'
Jan 26 08:39:51 compute-1 sudo[206159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:51 compute-1 python3.9[206161]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 26 08:39:51 compute-1 systemd[1]: Started libpod-conmon-d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364.scope.
Jan 26 08:39:51 compute-1 podman[206162]: 2026-01-26 08:39:51.900457737 +0000 UTC m=+0.176209663 container exec d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 08:39:51 compute-1 nova_compute[183083]: 2026-01-26 08:39:51.947 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:39:51 compute-1 nova_compute[183083]: 2026-01-26 08:39:51.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:39:51 compute-1 nova_compute[183083]: 2026-01-26 08:39:51.951 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 08:39:51 compute-1 nova_compute[183083]: 2026-01-26 08:39:51.951 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 08:39:51 compute-1 nova_compute[183083]: 2026-01-26 08:39:51.965 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 08:39:51 compute-1 nova_compute[183083]: 2026-01-26 08:39:51.965 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:39:51 compute-1 nova_compute[183083]: 2026-01-26 08:39:51.966 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:39:51 compute-1 podman[206182]: 2026-01-26 08:39:51.9903804 +0000 UTC m=+0.071408987 container exec_died d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 08:39:51 compute-1 podman[206162]: 2026-01-26 08:39:51.997545408 +0000 UTC m=+0.273297274 container exec_died d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 08:39:52 compute-1 systemd[1]: libpod-conmon-d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364.scope: Deactivated successfully.
Jan 26 08:39:52 compute-1 sudo[206159]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:52 compute-1 sudo[206344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwtxdvwyrsbqycyzdgoqvdlkghkpjvms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416792.2900083-2536-90215927149510/AnsiballZ_file.py'
Jan 26 08:39:52 compute-1 sudo[206344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:52 compute-1 python3.9[206346]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:39:52 compute-1 sudo[206344]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:52 compute-1 nova_compute[183083]: 2026-01-26 08:39:52.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:39:52 compute-1 nova_compute[183083]: 2026-01-26 08:39:52.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:39:52 compute-1 nova_compute[183083]: 2026-01-26 08:39:52.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:39:52 compute-1 nova_compute[183083]: 2026-01-26 08:39:52.985 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:39:52 compute-1 nova_compute[183083]: 2026-01-26 08:39:52.985 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:39:52 compute-1 nova_compute[183083]: 2026-01-26 08:39:52.985 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:39:52 compute-1 nova_compute[183083]: 2026-01-26 08:39:52.985 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 08:39:53 compute-1 nova_compute[183083]: 2026-01-26 08:39:53.216 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:39:53 compute-1 nova_compute[183083]: 2026-01-26 08:39:53.216 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13999MB free_disk=113.13657760620117GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 08:39:53 compute-1 nova_compute[183083]: 2026-01-26 08:39:53.217 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:39:53 compute-1 nova_compute[183083]: 2026-01-26 08:39:53.217 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:39:53 compute-1 nova_compute[183083]: 2026-01-26 08:39:53.344 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 08:39:53 compute-1 nova_compute[183083]: 2026-01-26 08:39:53.344 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 08:39:53 compute-1 nova_compute[183083]: 2026-01-26 08:39:53.367 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:39:53 compute-1 nova_compute[183083]: 2026-01-26 08:39:53.380 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 0, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:39:53 compute-1 nova_compute[183083]: 2026-01-26 08:39:53.383 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 08:39:53 compute-1 nova_compute[183083]: 2026-01-26 08:39:53.383 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:39:53 compute-1 sudo[206513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aabfvsjwzeipcrhcxvtqpkxjsdkmqmdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416793.199812-2545-76641199314409/AnsiballZ_podman_container_info.py'
Jan 26 08:39:53 compute-1 sudo[206513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:53 compute-1 podman[206470]: 2026-01-26 08:39:53.568386691 +0000 UTC m=+0.115603312 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 26 08:39:53 compute-1 python3.9[206522]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Jan 26 08:39:53 compute-1 sudo[206513]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:54 compute-1 nova_compute[183083]: 2026-01-26 08:39:54.384 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:39:54 compute-1 nova_compute[183083]: 2026-01-26 08:39:54.385 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:39:54 compute-1 nova_compute[183083]: 2026-01-26 08:39:54.385 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 08:39:54 compute-1 sudo[206685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhgxzdfwtfxubmpavffejsiefbuhkooj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416794.1282802-2553-193037137499950/AnsiballZ_podman_container_exec.py'
Jan 26 08:39:54 compute-1 sudo[206685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:54 compute-1 python3.9[206687]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 26 08:39:54 compute-1 systemd[1]: Started libpod-conmon-a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e.scope.
Jan 26 08:39:54 compute-1 podman[206688]: 2026-01-26 08:39:54.886430095 +0000 UTC m=+0.134605994 container exec a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, distribution-scope=public, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, release=1755695350, config_id=openstack_network_exporter)
Jan 26 08:39:54 compute-1 podman[206688]: 2026-01-26 08:39:54.919409453 +0000 UTC m=+0.167585262 container exec_died a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public)
Jan 26 08:39:54 compute-1 sudo[206685]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:54 compute-1 systemd[1]: libpod-conmon-a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e.scope: Deactivated successfully.
Jan 26 08:39:55 compute-1 sudo[206868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcgkogghxhthgijcpjxrjwwjooylvwyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416795.1494565-2561-231423306583234/AnsiballZ_podman_container_exec.py'
Jan 26 08:39:55 compute-1 sudo[206868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:55 compute-1 python3.9[206870]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 26 08:39:55 compute-1 systemd[1]: Started libpod-conmon-a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e.scope.
Jan 26 08:39:55 compute-1 podman[206871]: 2026-01-26 08:39:55.835287536 +0000 UTC m=+0.096926389 container exec a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc.)
Jan 26 08:39:55 compute-1 podman[206871]: 2026-01-26 08:39:55.869532211 +0000 UTC m=+0.131171004 container exec_died a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, distribution-scope=public, version=9.6, io.openshift.expose-services=, architecture=x86_64, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 
'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 26 08:39:55 compute-1 systemd[1]: libpod-conmon-a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e.scope: Deactivated successfully.
Jan 26 08:39:55 compute-1 sudo[206868]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:56 compute-1 sudo[207050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxzaakeebxsmrboqdjiucxaqmwrdfsgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416796.1242938-2569-276680368788140/AnsiballZ_file.py'
Jan 26 08:39:56 compute-1 sudo[207050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:56 compute-1 python3.9[207052]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:39:56 compute-1 sudo[207050]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:57 compute-1 sudo[207202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyssmdeihpjuvrfkauhkvrpollonqyrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416797.0416732-2580-11543003342617/AnsiballZ_file.py'
Jan 26 08:39:57 compute-1 sudo[207202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:57 compute-1 python3.9[207204]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:39:57 compute-1 sudo[207202]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:58 compute-1 sudo[207354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhnqpmjxcduubzqjfobuxfvevgemvwxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416797.8461065-2596-131387448152798/AnsiballZ_stat.py'
Jan 26 08:39:58 compute-1 sudo[207354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:58 compute-1 python3.9[207356]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:39:58 compute-1 sudo[207354]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:58 compute-1 sudo[207477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yobequeniqiootozzjkgrjhaulrivitx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416797.8461065-2596-131387448152798/AnsiballZ_copy.py'
Jan 26 08:39:58 compute-1 sudo[207477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:39:59 compute-1 python3.9[207479]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769416797.8461065-2596-131387448152798/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:39:59 compute-1 sudo[207477]: pam_unix(sudo:session): session closed for user root
Jan 26 08:39:59 compute-1 sudo[207629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-araseqbpuorvnwuykosptisfyeueauhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416799.4220543-2628-105783246785767/AnsiballZ_file.py'
Jan 26 08:39:59 compute-1 sudo[207629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:40:00 compute-1 python3.9[207631]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:40:00 compute-1 sudo[207629]: pam_unix(sudo:session): session closed for user root
Jan 26 08:40:00 compute-1 sudo[207781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmuunffifpjkkznzpueuwwoxrbipiyrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416800.2331824-2644-108704553447025/AnsiballZ_stat.py'
Jan 26 08:40:00 compute-1 sudo[207781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:40:00 compute-1 python3.9[207783]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:40:00 compute-1 sudo[207781]: pam_unix(sudo:session): session closed for user root
Jan 26 08:40:01 compute-1 sudo[207859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfxkoujxqzjplhmrajdaunhhpdvepdhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416800.2331824-2644-108704553447025/AnsiballZ_file.py'
Jan 26 08:40:01 compute-1 sudo[207859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:40:01 compute-1 python3.9[207861]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:40:01 compute-1 sudo[207859]: pam_unix(sudo:session): session closed for user root
Jan 26 08:40:01 compute-1 sudo[208011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfjavzgdzuzutfiraiyssjhpxozpgdkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416801.5391247-2668-118482760006147/AnsiballZ_stat.py'
Jan 26 08:40:01 compute-1 sudo[208011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:40:02 compute-1 python3.9[208013]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:40:02 compute-1 sudo[208011]: pam_unix(sudo:session): session closed for user root
Jan 26 08:40:02 compute-1 sudo[208089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsibhxrkfkffxmykugiykquxrbrmrnaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416801.5391247-2668-118482760006147/AnsiballZ_file.py'
Jan 26 08:40:02 compute-1 sudo[208089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:40:02 compute-1 python3.9[208091]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.1fhnvxpy recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:40:02 compute-1 sudo[208089]: pam_unix(sudo:session): session closed for user root
Jan 26 08:40:03 compute-1 sudo[208241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihwcqcvvhysfsqjcgvciacywxkpsynwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416802.925762-2692-96668785347930/AnsiballZ_stat.py'
Jan 26 08:40:03 compute-1 sudo[208241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:40:03 compute-1 python3.9[208243]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:40:03 compute-1 sudo[208241]: pam_unix(sudo:session): session closed for user root
Jan 26 08:40:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:40:03.737 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:40:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:40:03.738 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:40:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:40:03.738 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:40:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:40:03.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:40:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:40:03.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:40:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:40:03.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:40:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:40:03.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:40:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:40:03.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:40:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:40:03.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:40:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:40:03.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:40:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:40:03.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:40:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:40:03.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:40:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:40:03.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:40:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:40:03.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:40:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:40:03.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:40:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:40:03.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:40:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:40:03.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:40:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:40:03.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:40:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:40:03.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:40:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:40:03.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:40:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:40:03.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:40:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:40:03.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:40:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:40:03.742 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:40:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:40:03.742 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:40:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:40:03.742 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:40:03 compute-1 sudo[208319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgbxjtuxnrbfikgfrwjacgdeocuscdmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416802.925762-2692-96668785347930/AnsiballZ_file.py'
Jan 26 08:40:03 compute-1 sudo[208319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:40:04 compute-1 python3.9[208321]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:40:04 compute-1 sudo[208319]: pam_unix(sudo:session): session closed for user root
Jan 26 08:40:04 compute-1 sudo[208483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwiipqpylptxgfwwibkkhupiywwcrbpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416804.2894268-2718-105800502208993/AnsiballZ_command.py'
Jan 26 08:40:04 compute-1 sudo[208483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:40:04 compute-1 podman[208445]: 2026-01-26 08:40:04.723407962 +0000 UTC m=+0.124274454 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute)
Jan 26 08:40:04 compute-1 python3.9[208491]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:40:04 compute-1 sudo[208483]: pam_unix(sudo:session): session closed for user root
Jan 26 08:40:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:40:05.291 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:40:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:40:05.293 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:40:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:40:05.293 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:40:05 compute-1 sudo[208653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxdiameazwawotyrveekolvjbiciyhgx ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769416805.119746-2734-45449450446512/AnsiballZ_edpm_nftables_from_files.py'
Jan 26 08:40:05 compute-1 sudo[208653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:40:05 compute-1 podman[208617]: 2026-01-26 08:40:05.687476316 +0000 UTC m=+0.089792881 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, name=ubi9-minimal, config_id=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.expose-services=, version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-type=git, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 26 08:40:05 compute-1 python3[208662]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 26 08:40:05 compute-1 sudo[208653]: pam_unix(sudo:session): session closed for user root
Jan 26 08:40:06 compute-1 sudo[208816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgvlsvyxdmzjhpffqaaxtmzwakxmaxoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416806.1125116-2751-218794707346675/AnsiballZ_stat.py'
Jan 26 08:40:06 compute-1 sudo[208816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:40:06 compute-1 python3.9[208818]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:40:06 compute-1 sudo[208816]: pam_unix(sudo:session): session closed for user root
Jan 26 08:40:06 compute-1 sudo[208894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axhneczotvsjrqlmmqbktlhaddyohgdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416806.1125116-2751-218794707346675/AnsiballZ_file.py'
Jan 26 08:40:06 compute-1 sudo[208894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:40:07 compute-1 python3.9[208896]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:40:07 compute-1 sudo[208894]: pam_unix(sudo:session): session closed for user root
Jan 26 08:40:07 compute-1 sudo[209048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdzyrfwxukmxfugkvjyeqmhnrjaaxovk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416807.507447-2774-210260469764514/AnsiballZ_stat.py'
Jan 26 08:40:07 compute-1 sudo[209048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:40:08 compute-1 python3.9[209050]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:40:08 compute-1 sudo[209048]: pam_unix(sudo:session): session closed for user root
Jan 26 08:40:08 compute-1 sudo[209126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptuhmaynmykemnewcpuesnlapxaletvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416807.507447-2774-210260469764514/AnsiballZ_file.py'
Jan 26 08:40:08 compute-1 sudo[209126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:40:08 compute-1 sshd-session[208897]: Connection closed by authenticating user root 103.236.95.173 port 48480 [preauth]
Jan 26 08:40:08 compute-1 python3.9[209128]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:40:08 compute-1 sudo[209126]: pam_unix(sudo:session): session closed for user root
Jan 26 08:40:09 compute-1 sudo[209278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wooumjhbqdrzacqpwjdmdwsbytttzbjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416808.8142438-2798-280804552636981/AnsiballZ_stat.py'
Jan 26 08:40:09 compute-1 sudo[209278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:40:09 compute-1 python3.9[209280]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:40:09 compute-1 sudo[209278]: pam_unix(sudo:session): session closed for user root
Jan 26 08:40:09 compute-1 sudo[209356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjxgpisibmpejguzasqfjnkhtuvelqko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416808.8142438-2798-280804552636981/AnsiballZ_file.py'
Jan 26 08:40:09 compute-1 sudo[209356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:40:09 compute-1 python3.9[209358]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:40:09 compute-1 sudo[209356]: pam_unix(sudo:session): session closed for user root
Jan 26 08:40:10 compute-1 sudo[209508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iftewmjpzqqpzvugjsypjjypbfeqdtwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416810.1680174-2822-13406114140353/AnsiballZ_stat.py'
Jan 26 08:40:10 compute-1 sudo[209508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:40:10 compute-1 python3.9[209510]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:40:10 compute-1 sudo[209508]: pam_unix(sudo:session): session closed for user root
Jan 26 08:40:11 compute-1 sudo[209586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfeqtozoqlgqzyhuienppunmfreljqeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416810.1680174-2822-13406114140353/AnsiballZ_file.py'
Jan 26 08:40:11 compute-1 sudo[209586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:40:11 compute-1 python3.9[209588]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:40:11 compute-1 sudo[209586]: pam_unix(sudo:session): session closed for user root
Jan 26 08:40:12 compute-1 sudo[209751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybzgbdgozmxywkixfemqzkpkqnqfxuhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416811.5257528-2846-223608693811036/AnsiballZ_stat.py'
Jan 26 08:40:12 compute-1 sudo[209751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:40:12 compute-1 podman[209712]: 2026-01-26 08:40:12.08043731 +0000 UTC m=+0.136137158 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 26 08:40:12 compute-1 python3.9[209759]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 08:40:12 compute-1 sudo[209751]: pam_unix(sudo:session): session closed for user root
Jan 26 08:40:12 compute-1 sudo[209891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmmanqlkwaybeenhbnyqtqatmqtaxalp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416811.5257528-2846-223608693811036/AnsiballZ_copy.py'
Jan 26 08:40:12 compute-1 sudo[209891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:40:12 compute-1 podman[209893]: 2026-01-26 08:40:12.794380193 +0000 UTC m=+0.084128606 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 08:40:12 compute-1 python3.9[209894]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769416811.5257528-2846-223608693811036/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:40:12 compute-1 sudo[209891]: pam_unix(sudo:session): session closed for user root
Jan 26 08:40:13 compute-1 sudo[210067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjuwymedsmahferkslcxgpgbpedqzzpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416813.2330577-2876-271781914559664/AnsiballZ_file.py'
Jan 26 08:40:13 compute-1 sudo[210067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:40:13 compute-1 python3.9[210069]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:40:13 compute-1 sudo[210067]: pam_unix(sudo:session): session closed for user root
Jan 26 08:40:14 compute-1 sudo[210234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohsvpupiklyvhqyjuqehtjlzbmzjqqmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416814.072856-2892-81957450470362/AnsiballZ_command.py'
Jan 26 08:40:14 compute-1 sudo[210234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:40:14 compute-1 podman[210193]: 2026-01-26 08:40:14.475348966 +0000 UTC m=+0.086849225 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 08:40:14 compute-1 python3.9[210240]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:40:14 compute-1 sudo[210234]: pam_unix(sudo:session): session closed for user root
Jan 26 08:40:15 compute-1 sudo[210393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esjbdhhzzinxdcncfoljevkxyngukrwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416814.9670708-2908-165595456487077/AnsiballZ_blockinfile.py'
Jan 26 08:40:15 compute-1 sudo[210393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:40:15 compute-1 python3.9[210395]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:40:15 compute-1 sudo[210393]: pam_unix(sudo:session): session closed for user root
Jan 26 08:40:16 compute-1 sudo[210545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auddwlebmqyijkrunxybgajxrblhapdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416816.0020509-2926-23206672090544/AnsiballZ_command.py'
Jan 26 08:40:16 compute-1 sudo[210545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:40:16 compute-1 python3.9[210547]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:40:16 compute-1 sudo[210545]: pam_unix(sudo:session): session closed for user root
Jan 26 08:40:17 compute-1 sudo[210698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdehuwkumtilmsgkkvfakkftscoaehoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416816.7991178-2942-15006186953046/AnsiballZ_stat.py'
Jan 26 08:40:17 compute-1 sudo[210698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:40:17 compute-1 python3.9[210700]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 08:40:17 compute-1 sudo[210698]: pam_unix(sudo:session): session closed for user root
Jan 26 08:40:17 compute-1 sudo[210852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtulrkpkywuoskeziyeysltsxqcrbusl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416817.6187756-2958-38134904876651/AnsiballZ_command.py'
Jan 26 08:40:17 compute-1 sudo[210852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:40:18 compute-1 python3.9[210854]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 08:40:18 compute-1 sudo[210852]: pam_unix(sudo:session): session closed for user root
Jan 26 08:40:18 compute-1 sudo[211007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnwoyhohbltizumptldhnolqqzgrwnoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769416818.382216-2974-145230477722907/AnsiballZ_file.py'
Jan 26 08:40:18 compute-1 sudo[211007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:40:18 compute-1 python3.9[211009]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 08:40:18 compute-1 sudo[211007]: pam_unix(sudo:session): session closed for user root
Jan 26 08:40:19 compute-1 sshd-session[183410]: Connection closed by 192.168.122.30 port 55050
Jan 26 08:40:19 compute-1 sshd-session[183407]: pam_unix(sshd:session): session closed for user zuul
Jan 26 08:40:19 compute-1 systemd[1]: session-27.scope: Deactivated successfully.
Jan 26 08:40:19 compute-1 systemd[1]: session-27.scope: Consumed 2min 13.191s CPU time.
Jan 26 08:40:19 compute-1 systemd-logind[788]: Session 27 logged out. Waiting for processes to exit.
Jan 26 08:40:19 compute-1 systemd-logind[788]: Removed session 27.
Jan 26 08:40:23 compute-1 podman[211034]: 2026-01-26 08:40:23.814478937 +0000 UTC m=+0.079294829 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 26 08:40:35 compute-1 podman[211062]: 2026-01-26 08:40:35.8230392 +0000 UTC m=+0.077155887 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, release=1755695350, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.33.7, config_id=openstack_network_exporter)
Jan 26 08:40:35 compute-1 podman[211061]: 2026-01-26 08:40:35.834626805 +0000 UTC m=+0.088818845 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Jan 26 08:40:37 compute-1 sshd-session[211059]: Connection closed by authenticating user root 159.223.236.81 port 51362 [preauth]
Jan 26 08:40:40 compute-1 sshd-session[211102]: Connection closed by authenticating user root 103.236.95.173 port 54916 [preauth]
Jan 26 08:40:42 compute-1 podman[211104]: 2026-01-26 08:40:42.870310193 +0000 UTC m=+0.128387062 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 08:40:43 compute-1 podman[211130]: 2026-01-26 08:40:43.000695861 +0000 UTC m=+0.102472580 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 08:40:44 compute-1 podman[211154]: 2026-01-26 08:40:44.815276969 +0000 UTC m=+0.074862870 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 26 08:40:51 compute-1 nova_compute[183083]: 2026-01-26 08:40:51.947 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:40:51 compute-1 nova_compute[183083]: 2026-01-26 08:40:51.971 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:40:51 compute-1 nova_compute[183083]: 2026-01-26 08:40:51.971 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:40:52 compute-1 nova_compute[183083]: 2026-01-26 08:40:52.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:40:53 compute-1 nova_compute[183083]: 2026-01-26 08:40:53.947 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:40:53 compute-1 nova_compute[183083]: 2026-01-26 08:40:53.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:40:53 compute-1 nova_compute[183083]: 2026-01-26 08:40:53.951 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 08:40:53 compute-1 nova_compute[183083]: 2026-01-26 08:40:53.951 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 08:40:53 compute-1 nova_compute[183083]: 2026-01-26 08:40:53.965 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 08:40:53 compute-1 nova_compute[183083]: 2026-01-26 08:40:53.966 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:40:54 compute-1 nova_compute[183083]: 2026-01-26 08:40:54.009 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:40:54 compute-1 nova_compute[183083]: 2026-01-26 08:40:54.009 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:40:54 compute-1 nova_compute[183083]: 2026-01-26 08:40:54.010 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:40:54 compute-1 nova_compute[183083]: 2026-01-26 08:40:54.010 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 08:40:54 compute-1 nova_compute[183083]: 2026-01-26 08:40:54.226 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:40:54 compute-1 nova_compute[183083]: 2026-01-26 08:40:54.228 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=14106MB free_disk=113.13611221313477GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 08:40:54 compute-1 nova_compute[183083]: 2026-01-26 08:40:54.229 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:40:54 compute-1 nova_compute[183083]: 2026-01-26 08:40:54.230 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:40:54 compute-1 nova_compute[183083]: 2026-01-26 08:40:54.287 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 08:40:54 compute-1 nova_compute[183083]: 2026-01-26 08:40:54.288 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 08:40:54 compute-1 nova_compute[183083]: 2026-01-26 08:40:54.310 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:40:54 compute-1 nova_compute[183083]: 2026-01-26 08:40:54.341 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 0, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:40:54 compute-1 nova_compute[183083]: 2026-01-26 08:40:54.343 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 08:40:54 compute-1 nova_compute[183083]: 2026-01-26 08:40:54.344 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:40:54 compute-1 podman[211173]: 2026-01-26 08:40:54.816912611 +0000 UTC m=+0.081437881 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 26 08:40:55 compute-1 nova_compute[183083]: 2026-01-26 08:40:55.329 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:40:55 compute-1 nova_compute[183083]: 2026-01-26 08:40:55.330 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:40:55 compute-1 nova_compute[183083]: 2026-01-26 08:40:55.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:40:55 compute-1 nova_compute[183083]: 2026-01-26 08:40:55.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 08:41:02 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:41:02.365 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:41:02 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:41:02.367 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 08:41:02 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:41:02.367 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:41:03 compute-1 ovn_controller[95352]: 2026-01-26T08:41:03Z|00027|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 08:41:03 compute-1 ovn_controller[95352]: 2026-01-26T08:41:03Z|00028|pinctrl|WARN|IGMP Querier enabled with invalid ETH src address
Jan 26 08:41:04 compute-1 ovn_controller[95352]: 2026-01-26T08:41:04Z|00029|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 08:41:04 compute-1 ovn_controller[95352]: 2026-01-26T08:41:04Z|00030|pinctrl|WARN|IGMP Querier enabled with invalid ETH src address
Jan 26 08:41:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:41:05.292 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:41:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:41:05.293 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:41:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:41:05.293 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:41:06 compute-1 ovn_controller[95352]: 2026-01-26T08:41:06Z|00031|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 08:41:06 compute-1 podman[211197]: 2026-01-26 08:41:06.838835982 +0000 UTC m=+0.095297312 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 08:41:06 compute-1 podman[211198]: 2026-01-26 08:41:06.860701386 +0000 UTC m=+0.111668177 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_id=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 26 08:41:13 compute-1 podman[211238]: 2026-01-26 08:41:13.824705446 +0000 UTC m=+0.073642145 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 08:41:13 compute-1 podman[211237]: 2026-01-26 08:41:13.88384389 +0000 UTC m=+0.116557529 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 08:41:13 compute-1 sshd-session[211235]: Connection closed by authenticating user root 103.236.95.173 port 33662 [preauth]
Jan 26 08:41:15 compute-1 podman[211285]: 2026-01-26 08:41:15.783863353 +0000 UTC m=+0.051465303 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true)
Jan 26 08:41:25 compute-1 podman[211304]: 2026-01-26 08:41:25.844807237 +0000 UTC m=+0.086594974 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 26 08:41:37 compute-1 sshd-session[211329]: Connection closed by authenticating user root 159.223.236.81 port 54886 [preauth]
Jan 26 08:41:37 compute-1 podman[211332]: 2026-01-26 08:41:37.840253356 +0000 UTC m=+0.095618578 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, container_name=openstack_network_exporter, release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 26 08:41:37 compute-1 podman[211331]: 2026-01-26 08:41:37.855158172 +0000 UTC m=+0.119898148 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 08:41:44 compute-1 podman[211373]: 2026-01-26 08:41:44.871133571 +0000 UTC m=+0.110217005 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 08:41:44 compute-1 podman[211372]: 2026-01-26 08:41:44.873587073 +0000 UTC m=+0.126879293 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 08:41:46 compute-1 podman[211422]: 2026-01-26 08:41:46.974308553 +0000 UTC m=+0.050919100 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 26 08:41:50 compute-1 nova_compute[183083]: 2026-01-26 08:41:50.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:41:50 compute-1 nova_compute[183083]: 2026-01-26 08:41:50.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 26 08:41:50 compute-1 nova_compute[183083]: 2026-01-26 08:41:50.974 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 26 08:41:50 compute-1 nova_compute[183083]: 2026-01-26 08:41:50.974 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:41:50 compute-1 nova_compute[183083]: 2026-01-26 08:41:50.975 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 26 08:41:50 compute-1 nova_compute[183083]: 2026-01-26 08:41:50.990 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:41:52 compute-1 nova_compute[183083]: 2026-01-26 08:41:52.009 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:41:53 compute-1 nova_compute[183083]: 2026-01-26 08:41:53.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:41:53 compute-1 nova_compute[183083]: 2026-01-26 08:41:53.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:41:54 compute-1 nova_compute[183083]: 2026-01-26 08:41:54.947 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:41:54 compute-1 nova_compute[183083]: 2026-01-26 08:41:54.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:41:54 compute-1 nova_compute[183083]: 2026-01-26 08:41:54.950 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 08:41:54 compute-1 nova_compute[183083]: 2026-01-26 08:41:54.950 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 08:41:54 compute-1 nova_compute[183083]: 2026-01-26 08:41:54.967 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 08:41:54 compute-1 nova_compute[183083]: 2026-01-26 08:41:54.968 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:41:54 compute-1 nova_compute[183083]: 2026-01-26 08:41:54.968 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:41:54 compute-1 nova_compute[183083]: 2026-01-26 08:41:54.989 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:41:54 compute-1 nova_compute[183083]: 2026-01-26 08:41:54.989 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:41:54 compute-1 nova_compute[183083]: 2026-01-26 08:41:54.989 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:41:54 compute-1 nova_compute[183083]: 2026-01-26 08:41:54.989 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 08:41:55 compute-1 nova_compute[183083]: 2026-01-26 08:41:55.161 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:41:55 compute-1 nova_compute[183083]: 2026-01-26 08:41:55.162 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=14132MB free_disk=113.13609313964844GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 08:41:55 compute-1 nova_compute[183083]: 2026-01-26 08:41:55.162 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:41:55 compute-1 nova_compute[183083]: 2026-01-26 08:41:55.163 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:41:55 compute-1 nova_compute[183083]: 2026-01-26 08:41:55.266 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 08:41:55 compute-1 nova_compute[183083]: 2026-01-26 08:41:55.266 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 08:41:55 compute-1 nova_compute[183083]: 2026-01-26 08:41:55.345 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Refreshing inventories for resource provider 5203935e-446c-4e03-93fa-4c60d651e045 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 26 08:41:55 compute-1 nova_compute[183083]: 2026-01-26 08:41:55.419 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Updating ProviderTree inventory for provider 5203935e-446c-4e03-93fa-4c60d651e045 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 0, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 26 08:41:55 compute-1 nova_compute[183083]: 2026-01-26 08:41:55.420 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Updating inventory in ProviderTree for provider 5203935e-446c-4e03-93fa-4c60d651e045 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 0, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 26 08:41:55 compute-1 nova_compute[183083]: 2026-01-26 08:41:55.454 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Refreshing aggregate associations for resource provider 5203935e-446c-4e03-93fa-4c60d651e045, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 26 08:41:55 compute-1 nova_compute[183083]: 2026-01-26 08:41:55.481 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Refreshing trait associations for resource provider 5203935e-446c-4e03-93fa-4c60d651e045, traits: COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE4A,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SVM,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AVX,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX2,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_FMA3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,HW_CPU_X86_SSE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE41,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_BMI,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AESNI,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 26 08:41:55 compute-1 nova_compute[183083]: 2026-01-26 08:41:55.502 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:41:55 compute-1 nova_compute[183083]: 2026-01-26 08:41:55.536 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 0, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:41:55 compute-1 nova_compute[183083]: 2026-01-26 08:41:55.539 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 08:41:55 compute-1 nova_compute[183083]: 2026-01-26 08:41:55.540 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.377s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:41:56 compute-1 podman[211441]: 2026-01-26 08:41:56.790914967 +0000 UTC m=+0.054682131 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 26 08:41:57 compute-1 nova_compute[183083]: 2026-01-26 08:41:57.524 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:41:57 compute-1 nova_compute[183083]: 2026-01-26 08:41:57.524 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:41:57 compute-1 nova_compute[183083]: 2026-01-26 08:41:57.524 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 08:42:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:42:03.738 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:42:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:42:03.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:42:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:42:03.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:42:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:42:03.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:42:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:42:03.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:42:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:42:03.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:42:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:42:03.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:42:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:42:03.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:42:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:42:03.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:42:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:42:03.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:42:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:42:03.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:42:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:42:03.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:42:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:42:03.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:42:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:42:03.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:42:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:42:03.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:42:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:42:03.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:42:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:42:03.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:42:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:42:03.742 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:42:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:42:03.742 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:42:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:42:03.742 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:42:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:42:03.742 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:42:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:42:03.742 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:42:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:42:03.742 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:42:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:42:03.743 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:42:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:42:03.743 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:42:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:42:05.293 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:42:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:42:05.294 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:42:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:42:05.294 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:42:06 compute-1 ovn_controller[95352]: 2026-01-26T08:42:06Z|00032|pinctrl|WARN|Dropped 21 log messages in last 60 seconds (most recently, 4 seconds ago) due to excessive rate
Jan 26 08:42:06 compute-1 ovn_controller[95352]: 2026-01-26T08:42:06Z|00033|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 08:42:08 compute-1 podman[211465]: 2026-01-26 08:42:08.827676103 +0000 UTC m=+0.099378878 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 08:42:08 compute-1 podman[211466]: 2026-01-26 08:42:08.863901123 +0000 UTC m=+0.128142050 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, release=1755695350, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_id=openstack_network_exporter, managed_by=edpm_ansible, architecture=x86_64, io.buildah.version=1.33.7)
Jan 26 08:42:15 compute-1 podman[211505]: 2026-01-26 08:42:15.842048295 +0000 UTC m=+0.102764607 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 08:42:15 compute-1 podman[211506]: 2026-01-26 08:42:15.848349469 +0000 UTC m=+0.097624857 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 08:42:17 compute-1 podman[211555]: 2026-01-26 08:42:17.831292784 +0000 UTC m=+0.093554177 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 08:42:27 compute-1 podman[211574]: 2026-01-26 08:42:27.806447115 +0000 UTC m=+0.071886874 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 26 08:42:38 compute-1 sshd-session[211598]: Connection closed by authenticating user root 159.223.236.81 port 54372 [preauth]
Jan 26 08:42:39 compute-1 podman[211601]: 2026-01-26 08:42:39.81877537 +0000 UTC m=+0.060374627 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, name=ubi9-minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, architecture=x86_64, maintainer=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container)
Jan 26 08:42:39 compute-1 podman[211600]: 2026-01-26 08:42:39.827377734 +0000 UTC m=+0.086225291 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 26 08:42:46 compute-1 podman[211645]: 2026-01-26 08:42:46.80927341 +0000 UTC m=+0.066832400 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 08:42:47 compute-1 podman[211644]: 2026-01-26 08:42:47.314367652 +0000 UTC m=+0.562768132 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 26 08:42:48 compute-1 podman[211696]: 2026-01-26 08:42:48.837383109 +0000 UTC m=+0.085007066 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 26 08:42:53 compute-1 nova_compute[183083]: 2026-01-26 08:42:53.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:42:53 compute-1 nova_compute[183083]: 2026-01-26 08:42:53.953 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:42:54 compute-1 nova_compute[183083]: 2026-01-26 08:42:54.947 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:42:54 compute-1 nova_compute[183083]: 2026-01-26 08:42:54.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:42:54 compute-1 nova_compute[183083]: 2026-01-26 08:42:54.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:42:54 compute-1 nova_compute[183083]: 2026-01-26 08:42:54.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:42:54 compute-1 nova_compute[183083]: 2026-01-26 08:42:54.973 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:42:54 compute-1 nova_compute[183083]: 2026-01-26 08:42:54.974 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:42:54 compute-1 nova_compute[183083]: 2026-01-26 08:42:54.974 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:42:54 compute-1 nova_compute[183083]: 2026-01-26 08:42:54.975 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 08:42:55 compute-1 nova_compute[183083]: 2026-01-26 08:42:55.212 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:42:55 compute-1 nova_compute[183083]: 2026-01-26 08:42:55.214 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=14121MB free_disk=113.13611221313477GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 08:42:55 compute-1 nova_compute[183083]: 2026-01-26 08:42:55.215 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:42:55 compute-1 nova_compute[183083]: 2026-01-26 08:42:55.215 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:42:55 compute-1 nova_compute[183083]: 2026-01-26 08:42:55.283 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 08:42:55 compute-1 nova_compute[183083]: 2026-01-26 08:42:55.284 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 08:42:55 compute-1 nova_compute[183083]: 2026-01-26 08:42:55.309 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:42:55 compute-1 nova_compute[183083]: 2026-01-26 08:42:55.327 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 0, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:42:55 compute-1 nova_compute[183083]: 2026-01-26 08:42:55.330 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 08:42:55 compute-1 nova_compute[183083]: 2026-01-26 08:42:55.330 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:42:56 compute-1 nova_compute[183083]: 2026-01-26 08:42:56.332 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:42:56 compute-1 nova_compute[183083]: 2026-01-26 08:42:56.332 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 08:42:56 compute-1 nova_compute[183083]: 2026-01-26 08:42:56.332 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 08:42:56 compute-1 nova_compute[183083]: 2026-01-26 08:42:56.345 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 08:42:56 compute-1 sshd-session[211371]: Connection reset by 103.236.95.173 port 37602 [preauth]
Jan 26 08:42:56 compute-1 nova_compute[183083]: 2026-01-26 08:42:56.959 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:42:57 compute-1 nova_compute[183083]: 2026-01-26 08:42:57.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:42:57 compute-1 nova_compute[183083]: 2026-01-26 08:42:57.951 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 08:42:58 compute-1 podman[211715]: 2026-01-26 08:42:58.797281357 +0000 UTC m=+0.059982356 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 08:42:58 compute-1 nova_compute[183083]: 2026-01-26 08:42:58.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:42:59 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:42:59.742 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:42:59 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:42:59.743 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 08:43:01 compute-1 anacron[29975]: Job `cron.daily' started
Jan 26 08:43:01 compute-1 anacron[29975]: Job `cron.daily' terminated
Jan 26 08:43:01 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:43:01.746 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:43:03 compute-1 ovn_controller[95352]: 2026-01-26T08:43:03Z|00034|pinctrl|WARN|Dropped 35 log messages in last 57 seconds (most recently, 2 seconds ago) due to excessive rate
Jan 26 08:43:03 compute-1 ovn_controller[95352]: 2026-01-26T08:43:03Z|00035|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 08:43:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:43:05.294 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:43:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:43:05.295 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:43:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:43:05.295 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:43:10 compute-1 podman[211741]: 2026-01-26 08:43:10.808241816 +0000 UTC m=+0.076616918 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 26 08:43:10 compute-1 podman[211742]: 2026-01-26 08:43:10.844556468 +0000 UTC m=+0.097889473 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, config_id=openstack_network_exporter, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal)
Jan 26 08:43:17 compute-1 podman[211782]: 2026-01-26 08:43:17.82843351 +0000 UTC m=+0.081242020 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 08:43:17 compute-1 podman[211781]: 2026-01-26 08:43:17.902492214 +0000 UTC m=+0.160503362 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 26 08:43:19 compute-1 podman[211829]: 2026-01-26 08:43:19.835682416 +0000 UTC m=+0.095073712 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 26 08:43:28 compute-1 sshd-session[211850]: error: kex_exchange_identification: read: Connection reset by peer
Jan 26 08:43:28 compute-1 sshd-session[211850]: Connection reset by 176.120.22.52 port 56351
Jan 26 08:43:29 compute-1 podman[211851]: 2026-01-26 08:43:29.786143547 +0000 UTC m=+0.059611305 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 26 08:43:37 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:43:37.538 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:43:37 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:43:37.540 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 08:43:38 compute-1 sshd-session[211876]: Connection closed by authenticating user root 159.223.236.81 port 37298 [preauth]
Jan 26 08:43:38 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:43:38.542 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:43:41 compute-1 podman[211878]: 2026-01-26 08:43:41.789182897 +0000 UTC m=+0.060156793 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 26 08:43:41 compute-1 podman[211879]: 2026-01-26 08:43:41.800914467 +0000 UTC m=+0.066091070 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible)
Jan 26 08:43:42 compute-1 sshd-session[211916]: Accepted publickey for zuul from 38.102.83.66 port 43320 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 08:43:42 compute-1 systemd-logind[788]: New session 28 of user zuul.
Jan 26 08:43:42 compute-1 systemd[1]: Started Session 28 of User zuul.
Jan 26 08:43:42 compute-1 sshd-session[211916]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 08:43:42 compute-1 sshd-session[211919]: Accepted publickey for zuul from 38.102.83.66 port 43324 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 08:43:42 compute-1 systemd-logind[788]: New session 29 of user zuul.
Jan 26 08:43:42 compute-1 systemd[1]: Started Session 29 of User zuul.
Jan 26 08:43:42 compute-1 sshd-session[211919]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 08:43:42 compute-1 sshd-session[211946]: Accepted publickey for zuul from 38.102.83.66 port 43334 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 08:43:42 compute-1 systemd-logind[788]: New session 30 of user zuul.
Jan 26 08:43:42 compute-1 sshd-session[211921]: Connection closed by 38.102.83.66 port 43320
Jan 26 08:43:42 compute-1 sshd-session[211916]: pam_unix(sshd:session): session closed for user zuul
Jan 26 08:43:42 compute-1 systemd[1]: Started Session 30 of User zuul.
Jan 26 08:43:42 compute-1 systemd[1]: session-28.scope: Deactivated successfully.
Jan 26 08:43:42 compute-1 systemd-logind[788]: Session 28 logged out. Waiting for processes to exit.
Jan 26 08:43:42 compute-1 sshd-session[211946]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 08:43:42 compute-1 systemd-logind[788]: Removed session 28.
Jan 26 08:43:42 compute-1 sshd-session[211956]: Accepted publickey for zuul from 38.102.83.66 port 43338 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 08:43:42 compute-1 sshd-session[211947]: Connection closed by 38.102.83.66 port 43324
Jan 26 08:43:42 compute-1 sshd-session[211919]: pam_unix(sshd:session): session closed for user zuul
Jan 26 08:43:42 compute-1 systemd-logind[788]: New session 31 of user zuul.
Jan 26 08:43:42 compute-1 systemd[1]: Started Session 31 of User zuul.
Jan 26 08:43:42 compute-1 systemd[1]: session-29.scope: Deactivated successfully.
Jan 26 08:43:42 compute-1 systemd-logind[788]: Session 29 logged out. Waiting for processes to exit.
Jan 26 08:43:42 compute-1 sshd-session[211956]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 08:43:42 compute-1 systemd-logind[788]: Removed session 29.
Jan 26 08:43:43 compute-1 sshd-session[211975]: Connection closed by 38.102.83.66 port 43334
Jan 26 08:43:43 compute-1 sshd-session[211946]: pam_unix(sshd:session): session closed for user zuul
Jan 26 08:43:43 compute-1 systemd[1]: session-30.scope: Deactivated successfully.
Jan 26 08:43:43 compute-1 systemd-logind[788]: Session 30 logged out. Waiting for processes to exit.
Jan 26 08:43:43 compute-1 systemd-logind[788]: Removed session 30.
Jan 26 08:43:43 compute-1 sshd-session[212000]: Connection closed by 38.102.83.66 port 43338
Jan 26 08:43:43 compute-1 sshd-session[211956]: pam_unix(sshd:session): session closed for user zuul
Jan 26 08:43:43 compute-1 systemd[1]: session-31.scope: Deactivated successfully.
Jan 26 08:43:43 compute-1 systemd-logind[788]: Session 31 logged out. Waiting for processes to exit.
Jan 26 08:43:43 compute-1 systemd-logind[788]: Removed session 31.
Jan 26 08:43:48 compute-1 podman[212025]: 2026-01-26 08:43:48.81678045 +0000 UTC m=+0.075486733 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 08:43:48 compute-1 podman[212024]: 2026-01-26 08:43:48.896334456 +0000 UTC m=+0.159482844 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Jan 26 08:43:50 compute-1 podman[212073]: 2026-01-26 08:43:50.834354644 +0000 UTC m=+0.090885655 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 08:43:53 compute-1 nova_compute[183083]: 2026-01-26 08:43:53.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:43:55 compute-1 nova_compute[183083]: 2026-01-26 08:43:55.023 183087 DEBUG oslo_concurrency.lockutils [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Acquiring lock "f48ffd58-106e-469e-bbbb-7dc61534fc34" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:43:55 compute-1 nova_compute[183083]: 2026-01-26 08:43:55.023 183087 DEBUG oslo_concurrency.lockutils [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Lock "f48ffd58-106e-469e-bbbb-7dc61534fc34" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:43:55 compute-1 nova_compute[183083]: 2026-01-26 08:43:55.040 183087 DEBUG nova.compute.manager [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:43:55 compute-1 nova_compute[183083]: 2026-01-26 08:43:55.123 183087 DEBUG oslo_concurrency.lockutils [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:43:55 compute-1 nova_compute[183083]: 2026-01-26 08:43:55.124 183087 DEBUG oslo_concurrency.lockutils [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:43:55 compute-1 nova_compute[183083]: 2026-01-26 08:43:55.134 183087 DEBUG nova.virt.hardware [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:43:55 compute-1 nova_compute[183083]: 2026-01-26 08:43:55.135 183087 INFO nova.compute.claims [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:43:55 compute-1 nova_compute[183083]: 2026-01-26 08:43:55.252 183087 DEBUG nova.compute.provider_tree [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:43:55 compute-1 nova_compute[183083]: 2026-01-26 08:43:55.265 183087 DEBUG nova.scheduler.client.report [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 0, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:43:55 compute-1 nova_compute[183083]: 2026-01-26 08:43:55.287 183087 DEBUG oslo_concurrency.lockutils [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:43:55 compute-1 nova_compute[183083]: 2026-01-26 08:43:55.288 183087 DEBUG nova.compute.manager [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:43:55 compute-1 nova_compute[183083]: 2026-01-26 08:43:55.333 183087 DEBUG nova.compute.manager [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:43:55 compute-1 nova_compute[183083]: 2026-01-26 08:43:55.334 183087 DEBUG nova.network.neutron [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:43:55 compute-1 nova_compute[183083]: 2026-01-26 08:43:55.361 183087 INFO nova.virt.libvirt.driver [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:43:55 compute-1 nova_compute[183083]: 2026-01-26 08:43:55.384 183087 DEBUG nova.compute.manager [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:43:55 compute-1 nova_compute[183083]: 2026-01-26 08:43:55.486 183087 DEBUG nova.compute.manager [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:43:55 compute-1 nova_compute[183083]: 2026-01-26 08:43:55.489 183087 DEBUG nova.virt.libvirt.driver [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:43:55 compute-1 nova_compute[183083]: 2026-01-26 08:43:55.489 183087 INFO nova.virt.libvirt.driver [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Creating image(s)
Jan 26 08:43:55 compute-1 nova_compute[183083]: 2026-01-26 08:43:55.491 183087 DEBUG oslo_concurrency.lockutils [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Acquiring lock "/var/lib/nova/instances/f48ffd58-106e-469e-bbbb-7dc61534fc34/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:43:55 compute-1 nova_compute[183083]: 2026-01-26 08:43:55.492 183087 DEBUG oslo_concurrency.lockutils [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Lock "/var/lib/nova/instances/f48ffd58-106e-469e-bbbb-7dc61534fc34/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:43:55 compute-1 nova_compute[183083]: 2026-01-26 08:43:55.493 183087 DEBUG oslo_concurrency.lockutils [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Lock "/var/lib/nova/instances/f48ffd58-106e-469e-bbbb-7dc61534fc34/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:43:55 compute-1 nova_compute[183083]: 2026-01-26 08:43:55.494 183087 DEBUG oslo_concurrency.lockutils [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:43:55 compute-1 nova_compute[183083]: 2026-01-26 08:43:55.495 183087 DEBUG oslo_concurrency.lockutils [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:43:55 compute-1 nova_compute[183083]: 2026-01-26 08:43:55.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:43:55 compute-1 nova_compute[183083]: 2026-01-26 08:43:55.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:43:55 compute-1 nova_compute[183083]: 2026-01-26 08:43:55.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.511 183087 WARNING oslo_policy.policy [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.511 183087 WARNING oslo_policy.policy [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.516 183087 DEBUG nova.policy [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '64bdc9f771e449a8930bafc62d000e64', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e482dc8c944c4dc1ba301e69d00ec101', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.865 183087 DEBUG oslo_concurrency.lockutils [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.370s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Instance failed to spawn: nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Traceback (most recent call last):
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 170, in do_image_deep_inspection
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34]     raise exception.ImageUnacceptable(
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image content does not match disk_format
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] 
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] During handling of the above exception, another exception occurred:
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] 
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Traceback (most recent call last):
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34]     yield resources
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34]     self.driver.spawn(context, instance, image_meta,
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4384, in spawn
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34]     created_instance_dir, created_disks = self._create_image(
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4786, in _create_image
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34]     created_disks = self._create_and_inject_local_root(
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4917, in _create_and_inject_local_root
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34]     self._try_fetch_image_cache(backend, fetch_func, context,
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 10963, in _try_fetch_image_cache
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34]     image.cache(fetch_func=fetch_func,
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 288, in cache
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34]     self.create_image(fetch_func_sync, base, size,
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 663, in create_image
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34]     prepare_template(target=base, *args, **kwargs)
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34]   File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34]     return f(*args, **kwargs)
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 285, in fetch_func_sync
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34]     fetch_func(target=target, *args, **kwargs)
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/utils.py", line 482, in fetch_image
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34]     images.fetch_to_raw(context, image_id, target, trusted_certs)
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 199, in fetch_to_raw
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34]     force_format = do_image_deep_inspection(img, image_href, path_tmp)
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 180, in do_image_deep_inspection
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34]     raise exception.ImageUnacceptable(
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.866 183087 ERROR nova.compute.manager [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] 
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.948 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.951 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.951 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.990 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.991 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 08:43:56 compute-1 nova_compute[183083]: 2026-01-26 08:43:56.993 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:43:57 compute-1 nova_compute[183083]: 2026-01-26 08:43:57.014 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:43:57 compute-1 nova_compute[183083]: 2026-01-26 08:43:57.014 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:43:57 compute-1 nova_compute[183083]: 2026-01-26 08:43:57.015 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:43:57 compute-1 nova_compute[183083]: 2026-01-26 08:43:57.015 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 08:43:57 compute-1 nova_compute[183083]: 2026-01-26 08:43:57.223 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:43:57 compute-1 nova_compute[183083]: 2026-01-26 08:43:57.224 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=14124MB free_disk=113.13611221313477GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 08:43:57 compute-1 nova_compute[183083]: 2026-01-26 08:43:57.225 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:43:57 compute-1 nova_compute[183083]: 2026-01-26 08:43:57.225 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:43:57 compute-1 nova_compute[183083]: 2026-01-26 08:43:57.320 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Instance f48ffd58-106e-469e-bbbb-7dc61534fc34 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 08:43:57 compute-1 nova_compute[183083]: 2026-01-26 08:43:57.321 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 08:43:57 compute-1 nova_compute[183083]: 2026-01-26 08:43:57.321 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 08:43:57 compute-1 nova_compute[183083]: 2026-01-26 08:43:57.371 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:43:57 compute-1 nova_compute[183083]: 2026-01-26 08:43:57.394 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 0, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:43:57 compute-1 nova_compute[183083]: 2026-01-26 08:43:57.423 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 08:43:57 compute-1 nova_compute[183083]: 2026-01-26 08:43:57.424 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:43:58 compute-1 nova_compute[183083]: 2026-01-26 08:43:58.894 183087 DEBUG nova.network.neutron [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Successfully updated port: 7b71f06f-9a8f-47ea-a70f-9b837a9823ad _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:43:58 compute-1 nova_compute[183083]: 2026-01-26 08:43:58.907 183087 DEBUG oslo_concurrency.lockutils [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Acquiring lock "refresh_cache-f48ffd58-106e-469e-bbbb-7dc61534fc34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:43:58 compute-1 nova_compute[183083]: 2026-01-26 08:43:58.908 183087 DEBUG oslo_concurrency.lockutils [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Acquired lock "refresh_cache-f48ffd58-106e-469e-bbbb-7dc61534fc34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:43:58 compute-1 nova_compute[183083]: 2026-01-26 08:43:58.908 183087 DEBUG nova.network.neutron [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:43:59 compute-1 nova_compute[183083]: 2026-01-26 08:43:59.382 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:43:59 compute-1 nova_compute[183083]: 2026-01-26 08:43:59.383 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 08:43:59 compute-1 nova_compute[183083]: 2026-01-26 08:43:59.719 183087 DEBUG nova.network.neutron [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:43:59 compute-1 nova_compute[183083]: 2026-01-26 08:43:59.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:44:00 compute-1 nova_compute[183083]: 2026-01-26 08:44:00.283 183087 DEBUG nova.compute.manager [req-56fa18b5-6fff-420b-b0c1-753eeaa0c0ea req-84165b80-5a7c-4279-887c-e75356ba0bf5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Received event network-changed-7b71f06f-9a8f-47ea-a70f-9b837a9823ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:44:00 compute-1 nova_compute[183083]: 2026-01-26 08:44:00.284 183087 DEBUG nova.compute.manager [req-56fa18b5-6fff-420b-b0c1-753eeaa0c0ea req-84165b80-5a7c-4279-887c-e75356ba0bf5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Refreshing instance network info cache due to event network-changed-7b71f06f-9a8f-47ea-a70f-9b837a9823ad. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:44:00 compute-1 nova_compute[183083]: 2026-01-26 08:44:00.285 183087 DEBUG oslo_concurrency.lockutils [req-56fa18b5-6fff-420b-b0c1-753eeaa0c0ea req-84165b80-5a7c-4279-887c-e75356ba0bf5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-f48ffd58-106e-469e-bbbb-7dc61534fc34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:44:00 compute-1 podman[212092]: 2026-01-26 08:44:00.823712952 +0000 UTC m=+0.082674435 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 26 08:44:01 compute-1 nova_compute[183083]: 2026-01-26 08:44:01.420 183087 DEBUG nova.network.neutron [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Updating instance_info_cache with network_info: [{"id": "7b71f06f-9a8f-47ea-a70f-9b837a9823ad", "address": "fa:16:3e:03:e9:54", "network": {"id": "36f73b4d-8c08-47d9-bfaa-1e42149ec930", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1086481355-test_extra_dhcp_opts_disabled_dhcp6", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.148", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001::/64", "dns": [], "gateway": {"address": "2001::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001::3b3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e482dc8c944c4dc1ba301e69d00ec101", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b71f06f-9a", "ovs_interfaceid": "7b71f06f-9a8f-47ea-a70f-9b837a9823ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:44:01 compute-1 nova_compute[183083]: 2026-01-26 08:44:01.447 183087 DEBUG oslo_concurrency.lockutils [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Releasing lock "refresh_cache-f48ffd58-106e-469e-bbbb-7dc61534fc34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:44:01 compute-1 nova_compute[183083]: 2026-01-26 08:44:01.447 183087 DEBUG nova.compute.manager [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Instance network_info: |[{"id": "7b71f06f-9a8f-47ea-a70f-9b837a9823ad", "address": "fa:16:3e:03:e9:54", "network": {"id": "36f73b4d-8c08-47d9-bfaa-1e42149ec930", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1086481355-test_extra_dhcp_opts_disabled_dhcp6", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.148", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001::/64", "dns": [], "gateway": {"address": "2001::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001::3b3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e482dc8c944c4dc1ba301e69d00ec101", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b71f06f-9a", "ovs_interfaceid": "7b71f06f-9a8f-47ea-a70f-9b837a9823ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:44:01 compute-1 nova_compute[183083]: 2026-01-26 08:44:01.448 183087 DEBUG oslo_concurrency.lockutils [req-56fa18b5-6fff-420b-b0c1-753eeaa0c0ea req-84165b80-5a7c-4279-887c-e75356ba0bf5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-f48ffd58-106e-469e-bbbb-7dc61534fc34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:44:01 compute-1 nova_compute[183083]: 2026-01-26 08:44:01.449 183087 DEBUG nova.network.neutron [req-56fa18b5-6fff-420b-b0c1-753eeaa0c0ea req-84165b80-5a7c-4279-887c-e75356ba0bf5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Refreshing network info cache for port 7b71f06f-9a8f-47ea-a70f-9b837a9823ad _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:44:01 compute-1 nova_compute[183083]: 2026-01-26 08:44:01.450 183087 INFO nova.compute.manager [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Terminating instance
Jan 26 08:44:01 compute-1 nova_compute[183083]: 2026-01-26 08:44:01.452 183087 DEBUG nova.compute.manager [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:44:01 compute-1 nova_compute[183083]: 2026-01-26 08:44:01.457 183087 DEBUG nova.virt.libvirt.driver [-] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527
Jan 26 08:44:01 compute-1 nova_compute[183083]: 2026-01-26 08:44:01.457 183087 INFO nova.virt.libvirt.driver [-] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Instance destroyed successfully.
Jan 26 08:44:01 compute-1 nova_compute[183083]: 2026-01-26 08:44:01.459 183087 DEBUG nova.virt.libvirt.vif [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:43:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ExtraDhcpOptionsTest-1086481355-test_extra_dhcp_opts_disabled_dhcp6',display_name='tempest-ExtraDhcpOptionsTest-1086481355-test_extra_dhcp_opts_disabled_dhcp6',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-extradhcpoptionstest-1086481355-test-extra-dhcp-opts-di',id=2,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDbcjoHylAQtqg1Vl94jB8Y6+G/LV2rXvLmP1GPnqiVf11BnABlWcWIGa5wBxNzQ5L6qFsQZodIPzUEOuiX2g/H28q0JdYWcV+hRwWSFfpY7UxCYzUgdJhIE18mhMw36ag==',key_name='tempest-ExtraDhcpOptionsTest-1086481355',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e482dc8c944c4dc1ba301e69d00ec101',ramdisk_id='',reservation_id='r-il536pjf',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ExtraDhcpOptionsTest-992702786',owner_user_name='tempest-ExtraDhcpOptionsTest-992702786-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:43:55Z,user_data=None,user_id='64bdc9f771e449a8930bafc62d000e64',uuid=f48ffd58-106e-469e-bbbb-7dc61534fc34,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7b71f06f-9a8f-47ea-a70f-9b837a9823ad", "address": "fa:16:3e:03:e9:54", "network": {"id": "36f73b4d-8c08-47d9-bfaa-1e42149ec930", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1086481355-test_extra_dhcp_opts_disabled_dhcp6", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.148", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001::/64", "dns": [], "gateway": {"address": "2001::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001::3b3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e482dc8c944c4dc1ba301e69d00ec101", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b71f06f-9a", "ovs_interfaceid": "7b71f06f-9a8f-47ea-a70f-9b837a9823ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:44:01 compute-1 nova_compute[183083]: 2026-01-26 08:44:01.460 183087 DEBUG nova.network.os_vif_util [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Converting VIF {"id": "7b71f06f-9a8f-47ea-a70f-9b837a9823ad", "address": "fa:16:3e:03:e9:54", "network": {"id": "36f73b4d-8c08-47d9-bfaa-1e42149ec930", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1086481355-test_extra_dhcp_opts_disabled_dhcp6", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.148", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001::/64", "dns": [], "gateway": {"address": "2001::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001::3b3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e482dc8c944c4dc1ba301e69d00ec101", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b71f06f-9a", "ovs_interfaceid": "7b71f06f-9a8f-47ea-a70f-9b837a9823ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:44:01 compute-1 nova_compute[183083]: 2026-01-26 08:44:01.461 183087 DEBUG nova.network.os_vif_util [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:e9:54,bridge_name='br-int',has_traffic_filtering=True,id=7b71f06f-9a8f-47ea-a70f-9b837a9823ad,network=Network(36f73b4d-8c08-47d9-bfaa-1e42149ec930),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7b71f06f-9a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:44:01 compute-1 nova_compute[183083]: 2026-01-26 08:44:01.462 183087 DEBUG os_vif [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:e9:54,bridge_name='br-int',has_traffic_filtering=True,id=7b71f06f-9a8f-47ea-a70f-9b837a9823ad,network=Network(36f73b4d-8c08-47d9-bfaa-1e42149ec930),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7b71f06f-9a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:44:01 compute-1 nova_compute[183083]: 2026-01-26 08:44:01.560 183087 DEBUG ovsdbapp.backend.ovs_idl [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 26 08:44:01 compute-1 nova_compute[183083]: 2026-01-26 08:44:01.560 183087 DEBUG ovsdbapp.backend.ovs_idl [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 26 08:44:01 compute-1 nova_compute[183083]: 2026-01-26 08:44:01.561 183087 DEBUG ovsdbapp.backend.ovs_idl [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 26 08:44:01 compute-1 nova_compute[183083]: 2026-01-26 08:44:01.561 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 08:44:01 compute-1 nova_compute[183083]: 2026-01-26 08:44:01.562 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [POLLOUT] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:01 compute-1 nova_compute[183083]: 2026-01-26 08:44:01.562 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 08:44:01 compute-1 nova_compute[183083]: 2026-01-26 08:44:01.562 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:01 compute-1 nova_compute[183083]: 2026-01-26 08:44:01.564 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:01 compute-1 nova_compute[183083]: 2026-01-26 08:44:01.567 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:01 compute-1 nova_compute[183083]: 2026-01-26 08:44:01.579 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:01 compute-1 nova_compute[183083]: 2026-01-26 08:44:01.580 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b71f06f-9a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:44:01 compute-1 nova_compute[183083]: 2026-01-26 08:44:01.580 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:44:01 compute-1 nova_compute[183083]: 2026-01-26 08:44:01.581 183087 INFO oslo.privsep.daemon [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpf6g5dmnx/privsep.sock']
Jan 26 08:44:02 compute-1 nova_compute[183083]: 2026-01-26 08:44:02.278 183087 INFO oslo.privsep.daemon [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Spawned new privsep daemon via rootwrap
Jan 26 08:44:02 compute-1 nova_compute[183083]: 2026-01-26 08:44:02.148 212120 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 26 08:44:02 compute-1 nova_compute[183083]: 2026-01-26 08:44:02.154 212120 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 26 08:44:02 compute-1 nova_compute[183083]: 2026-01-26 08:44:02.158 212120 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Jan 26 08:44:02 compute-1 nova_compute[183083]: 2026-01-26 08:44:02.159 212120 INFO oslo.privsep.daemon [-] privsep daemon running as pid 212120
Jan 26 08:44:02 compute-1 nova_compute[183083]: 2026-01-26 08:44:02.604 183087 INFO os_vif [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:e9:54,bridge_name='br-int',has_traffic_filtering=True,id=7b71f06f-9a8f-47ea-a70f-9b837a9823ad,network=Network(36f73b4d-8c08-47d9-bfaa-1e42149ec930),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7b71f06f-9a')
Jan 26 08:44:02 compute-1 nova_compute[183083]: 2026-01-26 08:44:02.605 183087 INFO nova.virt.libvirt.driver [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Deleting instance files /var/lib/nova/instances/f48ffd58-106e-469e-bbbb-7dc61534fc34_del
Jan 26 08:44:02 compute-1 nova_compute[183083]: 2026-01-26 08:44:02.606 183087 INFO nova.virt.libvirt.driver [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Deletion of /var/lib/nova/instances/f48ffd58-106e-469e-bbbb-7dc61534fc34_del complete
Jan 26 08:44:02 compute-1 nova_compute[183083]: 2026-01-26 08:44:02.680 183087 INFO nova.compute.manager [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Took 1.23 seconds to destroy the instance on the hypervisor.
Jan 26 08:44:02 compute-1 nova_compute[183083]: 2026-01-26 08:44:02.683 183087 DEBUG nova.compute.claims [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Aborting claim: <nova.compute.claims.Claim object at 0x7f6cb8755e80> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85
Jan 26 08:44:02 compute-1 nova_compute[183083]: 2026-01-26 08:44:02.684 183087 DEBUG oslo_concurrency.lockutils [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:44:02 compute-1 nova_compute[183083]: 2026-01-26 08:44:02.684 183087 DEBUG oslo_concurrency.lockutils [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:44:02 compute-1 nova_compute[183083]: 2026-01-26 08:44:02.786 183087 DEBUG nova.compute.provider_tree [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:44:02 compute-1 nova_compute[183083]: 2026-01-26 08:44:02.803 183087 DEBUG nova.scheduler.client.report [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 0, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:44:02 compute-1 nova_compute[183083]: 2026-01-26 08:44:02.865 183087 DEBUG oslo_concurrency.lockutils [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:44:02 compute-1 nova_compute[183083]: 2026-01-26 08:44:02.866 183087 DEBUG nova.compute.utils [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430
Jan 26 08:44:02 compute-1 nova_compute[183083]: 2026-01-26 08:44:02.867 183087 ERROR nova.compute.manager [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Build of instance f48ffd58-106e-469e-bbbb-7dc61534fc34 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format: nova.exception.BuildAbortException: Build of instance f48ffd58-106e-469e-bbbb-7dc61534fc34 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:44:02 compute-1 nova_compute[183083]: 2026-01-26 08:44:02.868 183087 DEBUG nova.compute.manager [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976
Jan 26 08:44:02 compute-1 nova_compute[183083]: 2026-01-26 08:44:02.869 183087 DEBUG nova.virt.libvirt.vif [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-26T08:43:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ExtraDhcpOptionsTest-1086481355-test_extra_dhcp_opts_disabled_dhcp6',display_name='tempest-ExtraDhcpOptionsTest-1086481355-test_extra_dhcp_opts_disabled_dhcp6',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host=None,hostname='tempest-extradhcpoptionstest-1086481355-test-extra-dhcp-opts-di',id=2,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDbcjoHylAQtqg1Vl94jB8Y6+G/LV2rXvLmP1GPnqiVf11BnABlWcWIGa5wBxNzQ5L6qFsQZodIPzUEOuiX2g/H28q0JdYWcV+hRwWSFfpY7UxCYzUgdJhIE18mhMw36ag==',key_name='tempest-ExtraDhcpOptionsTest-1086481355',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node=None,numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e482dc8c944c4dc1ba301e69d00ec101',ramdisk_id='',reservation_id='r-il536pjf',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ExtraDhcpOptionsTest-992702786',owner_user_name='tempest-ExtraDhcpOptionsTest-992702786-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:44:02Z,user_data=None,user_id='64bdc9f771e449a8930bafc62d000e64',uuid=f48ffd58-106e-469e-bbbb-7dc61534fc34,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7b71f06f-9a8f-47ea-a70f-9b837a9823ad", "address": "fa:16:3e:03:e9:54", "network": {"id": "36f73b4d-8c08-47d9-bfaa-1e42149ec930", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1086481355-test_extra_dhcp_opts_disabled_dhcp6", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", 
"version": 4, "meta": {}}, "ips": [{"address": "192.168.0.148", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001::/64", "dns": [], "gateway": {"address": "2001::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001::3b3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e482dc8c944c4dc1ba301e69d00ec101", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b71f06f-9a", "ovs_interfaceid": "7b71f06f-9a8f-47ea-a70f-9b837a9823ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:44:02 compute-1 nova_compute[183083]: 2026-01-26 08:44:02.870 183087 DEBUG nova.network.os_vif_util [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Converting VIF {"id": "7b71f06f-9a8f-47ea-a70f-9b837a9823ad", "address": "fa:16:3e:03:e9:54", "network": {"id": "36f73b4d-8c08-47d9-bfaa-1e42149ec930", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1086481355-test_extra_dhcp_opts_disabled_dhcp6", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.148", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001::/64", "dns": [], "gateway": {"address": "2001::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001::3b3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e482dc8c944c4dc1ba301e69d00ec101", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b71f06f-9a", "ovs_interfaceid": "7b71f06f-9a8f-47ea-a70f-9b837a9823ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:44:02 compute-1 nova_compute[183083]: 2026-01-26 08:44:02.872 183087 DEBUG nova.network.os_vif_util [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:e9:54,bridge_name='br-int',has_traffic_filtering=True,id=7b71f06f-9a8f-47ea-a70f-9b837a9823ad,network=Network(36f73b4d-8c08-47d9-bfaa-1e42149ec930),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7b71f06f-9a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:44:02 compute-1 nova_compute[183083]: 2026-01-26 08:44:02.872 183087 DEBUG os_vif [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:e9:54,bridge_name='br-int',has_traffic_filtering=True,id=7b71f06f-9a8f-47ea-a70f-9b837a9823ad,network=Network(36f73b4d-8c08-47d9-bfaa-1e42149ec930),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7b71f06f-9a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:44:02 compute-1 nova_compute[183083]: 2026-01-26 08:44:02.875 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:02 compute-1 nova_compute[183083]: 2026-01-26 08:44:02.875 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b71f06f-9a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:44:02 compute-1 nova_compute[183083]: 2026-01-26 08:44:02.876 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:44:02 compute-1 nova_compute[183083]: 2026-01-26 08:44:02.879 183087 INFO os_vif [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:e9:54,bridge_name='br-int',has_traffic_filtering=True,id=7b71f06f-9a8f-47ea-a70f-9b837a9823ad,network=Network(36f73b4d-8c08-47d9-bfaa-1e42149ec930),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7b71f06f-9a')
Jan 26 08:44:02 compute-1 nova_compute[183083]: 2026-01-26 08:44:02.880 183087 DEBUG nova.compute.manager [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012
Jan 26 08:44:02 compute-1 nova_compute[183083]: 2026-01-26 08:44:02.880 183087 DEBUG nova.compute.manager [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:44:02 compute-1 nova_compute[183083]: 2026-01-26 08:44:02.881 183087 DEBUG nova.network.neutron [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:44:02 compute-1 nova_compute[183083]: 2026-01-26 08:44:02.998 183087 DEBUG nova.network.neutron [req-56fa18b5-6fff-420b-b0c1-753eeaa0c0ea req-84165b80-5a7c-4279-887c-e75356ba0bf5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Updated VIF entry in instance network info cache for port 7b71f06f-9a8f-47ea-a70f-9b837a9823ad. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:44:02 compute-1 nova_compute[183083]: 2026-01-26 08:44:02.998 183087 DEBUG nova.network.neutron [req-56fa18b5-6fff-420b-b0c1-753eeaa0c0ea req-84165b80-5a7c-4279-887c-e75356ba0bf5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Updating instance_info_cache with network_info: [{"id": "7b71f06f-9a8f-47ea-a70f-9b837a9823ad", "address": "fa:16:3e:03:e9:54", "network": {"id": "36f73b4d-8c08-47d9-bfaa-1e42149ec930", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1086481355-test_extra_dhcp_opts_disabled_dhcp6", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.148", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001::/64", "dns": [], "gateway": {"address": "2001::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001::3b3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e482dc8c944c4dc1ba301e69d00ec101", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b71f06f-9a", "ovs_interfaceid": "7b71f06f-9a8f-47ea-a70f-9b837a9823ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:44:03 compute-1 nova_compute[183083]: 2026-01-26 08:44:03.025 183087 DEBUG oslo_concurrency.lockutils [req-56fa18b5-6fff-420b-b0c1-753eeaa0c0ea req-84165b80-5a7c-4279-887c-e75356ba0bf5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-f48ffd58-106e-469e-bbbb-7dc61534fc34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:44:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:44:03.738 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:44:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:44:03.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:44:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:44:03.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:44:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:44:03.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:44:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:44:03.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:44:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:44:03.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:44:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:44:03.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:44:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:44:03.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:44:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:44:03.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:44:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:44:03.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:44:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:44:03.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:44:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:44:03.742 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:44:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:44:03.742 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:44:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:44:03.742 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:44:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:44:03.742 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:44:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:44:03.743 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:44:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:44:03.743 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:44:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:44:03.743 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:44:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:44:03.743 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:44:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:44:03.744 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:44:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:44:03.744 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:44:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:44:03.744 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:44:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:44:03.744 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:44:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:44:03.745 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:44:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:44:03.745 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:44:03 compute-1 ovn_controller[95352]: 2026-01-26T08:44:03Z|00036|pinctrl|WARN|Dropped 1723 log messages in last 60 seconds (most recently, 1 seconds ago) due to excessive rate
Jan 26 08:44:03 compute-1 ovn_controller[95352]: 2026-01-26T08:44:03Z|00037|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 08:44:03 compute-1 nova_compute[183083]: 2026-01-26 08:44:03.952 183087 DEBUG nova.network.neutron [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:44:03 compute-1 nova_compute[183083]: 2026-01-26 08:44:03.976 183087 INFO nova.compute.manager [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: f48ffd58-106e-469e-bbbb-7dc61534fc34] Took 1.09 seconds to deallocate network for instance.
Jan 26 08:44:04 compute-1 nova_compute[183083]: 2026-01-26 08:44:04.163 183087 INFO nova.scheduler.client.report [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Deleted allocations for instance f48ffd58-106e-469e-bbbb-7dc61534fc34
Jan 26 08:44:04 compute-1 nova_compute[183083]: 2026-01-26 08:44:04.165 183087 DEBUG oslo_concurrency.lockutils [None req-ae6b11e5-c03e-4f13-9241-8fcb8b9dcdad 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Lock "f48ffd58-106e-469e-bbbb-7dc61534fc34" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:44:04 compute-1 nova_compute[183083]: 2026-01-26 08:44:04.274 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:05.295 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:44:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:05.296 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:44:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:05.296 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:44:06 compute-1 nova_compute[183083]: 2026-01-26 08:44:06.569 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:07 compute-1 nova_compute[183083]: 2026-01-26 08:44:07.165 183087 DEBUG oslo_concurrency.lockutils [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] Acquiring lock "3abeeab5-4d03-4a66-a84e-5c5bc6280bef" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:44:07 compute-1 nova_compute[183083]: 2026-01-26 08:44:07.166 183087 DEBUG oslo_concurrency.lockutils [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] Lock "3abeeab5-4d03-4a66-a84e-5c5bc6280bef" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:44:07 compute-1 nova_compute[183083]: 2026-01-26 08:44:07.181 183087 DEBUG nova.compute.manager [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:44:07 compute-1 nova_compute[183083]: 2026-01-26 08:44:07.261 183087 DEBUG oslo_concurrency.lockutils [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:44:07 compute-1 nova_compute[183083]: 2026-01-26 08:44:07.262 183087 DEBUG oslo_concurrency.lockutils [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:44:07 compute-1 nova_compute[183083]: 2026-01-26 08:44:07.272 183087 DEBUG nova.virt.hardware [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:44:07 compute-1 nova_compute[183083]: 2026-01-26 08:44:07.273 183087 INFO nova.compute.claims [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:44:07 compute-1 nova_compute[183083]: 2026-01-26 08:44:07.384 183087 DEBUG nova.compute.provider_tree [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:44:07 compute-1 nova_compute[183083]: 2026-01-26 08:44:07.396 183087 DEBUG nova.scheduler.client.report [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 0, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:44:07 compute-1 nova_compute[183083]: 2026-01-26 08:44:07.412 183087 DEBUG oslo_concurrency.lockutils [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:44:07 compute-1 nova_compute[183083]: 2026-01-26 08:44:07.413 183087 DEBUG nova.compute.manager [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:44:07 compute-1 nova_compute[183083]: 2026-01-26 08:44:07.452 183087 DEBUG nova.compute.manager [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:44:07 compute-1 nova_compute[183083]: 2026-01-26 08:44:07.452 183087 DEBUG nova.network.neutron [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:44:07 compute-1 nova_compute[183083]: 2026-01-26 08:44:07.472 183087 INFO nova.virt.libvirt.driver [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:44:07 compute-1 nova_compute[183083]: 2026-01-26 08:44:07.490 183087 DEBUG nova.compute.manager [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:44:07 compute-1 nova_compute[183083]: 2026-01-26 08:44:07.590 183087 DEBUG nova.compute.manager [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:44:07 compute-1 nova_compute[183083]: 2026-01-26 08:44:07.592 183087 DEBUG nova.virt.libvirt.driver [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:44:07 compute-1 nova_compute[183083]: 2026-01-26 08:44:07.593 183087 INFO nova.virt.libvirt.driver [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Creating image(s)
Jan 26 08:44:07 compute-1 nova_compute[183083]: 2026-01-26 08:44:07.594 183087 DEBUG oslo_concurrency.lockutils [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] Acquiring lock "/var/lib/nova/instances/3abeeab5-4d03-4a66-a84e-5c5bc6280bef/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:44:07 compute-1 nova_compute[183083]: 2026-01-26 08:44:07.595 183087 DEBUG oslo_concurrency.lockutils [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] Lock "/var/lib/nova/instances/3abeeab5-4d03-4a66-a84e-5c5bc6280bef/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:44:07 compute-1 nova_compute[183083]: 2026-01-26 08:44:07.596 183087 DEBUG oslo_concurrency.lockutils [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] Lock "/var/lib/nova/instances/3abeeab5-4d03-4a66-a84e-5c5bc6280bef/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:44:07 compute-1 nova_compute[183083]: 2026-01-26 08:44:07.597 183087 DEBUG oslo_concurrency.lockutils [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:44:07 compute-1 nova_compute[183083]: 2026-01-26 08:44:07.597 183087 DEBUG oslo_concurrency.lockutils [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:44:07 compute-1 nova_compute[183083]: 2026-01-26 08:44:07.724 183087 DEBUG nova.policy [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a9b91882390a40bc836cbabfbc0c4f95', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1addc70e165144c7b6438517cd7d5b47', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.567 183087 DEBUG nova.network.neutron [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Successfully created port: 4af39105-c745-43ec-ab62-e63c323c349b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 DEBUG oslo_concurrency.lockutils [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.038s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Instance failed to spawn: nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Traceback (most recent call last):
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 170, in do_image_deep_inspection
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef]     raise exception.ImageUnacceptable(
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image content does not match disk_format
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] 
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] During handling of the above exception, another exception occurred:
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] 
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Traceback (most recent call last):
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef]     yield resources
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef]     self.driver.spawn(context, instance, image_meta,
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4384, in spawn
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef]     created_instance_dir, created_disks = self._create_image(
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4786, in _create_image
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef]     created_disks = self._create_and_inject_local_root(
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4917, in _create_and_inject_local_root
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef]     self._try_fetch_image_cache(backend, fetch_func, context,
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 10963, in _try_fetch_image_cache
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef]     image.cache(fetch_func=fetch_func,
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 288, in cache
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef]     self.create_image(fetch_func_sync, base, size,
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 663, in create_image
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef]     prepare_template(target=base, *args, **kwargs)
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef]   File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef]     return f(*args, **kwargs)
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 285, in fetch_func_sync
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef]     fetch_func(target=target, *args, **kwargs)
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/utils.py", line 482, in fetch_image
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef]     images.fetch_to_raw(context, image_id, target, trusted_certs)
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 199, in fetch_to_raw
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef]     force_format = do_image_deep_inspection(img, image_href, path_tmp)
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 180, in do_image_deep_inspection
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef]     raise exception.ImageUnacceptable(
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:44:08 compute-1 nova_compute[183083]: 2026-01-26 08:44:08.636 183087 ERROR nova.compute.manager [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] 
Jan 26 08:44:09 compute-1 nova_compute[183083]: 2026-01-26 08:44:09.307 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:10 compute-1 nova_compute[183083]: 2026-01-26 08:44:10.017 183087 DEBUG nova.network.neutron [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Successfully updated port: 4af39105-c745-43ec-ab62-e63c323c349b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:44:10 compute-1 nova_compute[183083]: 2026-01-26 08:44:10.035 183087 DEBUG oslo_concurrency.lockutils [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] Acquiring lock "refresh_cache-3abeeab5-4d03-4a66-a84e-5c5bc6280bef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:44:10 compute-1 nova_compute[183083]: 2026-01-26 08:44:10.035 183087 DEBUG oslo_concurrency.lockutils [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] Acquired lock "refresh_cache-3abeeab5-4d03-4a66-a84e-5c5bc6280bef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:44:10 compute-1 nova_compute[183083]: 2026-01-26 08:44:10.036 183087 DEBUG nova.network.neutron [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:44:10 compute-1 nova_compute[183083]: 2026-01-26 08:44:10.345 183087 DEBUG nova.network.neutron [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:44:11 compute-1 nova_compute[183083]: 2026-01-26 08:44:11.443 183087 DEBUG nova.compute.manager [req-acdca18f-7965-46cf-ab97-16e115d904c7 req-68dfc264-8387-4cd6-89b4-3e0b722a65ff 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Received event network-changed-4af39105-c745-43ec-ab62-e63c323c349b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:44:11 compute-1 nova_compute[183083]: 2026-01-26 08:44:11.444 183087 DEBUG nova.compute.manager [req-acdca18f-7965-46cf-ab97-16e115d904c7 req-68dfc264-8387-4cd6-89b4-3e0b722a65ff 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Refreshing instance network info cache due to event network-changed-4af39105-c745-43ec-ab62-e63c323c349b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:44:11 compute-1 nova_compute[183083]: 2026-01-26 08:44:11.444 183087 DEBUG oslo_concurrency.lockutils [req-acdca18f-7965-46cf-ab97-16e115d904c7 req-68dfc264-8387-4cd6-89b4-3e0b722a65ff 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-3abeeab5-4d03-4a66-a84e-5c5bc6280bef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:44:11 compute-1 nova_compute[183083]: 2026-01-26 08:44:11.573 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.028 183087 DEBUG nova.network.neutron [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Updating instance_info_cache with network_info: [{"id": "4af39105-c745-43ec-ab62-e63c323c349b", "address": "fa:16:3e:b4:a5:36", "network": {"id": "b0ba5496-10bf-4202-81f7-a7fd0141b5de", "bridge": "br-int", "label": "tempest-test-network--12205755", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1addc70e165144c7b6438517cd7d5b47", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4af39105-c7", "ovs_interfaceid": "4af39105-c745-43ec-ab62-e63c323c349b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.050 183087 DEBUG oslo_concurrency.lockutils [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] Releasing lock "refresh_cache-3abeeab5-4d03-4a66-a84e-5c5bc6280bef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.051 183087 DEBUG nova.compute.manager [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Instance network_info: |[{"id": "4af39105-c745-43ec-ab62-e63c323c349b", "address": "fa:16:3e:b4:a5:36", "network": {"id": "b0ba5496-10bf-4202-81f7-a7fd0141b5de", "bridge": "br-int", "label": "tempest-test-network--12205755", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1addc70e165144c7b6438517cd7d5b47", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4af39105-c7", "ovs_interfaceid": "4af39105-c745-43ec-ab62-e63c323c349b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.051 183087 DEBUG oslo_concurrency.lockutils [req-acdca18f-7965-46cf-ab97-16e115d904c7 req-68dfc264-8387-4cd6-89b4-3e0b722a65ff 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-3abeeab5-4d03-4a66-a84e-5c5bc6280bef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.052 183087 DEBUG nova.network.neutron [req-acdca18f-7965-46cf-ab97-16e115d904c7 req-68dfc264-8387-4cd6-89b4-3e0b722a65ff 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Refreshing network info cache for port 4af39105-c745-43ec-ab62-e63c323c349b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.056 183087 INFO nova.compute.manager [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Terminating instance
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.058 183087 DEBUG nova.compute.manager [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.064 183087 DEBUG nova.virt.libvirt.driver [-] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.064 183087 INFO nova.virt.libvirt.driver [-] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Instance destroyed successfully.
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.066 183087 DEBUG nova.virt.libvirt.vif [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:44:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-broadcast-sender-370346091',display_name='tempest-broadcast-sender-370346091',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-broadcast-sender-370346091',id=3,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCjptb8cevol70wF6KOQe0EUMh7PA0Cs2q6xlyGkW5Cd2/ChGxpmoxyl60e7fN1sWYKA1iq04CYfUgs4tDrdsY2+WMKZSrvrRoyXN2BZavxGvXPnxjDo8bTKcsV4c4GrvA==',key_name='tempest-keypair-test-1588346826',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1addc70e165144c7b6438517cd7d5b47',ramdisk_id='',reservation_id='r-arcaugtk',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-BroadcastTestIPv4Common-558663779',owner_user_name='tempest-BroadcastTestIPv4Common-558663779-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:44:07Z,user_data=None,user_id='a9b91882390a40bc836cbabfbc0c4f95',uuid=3abeeab5-4d03-4a66-a84e-5c5bc6280bef,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4af39105-c745-43ec-ab62-e63c323c349b", "address": "fa:16:3e:b4:a5:36", "network": {"id": "b0ba5496-10bf-4202-81f7-a7fd0141b5de", "bridge": "br-int", "label": "tempest-test-network--12205755", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1addc70e165144c7b6438517cd7d5b47", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4af39105-c7", "ovs_interfaceid": "4af39105-c745-43ec-ab62-e63c323c349b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.067 183087 DEBUG nova.network.os_vif_util [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] Converting VIF {"id": "4af39105-c745-43ec-ab62-e63c323c349b", "address": "fa:16:3e:b4:a5:36", "network": {"id": "b0ba5496-10bf-4202-81f7-a7fd0141b5de", "bridge": "br-int", "label": "tempest-test-network--12205755", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1addc70e165144c7b6438517cd7d5b47", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4af39105-c7", "ovs_interfaceid": "4af39105-c745-43ec-ab62-e63c323c349b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.068 183087 DEBUG nova.network.os_vif_util [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b4:a5:36,bridge_name='br-int',has_traffic_filtering=True,id=4af39105-c745-43ec-ab62-e63c323c349b,network=Network(b0ba5496-10bf-4202-81f7-a7fd0141b5de),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4af39105-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.068 183087 DEBUG os_vif [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:a5:36,bridge_name='br-int',has_traffic_filtering=True,id=4af39105-c745-43ec-ab62-e63c323c349b,network=Network(b0ba5496-10bf-4202-81f7-a7fd0141b5de),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4af39105-c7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.071 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.071 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4af39105-c7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.072 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.080 183087 INFO os_vif [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:a5:36,bridge_name='br-int',has_traffic_filtering=True,id=4af39105-c745-43ec-ab62-e63c323c349b,network=Network(b0ba5496-10bf-4202-81f7-a7fd0141b5de),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4af39105-c7')
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.080 183087 INFO nova.virt.libvirt.driver [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Deleting instance files /var/lib/nova/instances/3abeeab5-4d03-4a66-a84e-5c5bc6280bef_del
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.081 183087 INFO nova.virt.libvirt.driver [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Deletion of /var/lib/nova/instances/3abeeab5-4d03-4a66-a84e-5c5bc6280bef_del complete
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.183 183087 INFO nova.compute.manager [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Took 0.12 seconds to destroy the instance on the hypervisor.
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.184 183087 DEBUG nova.compute.claims [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Aborting claim: <nova.compute.claims.Claim object at 0x7f6c985ef880> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.185 183087 DEBUG oslo_concurrency.lockutils [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.185 183087 DEBUG oslo_concurrency.lockutils [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.296 183087 DEBUG nova.compute.provider_tree [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.315 183087 DEBUG nova.scheduler.client.report [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 0, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.339 183087 DEBUG oslo_concurrency.lockutils [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.341 183087 DEBUG nova.compute.utils [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.342 183087 ERROR nova.compute.manager [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Build of instance 3abeeab5-4d03-4a66-a84e-5c5bc6280bef aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format: nova.exception.BuildAbortException: Build of instance 3abeeab5-4d03-4a66-a84e-5c5bc6280bef aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.343 183087 DEBUG nova.compute.manager [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.344 183087 DEBUG nova.virt.libvirt.vif [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-26T08:44:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-broadcast-sender-370346091',display_name='tempest-broadcast-sender-370346091',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host=None,hostname='tempest-broadcast-sender-370346091',id=3,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCjptb8cevol70wF6KOQe0EUMh7PA0Cs2q6xlyGkW5Cd2/ChGxpmoxyl60e7fN1sWYKA1iq04CYfUgs4tDrdsY2+WMKZSrvrRoyXN2BZavxGvXPnxjDo8bTKcsV4c4GrvA==',key_name='tempest-keypair-test-1588346826',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node=None,numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1addc70e165144c7b6438517cd7d5b47',ramdisk_id='',reservation_id='r-arcaugtk',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-BroadcastTestIPv4Common-558663779',owner_user_name='tempest-BroadcastTestIPv4Common-558663779-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:44:12Z,user_data=None,user_id='a9b91882390a40bc836cbabfbc0c4f95',uuid=3abeeab5-4d03-4a66-a84e-5c5bc6280bef,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4af39105-c745-43ec-ab62-e63c323c349b", "address": "fa:16:3e:b4:a5:36", "network": {"id": "b0ba5496-10bf-4202-81f7-a7fd0141b5de", "bridge": "br-int", "label": "tempest-test-network--12205755", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1addc70e165144c7b6438517cd7d5b47", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4af39105-c7", "ovs_interfaceid": "4af39105-c745-43ec-ab62-e63c323c349b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.344 183087 DEBUG nova.network.os_vif_util [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] Converting VIF {"id": "4af39105-c745-43ec-ab62-e63c323c349b", "address": "fa:16:3e:b4:a5:36", "network": {"id": "b0ba5496-10bf-4202-81f7-a7fd0141b5de", "bridge": "br-int", "label": "tempest-test-network--12205755", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1addc70e165144c7b6438517cd7d5b47", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4af39105-c7", "ovs_interfaceid": "4af39105-c745-43ec-ab62-e63c323c349b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.345 183087 DEBUG nova.network.os_vif_util [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b4:a5:36,bridge_name='br-int',has_traffic_filtering=True,id=4af39105-c745-43ec-ab62-e63c323c349b,network=Network(b0ba5496-10bf-4202-81f7-a7fd0141b5de),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4af39105-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.346 183087 DEBUG os_vif [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:a5:36,bridge_name='br-int',has_traffic_filtering=True,id=4af39105-c745-43ec-ab62-e63c323c349b,network=Network(b0ba5496-10bf-4202-81f7-a7fd0141b5de),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4af39105-c7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.348 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.349 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4af39105-c7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.349 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.357 183087 INFO os_vif [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:a5:36,bridge_name='br-int',has_traffic_filtering=True,id=4af39105-c745-43ec-ab62-e63c323c349b,network=Network(b0ba5496-10bf-4202-81f7-a7fd0141b5de),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4af39105-c7')
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.358 183087 DEBUG nova.compute.manager [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.358 183087 DEBUG nova.compute.manager [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:44:12 compute-1 nova_compute[183083]: 2026-01-26 08:44:12.359 183087 DEBUG nova.network.neutron [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:44:12 compute-1 podman[212124]: 2026-01-26 08:44:12.832893457 +0000 UTC m=+0.093222653 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 26 08:44:12 compute-1 podman[212125]: 2026-01-26 08:44:12.836361875 +0000 UTC m=+0.088891421 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=ubi9-minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350, vcs-type=git, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc.)
Jan 26 08:44:13 compute-1 nova_compute[183083]: 2026-01-26 08:44:13.717 183087 DEBUG nova.network.neutron [req-acdca18f-7965-46cf-ab97-16e115d904c7 req-68dfc264-8387-4cd6-89b4-3e0b722a65ff 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Updated VIF entry in instance network info cache for port 4af39105-c745-43ec-ab62-e63c323c349b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:44:13 compute-1 nova_compute[183083]: 2026-01-26 08:44:13.718 183087 DEBUG nova.network.neutron [req-acdca18f-7965-46cf-ab97-16e115d904c7 req-68dfc264-8387-4cd6-89b4-3e0b722a65ff 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Updating instance_info_cache with network_info: [{"id": "4af39105-c745-43ec-ab62-e63c323c349b", "address": "fa:16:3e:b4:a5:36", "network": {"id": "b0ba5496-10bf-4202-81f7-a7fd0141b5de", "bridge": "br-int", "label": "tempest-test-network--12205755", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1addc70e165144c7b6438517cd7d5b47", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4af39105-c7", "ovs_interfaceid": "4af39105-c745-43ec-ab62-e63c323c349b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:44:13 compute-1 nova_compute[183083]: 2026-01-26 08:44:13.734 183087 DEBUG nova.network.neutron [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:44:13 compute-1 nova_compute[183083]: 2026-01-26 08:44:13.737 183087 DEBUG oslo_concurrency.lockutils [req-acdca18f-7965-46cf-ab97-16e115d904c7 req-68dfc264-8387-4cd6-89b4-3e0b722a65ff 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-3abeeab5-4d03-4a66-a84e-5c5bc6280bef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:44:13 compute-1 nova_compute[183083]: 2026-01-26 08:44:13.761 183087 INFO nova.compute.manager [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] [instance: 3abeeab5-4d03-4a66-a84e-5c5bc6280bef] Took 1.40 seconds to deallocate network for instance.
Jan 26 08:44:13 compute-1 nova_compute[183083]: 2026-01-26 08:44:13.936 183087 INFO nova.scheduler.client.report [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] Deleted allocations for instance 3abeeab5-4d03-4a66-a84e-5c5bc6280bef
Jan 26 08:44:13 compute-1 nova_compute[183083]: 2026-01-26 08:44:13.937 183087 DEBUG oslo_concurrency.lockutils [None req-631a6d64-b88e-4a28-97fb-99aa78ab4a3f a9b91882390a40bc836cbabfbc0c4f95 1addc70e165144c7b6438517cd7d5b47 - - default default] Lock "3abeeab5-4d03-4a66-a84e-5c5bc6280bef" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.771s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:44:14 compute-1 nova_compute[183083]: 2026-01-26 08:44:14.308 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:16 compute-1 nova_compute[183083]: 2026-01-26 08:44:16.577 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:19 compute-1 nova_compute[183083]: 2026-01-26 08:44:19.311 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:19 compute-1 podman[212165]: 2026-01-26 08:44:19.839654625 +0000 UTC m=+0.096121363 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 26 08:44:19 compute-1 podman[212171]: 2026-01-26 08:44:19.850822799 +0000 UTC m=+0.076172542 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 08:44:21 compute-1 nova_compute[183083]: 2026-01-26 08:44:21.582 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:21 compute-1 nova_compute[183083]: 2026-01-26 08:44:21.616 183087 DEBUG oslo_concurrency.lockutils [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] Acquiring lock "ec9cf498-a940-4975-aa76-6a35fbda4fd7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:44:21 compute-1 nova_compute[183083]: 2026-01-26 08:44:21.617 183087 DEBUG oslo_concurrency.lockutils [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] Lock "ec9cf498-a940-4975-aa76-6a35fbda4fd7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:44:21 compute-1 nova_compute[183083]: 2026-01-26 08:44:21.702 183087 DEBUG nova.compute.manager [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:44:21 compute-1 podman[212213]: 2026-01-26 08:44:21.800993607 +0000 UTC m=+0.066190642 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 26 08:44:21 compute-1 nova_compute[183083]: 2026-01-26 08:44:21.950 183087 DEBUG oslo_concurrency.lockutils [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:44:21 compute-1 nova_compute[183083]: 2026-01-26 08:44:21.950 183087 DEBUG oslo_concurrency.lockutils [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:44:21 compute-1 nova_compute[183083]: 2026-01-26 08:44:21.956 183087 DEBUG nova.virt.hardware [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:44:21 compute-1 nova_compute[183083]: 2026-01-26 08:44:21.957 183087 INFO nova.compute.claims [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:44:22 compute-1 nova_compute[183083]: 2026-01-26 08:44:22.089 183087 DEBUG nova.compute.provider_tree [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:44:22 compute-1 nova_compute[183083]: 2026-01-26 08:44:22.134 183087 DEBUG nova.scheduler.client.report [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 0, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:44:22 compute-1 nova_compute[183083]: 2026-01-26 08:44:22.163 183087 DEBUG oslo_concurrency.lockutils [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:44:22 compute-1 nova_compute[183083]: 2026-01-26 08:44:22.164 183087 DEBUG nova.compute.manager [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:44:22 compute-1 nova_compute[183083]: 2026-01-26 08:44:22.242 183087 DEBUG nova.compute.manager [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:44:22 compute-1 nova_compute[183083]: 2026-01-26 08:44:22.243 183087 DEBUG nova.network.neutron [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:44:22 compute-1 nova_compute[183083]: 2026-01-26 08:44:22.282 183087 INFO nova.virt.libvirt.driver [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:44:22 compute-1 nova_compute[183083]: 2026-01-26 08:44:22.322 183087 DEBUG nova.compute.manager [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:44:22 compute-1 nova_compute[183083]: 2026-01-26 08:44:22.446 183087 DEBUG nova.compute.manager [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:44:22 compute-1 nova_compute[183083]: 2026-01-26 08:44:22.448 183087 DEBUG nova.virt.libvirt.driver [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:44:22 compute-1 nova_compute[183083]: 2026-01-26 08:44:22.448 183087 INFO nova.virt.libvirt.driver [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Creating image(s)
Jan 26 08:44:22 compute-1 nova_compute[183083]: 2026-01-26 08:44:22.449 183087 DEBUG oslo_concurrency.lockutils [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] Acquiring lock "/var/lib/nova/instances/ec9cf498-a940-4975-aa76-6a35fbda4fd7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:44:22 compute-1 nova_compute[183083]: 2026-01-26 08:44:22.449 183087 DEBUG oslo_concurrency.lockutils [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] Lock "/var/lib/nova/instances/ec9cf498-a940-4975-aa76-6a35fbda4fd7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:44:22 compute-1 nova_compute[183083]: 2026-01-26 08:44:22.449 183087 DEBUG oslo_concurrency.lockutils [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] Lock "/var/lib/nova/instances/ec9cf498-a940-4975-aa76-6a35fbda4fd7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:44:22 compute-1 nova_compute[183083]: 2026-01-26 08:44:22.450 183087 DEBUG oslo_concurrency.lockutils [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:44:22 compute-1 nova_compute[183083]: 2026-01-26 08:44:22.450 183087 DEBUG oslo_concurrency.lockutils [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:44:22 compute-1 nova_compute[183083]: 2026-01-26 08:44:22.509 183087 DEBUG nova.policy [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7d6129e3a2464eb184d043c6f41a33e4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8b5c686f73574cf6bda27fafd1e6a955', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.477 183087 DEBUG oslo_concurrency.lockutils [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.027s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Instance failed to spawn: nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Traceback (most recent call last):
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 170, in do_image_deep_inspection
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7]     raise exception.ImageUnacceptable(
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image content does not match disk_format
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] 
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] During handling of the above exception, another exception occurred:
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] 
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Traceback (most recent call last):
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7]     yield resources
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7]     self.driver.spawn(context, instance, image_meta,
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4384, in spawn
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7]     created_instance_dir, created_disks = self._create_image(
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4786, in _create_image
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7]     created_disks = self._create_and_inject_local_root(
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4917, in _create_and_inject_local_root
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7]     self._try_fetch_image_cache(backend, fetch_func, context,
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 10963, in _try_fetch_image_cache
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7]     image.cache(fetch_func=fetch_func,
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 288, in cache
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7]     self.create_image(fetch_func_sync, base, size,
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 663, in create_image
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7]     prepare_template(target=base, *args, **kwargs)
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7]   File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7]     return f(*args, **kwargs)
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 285, in fetch_func_sync
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7]     fetch_func(target=target, *args, **kwargs)
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/utils.py", line 482, in fetch_image
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7]     images.fetch_to_raw(context, image_id, target, trusted_certs)
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 199, in fetch_to_raw
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7]     force_format = do_image_deep_inspection(img, image_href, path_tmp)
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 180, in do_image_deep_inspection
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7]     raise exception.ImageUnacceptable(
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:44:23 compute-1 nova_compute[183083]: 2026-01-26 08:44:23.478 183087 ERROR nova.compute.manager [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] 
Jan 26 08:44:24 compute-1 nova_compute[183083]: 2026-01-26 08:44:24.314 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:24 compute-1 nova_compute[183083]: 2026-01-26 08:44:24.388 183087 DEBUG nova.network.neutron [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Successfully created port: ea806abd-8726-49ab-bcf1-7bad4be962d4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 08:44:26 compute-1 nova_compute[183083]: 2026-01-26 08:44:26.586 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:27 compute-1 nova_compute[183083]: 2026-01-26 08:44:27.934 183087 DEBUG nova.network.neutron [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Successfully updated port: ea806abd-8726-49ab-bcf1-7bad4be962d4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:44:27 compute-1 nova_compute[183083]: 2026-01-26 08:44:27.995 183087 DEBUG oslo_concurrency.lockutils [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] Acquiring lock "refresh_cache-ec9cf498-a940-4975-aa76-6a35fbda4fd7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:44:27 compute-1 nova_compute[183083]: 2026-01-26 08:44:27.995 183087 DEBUG oslo_concurrency.lockutils [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] Acquired lock "refresh_cache-ec9cf498-a940-4975-aa76-6a35fbda4fd7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:44:27 compute-1 nova_compute[183083]: 2026-01-26 08:44:27.996 183087 DEBUG nova.network.neutron [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:44:28 compute-1 nova_compute[183083]: 2026-01-26 08:44:28.871 183087 DEBUG nova.network.neutron [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:44:29 compute-1 nova_compute[183083]: 2026-01-26 08:44:29.315 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.057 183087 DEBUG nova.compute.manager [req-72147dc8-720f-4a5b-9088-77df95c5a7ea req-a5742708-da70-4785-a3be-42687c9b99d1 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Received event network-changed-ea806abd-8726-49ab-bcf1-7bad4be962d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.058 183087 DEBUG nova.compute.manager [req-72147dc8-720f-4a5b-9088-77df95c5a7ea req-a5742708-da70-4785-a3be-42687c9b99d1 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Refreshing instance network info cache due to event network-changed-ea806abd-8726-49ab-bcf1-7bad4be962d4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.059 183087 DEBUG oslo_concurrency.lockutils [req-72147dc8-720f-4a5b-9088-77df95c5a7ea req-a5742708-da70-4785-a3be-42687c9b99d1 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-ec9cf498-a940-4975-aa76-6a35fbda4fd7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.594 183087 DEBUG nova.network.neutron [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Updating instance_info_cache with network_info: [{"id": "ea806abd-8726-49ab-bcf1-7bad4be962d4", "address": "fa:16:3e:05:7d:fb", "network": {"id": "c70bcd79-0ac3-47dd-befb-792bcecbc13c", "bridge": "br-int", "label": "tempest-test-network--1730423673", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b5c686f73574cf6bda27fafd1e6a955", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea806abd-87", "ovs_interfaceid": "ea806abd-8726-49ab-bcf1-7bad4be962d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.638 183087 DEBUG oslo_concurrency.lockutils [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] Releasing lock "refresh_cache-ec9cf498-a940-4975-aa76-6a35fbda4fd7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.638 183087 DEBUG nova.compute.manager [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Instance network_info: |[{"id": "ea806abd-8726-49ab-bcf1-7bad4be962d4", "address": "fa:16:3e:05:7d:fb", "network": {"id": "c70bcd79-0ac3-47dd-befb-792bcecbc13c", "bridge": "br-int", "label": "tempest-test-network--1730423673", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b5c686f73574cf6bda27fafd1e6a955", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea806abd-87", "ovs_interfaceid": "ea806abd-8726-49ab-bcf1-7bad4be962d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.639 183087 DEBUG oslo_concurrency.lockutils [req-72147dc8-720f-4a5b-9088-77df95c5a7ea req-a5742708-da70-4785-a3be-42687c9b99d1 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-ec9cf498-a940-4975-aa76-6a35fbda4fd7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.639 183087 DEBUG nova.network.neutron [req-72147dc8-720f-4a5b-9088-77df95c5a7ea req-a5742708-da70-4785-a3be-42687c9b99d1 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Refreshing network info cache for port ea806abd-8726-49ab-bcf1-7bad4be962d4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.640 183087 INFO nova.compute.manager [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Terminating instance
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.641 183087 DEBUG nova.compute.manager [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.644 183087 DEBUG nova.virt.libvirt.driver [-] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.644 183087 INFO nova.virt.libvirt.driver [-] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Instance destroyed successfully.
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.645 183087 DEBUG nova.virt.libvirt.vif [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:44:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_igmp_snooping_same_network_and_unsubscribe-422610518',display_name='tempest-test_igmp_snooping_same_network_and_unsubscribe-422610518',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-igmp-snooping-same-network-and-unsubscribe-4226105',id=7,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLW3QctowAuTRomDzwX/50c0w4gj7v964zplpIppVscgxp9RpPQOzwEuDyUt6jByTqmBt67G7M/nmKVwfwyD2eB9ia5UAMUkghAJ2GRKx1D906KgHI4AV6DAOYWxJlQb7w==',key_name='tempest-keypair-test-1949113140',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8b5c686f73574cf6bda27fafd1e6a955',ramdisk_id='',reservation_id='r-phzeefni',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestIPv4Common-937437123',owner_user_name='tempest-MulticastTestIPv4Common-937437123-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:44:22Z,user_data=None,user_id='7d6129e3a2464eb184d043c6f41a33e4',uuid=ec9cf498-a940-4975-aa76-6a35fbda4fd7,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ea806abd-8726-49ab-bcf1-7bad4be962d4", "address": "fa:16:3e:05:7d:fb", "network": {"id": "c70bcd79-0ac3-47dd-befb-792bcecbc13c", "bridge": "br-int", "label": "tempest-test-network--1730423673", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b5c686f73574cf6bda27fafd1e6a955", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea806abd-87", "ovs_interfaceid": "ea806abd-8726-49ab-bcf1-7bad4be962d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.645 183087 DEBUG nova.network.os_vif_util [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] Converting VIF {"id": "ea806abd-8726-49ab-bcf1-7bad4be962d4", "address": "fa:16:3e:05:7d:fb", "network": {"id": "c70bcd79-0ac3-47dd-befb-792bcecbc13c", "bridge": "br-int", "label": "tempest-test-network--1730423673", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b5c686f73574cf6bda27fafd1e6a955", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea806abd-87", "ovs_interfaceid": "ea806abd-8726-49ab-bcf1-7bad4be962d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.646 183087 DEBUG nova.network.os_vif_util [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:7d:fb,bridge_name='br-int',has_traffic_filtering=True,id=ea806abd-8726-49ab-bcf1-7bad4be962d4,network=Network(c70bcd79-0ac3-47dd-befb-792bcecbc13c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea806abd-87') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.646 183087 DEBUG os_vif [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:7d:fb,bridge_name='br-int',has_traffic_filtering=True,id=ea806abd-8726-49ab-bcf1-7bad4be962d4,network=Network(c70bcd79-0ac3-47dd-befb-792bcecbc13c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea806abd-87') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.647 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.648 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapea806abd-87, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.648 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.650 183087 INFO os_vif [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:7d:fb,bridge_name='br-int',has_traffic_filtering=True,id=ea806abd-8726-49ab-bcf1-7bad4be962d4,network=Network(c70bcd79-0ac3-47dd-befb-792bcecbc13c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea806abd-87')
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.650 183087 INFO nova.virt.libvirt.driver [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Deleting instance files /var/lib/nova/instances/ec9cf498-a940-4975-aa76-6a35fbda4fd7_del
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.650 183087 INFO nova.virt.libvirt.driver [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Deletion of /var/lib/nova/instances/ec9cf498-a940-4975-aa76-6a35fbda4fd7_del complete
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.730 183087 INFO nova.compute.manager [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Took 0.09 seconds to destroy the instance on the hypervisor.
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.731 183087 DEBUG nova.compute.claims [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Aborting claim: <nova.compute.claims.Claim object at 0x7f6c98541b80> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.731 183087 DEBUG oslo_concurrency.lockutils [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.731 183087 DEBUG oslo_concurrency.lockutils [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.833 183087 DEBUG nova.compute.provider_tree [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.866 183087 DEBUG nova.scheduler.client.report [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 0, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.886 183087 DEBUG oslo_concurrency.lockutils [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.886 183087 DEBUG nova.compute.utils [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.887 183087 ERROR nova.compute.manager [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Build of instance ec9cf498-a940-4975-aa76-6a35fbda4fd7 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format: nova.exception.BuildAbortException: Build of instance ec9cf498-a940-4975-aa76-6a35fbda4fd7 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.888 183087 DEBUG nova.compute.manager [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.888 183087 DEBUG nova.virt.libvirt.vif [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-26T08:44:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_igmp_snooping_same_network_and_unsubscribe-422610518',display_name='tempest-test_igmp_snooping_same_network_and_unsubscribe-422610518',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host=None,hostname='tempest-test-igmp-snooping-same-network-and-unsubscribe-4226105',id=7,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLW3QctowAuTRomDzwX/50c0w4gj7v964zplpIppVscgxp9RpPQOzwEuDyUt6jByTqmBt67G7M/nmKVwfwyD2eB9ia5UAMUkghAJ2GRKx1D906KgHI4AV6DAOYWxJlQb7w==',key_name='tempest-keypair-test-1949113140',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node=None,numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8b5c686f73574cf6bda27fafd1e6a955',ramdisk_id='',reservation_id='r-phzeefni',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestIPv4Common-937437123',owner_user_name='tempest-MulticastTestIPv4Common-937437123-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:44:30Z,user_data=None,user_id='7d6129e3a2464eb184d043c6f41a33e4',uuid=ec9cf498-a940-4975-aa76-6a35fbda4fd7,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ea806abd-8726-49ab-bcf1-7bad4be962d4", "address": "fa:16:3e:05:7d:fb", "network": {"id": "c70bcd79-0ac3-47dd-befb-792bcecbc13c", "bridge": "br-int", "label": "tempest-test-network--1730423673", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b5c686f73574cf6bda27fafd1e6a955", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea806abd-87", "ovs_interfaceid": "ea806abd-8726-49ab-bcf1-7bad4be962d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.888 183087 DEBUG nova.network.os_vif_util [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] Converting VIF {"id": "ea806abd-8726-49ab-bcf1-7bad4be962d4", "address": "fa:16:3e:05:7d:fb", "network": {"id": "c70bcd79-0ac3-47dd-befb-792bcecbc13c", "bridge": "br-int", "label": "tempest-test-network--1730423673", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b5c686f73574cf6bda27fafd1e6a955", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea806abd-87", "ovs_interfaceid": "ea806abd-8726-49ab-bcf1-7bad4be962d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.889 183087 DEBUG nova.network.os_vif_util [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:7d:fb,bridge_name='br-int',has_traffic_filtering=True,id=ea806abd-8726-49ab-bcf1-7bad4be962d4,network=Network(c70bcd79-0ac3-47dd-befb-792bcecbc13c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea806abd-87') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.889 183087 DEBUG os_vif [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:7d:fb,bridge_name='br-int',has_traffic_filtering=True,id=ea806abd-8726-49ab-bcf1-7bad4be962d4,network=Network(c70bcd79-0ac3-47dd-befb-792bcecbc13c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea806abd-87') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.890 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.891 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapea806abd-87, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.891 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.893 183087 INFO os_vif [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:7d:fb,bridge_name='br-int',has_traffic_filtering=True,id=ea806abd-8726-49ab-bcf1-7bad4be962d4,network=Network(c70bcd79-0ac3-47dd-befb-792bcecbc13c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea806abd-87')
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.893 183087 DEBUG nova.compute.manager [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.893 183087 DEBUG nova.compute.manager [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:44:30 compute-1 nova_compute[183083]: 2026-01-26 08:44:30.893 183087 DEBUG nova.network.neutron [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:44:31 compute-1 nova_compute[183083]: 2026-01-26 08:44:31.630 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:31 compute-1 podman[212233]: 2026-01-26 08:44:31.832591512 +0000 UTC m=+0.084586468 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 26 08:44:32 compute-1 nova_compute[183083]: 2026-01-26 08:44:32.765 183087 DEBUG nova.network.neutron [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:44:32 compute-1 nova_compute[183083]: 2026-01-26 08:44:32.791 183087 INFO nova.compute.manager [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Took 1.90 seconds to deallocate network for instance.
Jan 26 08:44:33 compute-1 nova_compute[183083]: 2026-01-26 08:44:33.012 183087 INFO nova.scheduler.client.report [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] Deleted allocations for instance ec9cf498-a940-4975-aa76-6a35fbda4fd7
Jan 26 08:44:33 compute-1 nova_compute[183083]: 2026-01-26 08:44:33.012 183087 DEBUG oslo_concurrency.lockutils [None req-41957d8e-8fb1-4675-873e-5e26b7b1161d 7d6129e3a2464eb184d043c6f41a33e4 8b5c686f73574cf6bda27fafd1e6a955 - - default default] Lock "ec9cf498-a940-4975-aa76-6a35fbda4fd7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.395s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:44:33 compute-1 nova_compute[183083]: 2026-01-26 08:44:33.637 183087 DEBUG nova.network.neutron [req-72147dc8-720f-4a5b-9088-77df95c5a7ea req-a5742708-da70-4785-a3be-42687c9b99d1 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Updated VIF entry in instance network info cache for port ea806abd-8726-49ab-bcf1-7bad4be962d4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:44:33 compute-1 nova_compute[183083]: 2026-01-26 08:44:33.638 183087 DEBUG nova.network.neutron [req-72147dc8-720f-4a5b-9088-77df95c5a7ea req-a5742708-da70-4785-a3be-42687c9b99d1 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: ec9cf498-a940-4975-aa76-6a35fbda4fd7] Updating instance_info_cache with network_info: [{"id": "ea806abd-8726-49ab-bcf1-7bad4be962d4", "address": "fa:16:3e:05:7d:fb", "network": {"id": "c70bcd79-0ac3-47dd-befb-792bcecbc13c", "bridge": "br-int", "label": "tempest-test-network--1730423673", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b5c686f73574cf6bda27fafd1e6a955", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea806abd-87", "ovs_interfaceid": "ea806abd-8726-49ab-bcf1-7bad4be962d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:44:33 compute-1 nova_compute[183083]: 2026-01-26 08:44:33.657 183087 DEBUG oslo_concurrency.lockutils [req-72147dc8-720f-4a5b-9088-77df95c5a7ea req-a5742708-da70-4785-a3be-42687c9b99d1 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-ec9cf498-a940-4975-aa76-6a35fbda4fd7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:44:34 compute-1 nova_compute[183083]: 2026-01-26 08:44:34.317 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:36 compute-1 nova_compute[183083]: 2026-01-26 08:44:36.634 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:36 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:36.838 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:44:36 compute-1 nova_compute[183083]: 2026-01-26 08:44:36.839 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:36 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:36.840 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 08:44:38 compute-1 sshd-session[212258]: Connection closed by authenticating user root 159.223.236.81 port 50648 [preauth]
Jan 26 08:44:39 compute-1 nova_compute[183083]: 2026-01-26 08:44:39.319 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:41 compute-1 nova_compute[183083]: 2026-01-26 08:44:41.639 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:41.843 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:44:43 compute-1 podman[212260]: 2026-01-26 08:44:43.843206845 +0000 UTC m=+0.091929121 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Jan 26 08:44:43 compute-1 podman[212261]: 2026-01-26 08:44:43.8495934 +0000 UTC m=+0.093700907 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Jan 26 08:44:44 compute-1 nova_compute[183083]: 2026-01-26 08:44:44.245 183087 DEBUG oslo_concurrency.lockutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Acquiring lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:44:44 compute-1 nova_compute[183083]: 2026-01-26 08:44:44.246 183087 DEBUG oslo_concurrency.lockutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:44:44 compute-1 nova_compute[183083]: 2026-01-26 08:44:44.263 183087 DEBUG nova.compute.manager [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:44:44 compute-1 nova_compute[183083]: 2026-01-26 08:44:44.323 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:44 compute-1 nova_compute[183083]: 2026-01-26 08:44:44.347 183087 DEBUG oslo_concurrency.lockutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:44:44 compute-1 nova_compute[183083]: 2026-01-26 08:44:44.348 183087 DEBUG oslo_concurrency.lockutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:44:44 compute-1 nova_compute[183083]: 2026-01-26 08:44:44.360 183087 DEBUG nova.virt.hardware [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:44:44 compute-1 nova_compute[183083]: 2026-01-26 08:44:44.360 183087 INFO nova.compute.claims [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:44:44 compute-1 nova_compute[183083]: 2026-01-26 08:44:44.489 183087 DEBUG nova.compute.provider_tree [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:44:44 compute-1 nova_compute[183083]: 2026-01-26 08:44:44.508 183087 DEBUG nova.scheduler.client.report [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 0, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:44:44 compute-1 nova_compute[183083]: 2026-01-26 08:44:44.546 183087 DEBUG oslo_concurrency.lockutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:44:44 compute-1 nova_compute[183083]: 2026-01-26 08:44:44.547 183087 DEBUG nova.compute.manager [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:44:44 compute-1 nova_compute[183083]: 2026-01-26 08:44:44.608 183087 DEBUG nova.compute.manager [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:44:44 compute-1 nova_compute[183083]: 2026-01-26 08:44:44.609 183087 DEBUG nova.network.neutron [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:44:44 compute-1 nova_compute[183083]: 2026-01-26 08:44:44.634 183087 INFO nova.virt.libvirt.driver [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:44:44 compute-1 nova_compute[183083]: 2026-01-26 08:44:44.655 183087 DEBUG nova.compute.manager [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:44:44 compute-1 nova_compute[183083]: 2026-01-26 08:44:44.770 183087 DEBUG nova.compute.manager [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:44:44 compute-1 nova_compute[183083]: 2026-01-26 08:44:44.772 183087 DEBUG nova.virt.libvirt.driver [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:44:44 compute-1 nova_compute[183083]: 2026-01-26 08:44:44.773 183087 INFO nova.virt.libvirt.driver [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Creating image(s)
Jan 26 08:44:44 compute-1 nova_compute[183083]: 2026-01-26 08:44:44.774 183087 DEBUG oslo_concurrency.lockutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Acquiring lock "/var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:44:44 compute-1 nova_compute[183083]: 2026-01-26 08:44:44.774 183087 DEBUG oslo_concurrency.lockutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Lock "/var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:44:44 compute-1 nova_compute[183083]: 2026-01-26 08:44:44.775 183087 DEBUG oslo_concurrency.lockutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Lock "/var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:44:44 compute-1 nova_compute[183083]: 2026-01-26 08:44:44.776 183087 DEBUG oslo_concurrency.lockutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Acquiring lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:44:44 compute-1 nova_compute[183083]: 2026-01-26 08:44:44.777 183087 DEBUG oslo_concurrency.lockutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:44:45 compute-1 nova_compute[183083]: 2026-01-26 08:44:45.255 183087 DEBUG nova.policy [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4ba65f88f5c349ff8443da8191c3da2b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '605ef5b310d9405faa10f9c8f78d897f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:44:45 compute-1 sshd-session[212300]: Accepted publickey for zuul from 38.102.83.66 port 54294 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 08:44:45 compute-1 systemd-logind[788]: New session 32 of user zuul.
Jan 26 08:44:45 compute-1 systemd[1]: Started Session 32 of User zuul.
Jan 26 08:44:45 compute-1 sshd-session[212300]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 08:44:45 compute-1 sshd-session[212303]: Connection closed by 38.102.83.66 port 54294
Jan 26 08:44:45 compute-1 sshd-session[212300]: pam_unix(sshd:session): session closed for user zuul
Jan 26 08:44:45 compute-1 systemd[1]: session-32.scope: Deactivated successfully.
Jan 26 08:44:45 compute-1 systemd-logind[788]: Session 32 logged out. Waiting for processes to exit.
Jan 26 08:44:45 compute-1 systemd-logind[788]: Removed session 32.
Jan 26 08:44:46 compute-1 sshd-session[212328]: Accepted publickey for zuul from 38.102.83.66 port 54304 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 08:44:46 compute-1 systemd-logind[788]: New session 33 of user zuul.
Jan 26 08:44:46 compute-1 systemd[1]: Started Session 33 of User zuul.
Jan 26 08:44:46 compute-1 sshd-session[212328]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 08:44:46 compute-1 sshd-session[212331]: Connection closed by 38.102.83.66 port 54304
Jan 26 08:44:46 compute-1 sshd-session[212328]: pam_unix(sshd:session): session closed for user zuul
Jan 26 08:44:46 compute-1 systemd[1]: session-33.scope: Deactivated successfully.
Jan 26 08:44:46 compute-1 systemd-logind[788]: Session 33 logged out. Waiting for processes to exit.
Jan 26 08:44:46 compute-1 systemd-logind[788]: Removed session 33.
Jan 26 08:44:46 compute-1 nova_compute[183083]: 2026-01-26 08:44:46.507 183087 DEBUG oslo_concurrency.processutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:44:46 compute-1 nova_compute[183083]: 2026-01-26 08:44:46.604 183087 DEBUG oslo_concurrency.processutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52.part --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:44:46 compute-1 nova_compute[183083]: 2026-01-26 08:44:46.607 183087 DEBUG nova.virt.images [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] 13d1a20a-8003-4f19-aba7-ccbd9eff9b82 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Jan 26 08:44:46 compute-1 nova_compute[183083]: 2026-01-26 08:44:46.609 183087 DEBUG nova.privsep.utils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 26 08:44:46 compute-1 nova_compute[183083]: 2026-01-26 08:44:46.610 183087 DEBUG oslo_concurrency.processutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52.part /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:44:46 compute-1 nova_compute[183083]: 2026-01-26 08:44:46.644 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:46 compute-1 nova_compute[183083]: 2026-01-26 08:44:46.807 183087 DEBUG oslo_concurrency.processutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52.part /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52.converted" returned: 0 in 0.197s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:44:46 compute-1 nova_compute[183083]: 2026-01-26 08:44:46.817 183087 DEBUG oslo_concurrency.processutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:44:46 compute-1 nova_compute[183083]: 2026-01-26 08:44:46.901 183087 DEBUG oslo_concurrency.processutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52.converted --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:44:46 compute-1 nova_compute[183083]: 2026-01-26 08:44:46.902 183087 DEBUG oslo_concurrency.lockutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:44:46 compute-1 nova_compute[183083]: 2026-01-26 08:44:46.917 183087 INFO oslo.privsep.daemon [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpirmuf7mg/privsep.sock']
Jan 26 08:44:47 compute-1 nova_compute[183083]: 2026-01-26 08:44:47.105 183087 DEBUG nova.network.neutron [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Successfully created port: fe874242-d6f2-4922-8e79-b6545c0e8446 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 08:44:47 compute-1 nova_compute[183083]: 2026-01-26 08:44:47.622 183087 INFO oslo.privsep.daemon [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Spawned new privsep daemon via rootwrap
Jan 26 08:44:47 compute-1 nova_compute[183083]: 2026-01-26 08:44:47.486 212375 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 26 08:44:47 compute-1 nova_compute[183083]: 2026-01-26 08:44:47.490 212375 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 26 08:44:47 compute-1 nova_compute[183083]: 2026-01-26 08:44:47.493 212375 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Jan 26 08:44:47 compute-1 nova_compute[183083]: 2026-01-26 08:44:47.494 212375 INFO oslo.privsep.daemon [-] privsep daemon running as pid 212375
Jan 26 08:44:47 compute-1 nova_compute[183083]: 2026-01-26 08:44:47.694 183087 DEBUG oslo_concurrency.processutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:44:47 compute-1 nova_compute[183083]: 2026-01-26 08:44:47.746 183087 DEBUG oslo_concurrency.processutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:44:47 compute-1 nova_compute[183083]: 2026-01-26 08:44:47.747 183087 DEBUG oslo_concurrency.lockutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Acquiring lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:44:47 compute-1 nova_compute[183083]: 2026-01-26 08:44:47.748 183087 DEBUG oslo_concurrency.lockutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:44:47 compute-1 nova_compute[183083]: 2026-01-26 08:44:47.758 183087 DEBUG oslo_concurrency.processutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:44:47 compute-1 nova_compute[183083]: 2026-01-26 08:44:47.813 183087 DEBUG oslo_concurrency.processutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:44:47 compute-1 nova_compute[183083]: 2026-01-26 08:44:47.815 183087 DEBUG oslo_concurrency.processutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:44:47 compute-1 nova_compute[183083]: 2026-01-26 08:44:47.867 183087 DEBUG oslo_concurrency.processutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/disk 1073741824" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:44:47 compute-1 nova_compute[183083]: 2026-01-26 08:44:47.869 183087 DEBUG oslo_concurrency.lockutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:44:47 compute-1 nova_compute[183083]: 2026-01-26 08:44:47.870 183087 DEBUG oslo_concurrency.processutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:44:47 compute-1 nova_compute[183083]: 2026-01-26 08:44:47.944 183087 DEBUG oslo_concurrency.processutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:44:47 compute-1 nova_compute[183083]: 2026-01-26 08:44:47.946 183087 DEBUG nova.virt.disk.api [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Checking if we can resize image /var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 26 08:44:47 compute-1 nova_compute[183083]: 2026-01-26 08:44:47.947 183087 DEBUG oslo_concurrency.processutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:44:48 compute-1 nova_compute[183083]: 2026-01-26 08:44:48.011 183087 DEBUG oslo_concurrency.processutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:44:48 compute-1 nova_compute[183083]: 2026-01-26 08:44:48.013 183087 DEBUG nova.virt.disk.api [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Cannot resize image /var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 26 08:44:48 compute-1 nova_compute[183083]: 2026-01-26 08:44:48.014 183087 DEBUG nova.objects.instance [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Lazy-loading 'migration_context' on Instance uuid 52d0b676-cf9c-4840-8b66-74ca8b13e2af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:44:48 compute-1 nova_compute[183083]: 2026-01-26 08:44:48.035 183087 DEBUG nova.virt.libvirt.driver [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 26 08:44:48 compute-1 nova_compute[183083]: 2026-01-26 08:44:48.036 183087 DEBUG nova.virt.libvirt.driver [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Ensure instance console log exists: /var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 26 08:44:48 compute-1 nova_compute[183083]: 2026-01-26 08:44:48.037 183087 DEBUG oslo_concurrency.lockutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:44:48 compute-1 nova_compute[183083]: 2026-01-26 08:44:48.037 183087 DEBUG oslo_concurrency.lockutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:44:48 compute-1 nova_compute[183083]: 2026-01-26 08:44:48.038 183087 DEBUG oslo_concurrency.lockutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:44:49 compute-1 nova_compute[183083]: 2026-01-26 08:44:49.073 183087 DEBUG nova.network.neutron [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Successfully updated port: fe874242-d6f2-4922-8e79-b6545c0e8446 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:44:49 compute-1 nova_compute[183083]: 2026-01-26 08:44:49.090 183087 DEBUG oslo_concurrency.lockutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Acquiring lock "refresh_cache-52d0b676-cf9c-4840-8b66-74ca8b13e2af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:44:49 compute-1 nova_compute[183083]: 2026-01-26 08:44:49.090 183087 DEBUG oslo_concurrency.lockutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Acquired lock "refresh_cache-52d0b676-cf9c-4840-8b66-74ca8b13e2af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:44:49 compute-1 nova_compute[183083]: 2026-01-26 08:44:49.091 183087 DEBUG nova.network.neutron [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:44:49 compute-1 nova_compute[183083]: 2026-01-26 08:44:49.288 183087 DEBUG nova.compute.manager [req-b5f3dd93-7715-4149-a60c-ff71b47c2a23 req-9e46e12a-6b52-4938-b21b-7a642da3f41c 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Received event network-changed-fe874242-d6f2-4922-8e79-b6545c0e8446 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:44:49 compute-1 nova_compute[183083]: 2026-01-26 08:44:49.288 183087 DEBUG nova.compute.manager [req-b5f3dd93-7715-4149-a60c-ff71b47c2a23 req-9e46e12a-6b52-4938-b21b-7a642da3f41c 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Refreshing instance network info cache due to event network-changed-fe874242-d6f2-4922-8e79-b6545c0e8446. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:44:49 compute-1 nova_compute[183083]: 2026-01-26 08:44:49.288 183087 DEBUG oslo_concurrency.lockutils [req-b5f3dd93-7715-4149-a60c-ff71b47c2a23 req-9e46e12a-6b52-4938-b21b-7a642da3f41c 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-52d0b676-cf9c-4840-8b66-74ca8b13e2af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:44:49 compute-1 nova_compute[183083]: 2026-01-26 08:44:49.324 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:49 compute-1 nova_compute[183083]: 2026-01-26 08:44:49.453 183087 DEBUG nova.network.neutron [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.326 183087 DEBUG nova.network.neutron [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Updating instance_info_cache with network_info: [{"id": "fe874242-d6f2-4922-8e79-b6545c0e8446", "address": "fa:16:3e:48:29:f7", "network": {"id": "fe7e4448-8407-46f2-95a0-344b2f6ecfd7", "bridge": "br-int", "label": "tempest-test-network--1557689353", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "605ef5b310d9405faa10f9c8f78d897f", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe874242-d6", "ovs_interfaceid": "fe874242-d6f2-4922-8e79-b6545c0e8446", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.346 183087 DEBUG oslo_concurrency.lockutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Releasing lock "refresh_cache-52d0b676-cf9c-4840-8b66-74ca8b13e2af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.347 183087 DEBUG nova.compute.manager [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Instance network_info: |[{"id": "fe874242-d6f2-4922-8e79-b6545c0e8446", "address": "fa:16:3e:48:29:f7", "network": {"id": "fe7e4448-8407-46f2-95a0-344b2f6ecfd7", "bridge": "br-int", "label": "tempest-test-network--1557689353", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "605ef5b310d9405faa10f9c8f78d897f", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe874242-d6", "ovs_interfaceid": "fe874242-d6f2-4922-8e79-b6545c0e8446", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.347 183087 DEBUG oslo_concurrency.lockutils [req-b5f3dd93-7715-4149-a60c-ff71b47c2a23 req-9e46e12a-6b52-4938-b21b-7a642da3f41c 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-52d0b676-cf9c-4840-8b66-74ca8b13e2af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.348 183087 DEBUG nova.network.neutron [req-b5f3dd93-7715-4149-a60c-ff71b47c2a23 req-9e46e12a-6b52-4938-b21b-7a642da3f41c 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Refreshing network info cache for port fe874242-d6f2-4922-8e79-b6545c0e8446 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.353 183087 DEBUG nova.virt.libvirt.driver [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Start _get_guest_xml network_info=[{"id": "fe874242-d6f2-4922-8e79-b6545c0e8446", "address": "fa:16:3e:48:29:f7", "network": {"id": "fe7e4448-8407-46f2-95a0-344b2f6ecfd7", "bridge": "br-int", "label": "tempest-test-network--1557689353", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "605ef5b310d9405faa10f9c8f78d897f", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe874242-d6", "ovs_interfaceid": "fe874242-d6f2-4922-8e79-b6545c0e8446", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T08:43:05Z,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aa88264c487a43b2855a58eb0dd042c9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T08:43:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encryption_options': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.361 183087 WARNING nova.virt.libvirt.driver [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.366 183087 DEBUG nova.virt.libvirt.host [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.367 183087 DEBUG nova.virt.libvirt.host [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.380 183087 DEBUG nova.virt.libvirt.host [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.381 183087 DEBUG nova.virt.libvirt.host [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.382 183087 DEBUG nova.virt.libvirt.driver [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.382 183087 DEBUG nova.virt.hardware [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T08:43:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a7017323-cd35-470d-a718-faa6c6e97277',id=2,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T08:43:05Z,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aa88264c487a43b2855a58eb0dd042c9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T08:43:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.383 183087 DEBUG nova.virt.hardware [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.384 183087 DEBUG nova.virt.hardware [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.384 183087 DEBUG nova.virt.hardware [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.384 183087 DEBUG nova.virt.hardware [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.385 183087 DEBUG nova.virt.hardware [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.385 183087 DEBUG nova.virt.hardware [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.386 183087 DEBUG nova.virt.hardware [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.386 183087 DEBUG nova.virt.hardware [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.387 183087 DEBUG nova.virt.hardware [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.387 183087 DEBUG nova.virt.hardware [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.392 183087 DEBUG nova.privsep.utils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.394 183087 DEBUG nova.virt.libvirt.vif [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:44:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1487231681',display_name='tempest-server-test-1487231681',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1487231681',id=9,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKwpz9wNHHnbJAOOiUppbeyhm9nehnF1Htd8OXU0NdYnRfrosih4iG9UOQxrdGQsm8346olWW9k6G/UQO8gqKcXblbCmZkMf68uZMMkWh2h2m7GZ6H2smfZpLCUdxJAWPg==',key_name='tempest-keypair-test-84578535',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='605ef5b310d9405faa10f9c8f78d897f',ramdisk_id='',reservation_id='r-e8sz2804',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='vi
rtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NetworkBasicTest-1834146433',owner_user_name='tempest-NetworkBasicTest-1834146433-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:44:44Z,user_data=None,user_id='4ba65f88f5c349ff8443da8191c3da2b',uuid=52d0b676-cf9c-4840-8b66-74ca8b13e2af,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fe874242-d6f2-4922-8e79-b6545c0e8446", "address": "fa:16:3e:48:29:f7", "network": {"id": "fe7e4448-8407-46f2-95a0-344b2f6ecfd7", "bridge": "br-int", "label": "tempest-test-network--1557689353", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "605ef5b310d9405faa10f9c8f78d897f", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe874242-d6", "ovs_interfaceid": "fe874242-d6f2-4922-8e79-b6545c0e8446", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.395 183087 DEBUG nova.network.os_vif_util [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Converting VIF {"id": "fe874242-d6f2-4922-8e79-b6545c0e8446", "address": "fa:16:3e:48:29:f7", "network": {"id": "fe7e4448-8407-46f2-95a0-344b2f6ecfd7", "bridge": "br-int", "label": "tempest-test-network--1557689353", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "605ef5b310d9405faa10f9c8f78d897f", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe874242-d6", "ovs_interfaceid": "fe874242-d6f2-4922-8e79-b6545c0e8446", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.396 183087 DEBUG nova.network.os_vif_util [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:48:29:f7,bridge_name='br-int',has_traffic_filtering=True,id=fe874242-d6f2-4922-8e79-b6545c0e8446,network=Network(fe7e4448-8407-46f2-95a0-344b2f6ecfd7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe874242-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.399 183087 DEBUG nova.objects.instance [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Lazy-loading 'pci_devices' on Instance uuid 52d0b676-cf9c-4840-8b66-74ca8b13e2af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.426 183087 DEBUG nova.virt.libvirt.driver [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] End _get_guest_xml xml=<domain type="kvm">
Jan 26 08:44:50 compute-1 nova_compute[183083]:   <uuid>52d0b676-cf9c-4840-8b66-74ca8b13e2af</uuid>
Jan 26 08:44:50 compute-1 nova_compute[183083]:   <name>instance-00000009</name>
Jan 26 08:44:50 compute-1 nova_compute[183083]:   <memory>131072</memory>
Jan 26 08:44:50 compute-1 nova_compute[183083]:   <vcpu>1</vcpu>
Jan 26 08:44:50 compute-1 nova_compute[183083]:   <metadata>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 08:44:50 compute-1 nova_compute[183083]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:       <nova:name>tempest-server-test-1487231681</nova:name>
Jan 26 08:44:50 compute-1 nova_compute[183083]:       <nova:creationTime>2026-01-26 08:44:50</nova:creationTime>
Jan 26 08:44:50 compute-1 nova_compute[183083]:       <nova:flavor name="m1.nano">
Jan 26 08:44:50 compute-1 nova_compute[183083]:         <nova:memory>128</nova:memory>
Jan 26 08:44:50 compute-1 nova_compute[183083]:         <nova:disk>1</nova:disk>
Jan 26 08:44:50 compute-1 nova_compute[183083]:         <nova:swap>0</nova:swap>
Jan 26 08:44:50 compute-1 nova_compute[183083]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 08:44:50 compute-1 nova_compute[183083]:         <nova:vcpus>1</nova:vcpus>
Jan 26 08:44:50 compute-1 nova_compute[183083]:       </nova:flavor>
Jan 26 08:44:50 compute-1 nova_compute[183083]:       <nova:owner>
Jan 26 08:44:50 compute-1 nova_compute[183083]:         <nova:user uuid="4ba65f88f5c349ff8443da8191c3da2b">tempest-NetworkBasicTest-1834146433-project-member</nova:user>
Jan 26 08:44:50 compute-1 nova_compute[183083]:         <nova:project uuid="605ef5b310d9405faa10f9c8f78d897f">tempest-NetworkBasicTest-1834146433</nova:project>
Jan 26 08:44:50 compute-1 nova_compute[183083]:       </nova:owner>
Jan 26 08:44:50 compute-1 nova_compute[183083]:       <nova:root type="image" uuid="13d1a20a-8003-4f19-aba7-ccbd9eff9b82"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:       <nova:ports>
Jan 26 08:44:50 compute-1 nova_compute[183083]:         <nova:port uuid="fe874242-d6f2-4922-8e79-b6545c0e8446">
Jan 26 08:44:50 compute-1 nova_compute[183083]:           <nova:ip type="fixed" address="10.100.0.24" ipVersion="4"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:         </nova:port>
Jan 26 08:44:50 compute-1 nova_compute[183083]:       </nova:ports>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     </nova:instance>
Jan 26 08:44:50 compute-1 nova_compute[183083]:   </metadata>
Jan 26 08:44:50 compute-1 nova_compute[183083]:   <sysinfo type="smbios">
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <system>
Jan 26 08:44:50 compute-1 nova_compute[183083]:       <entry name="manufacturer">RDO</entry>
Jan 26 08:44:50 compute-1 nova_compute[183083]:       <entry name="product">OpenStack Compute</entry>
Jan 26 08:44:50 compute-1 nova_compute[183083]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 08:44:50 compute-1 nova_compute[183083]:       <entry name="serial">52d0b676-cf9c-4840-8b66-74ca8b13e2af</entry>
Jan 26 08:44:50 compute-1 nova_compute[183083]:       <entry name="uuid">52d0b676-cf9c-4840-8b66-74ca8b13e2af</entry>
Jan 26 08:44:50 compute-1 nova_compute[183083]:       <entry name="family">Virtual Machine</entry>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     </system>
Jan 26 08:44:50 compute-1 nova_compute[183083]:   </sysinfo>
Jan 26 08:44:50 compute-1 nova_compute[183083]:   <os>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <boot dev="hd"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <smbios mode="sysinfo"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:   </os>
Jan 26 08:44:50 compute-1 nova_compute[183083]:   <features>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <acpi/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <apic/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <vmcoreinfo/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:   </features>
Jan 26 08:44:50 compute-1 nova_compute[183083]:   <clock offset="utc">
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <timer name="hpet" present="no"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:   </clock>
Jan 26 08:44:50 compute-1 nova_compute[183083]:   <cpu mode="host-model" match="exact">
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:   </cpu>
Jan 26 08:44:50 compute-1 nova_compute[183083]:   <devices>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <disk type="file" device="disk">
Jan 26 08:44:50 compute-1 nova_compute[183083]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/disk"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:       <target dev="vda" bus="virtio"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     </disk>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <disk type="file" device="cdrom">
Jan 26 08:44:50 compute-1 nova_compute[183083]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/disk.config"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:       <target dev="sda" bus="sata"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     </disk>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <interface type="ethernet">
Jan 26 08:44:50 compute-1 nova_compute[183083]:       <mac address="fa:16:3e:48:29:f7"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:       <mtu size="1342"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:       <target dev="tapfe874242-d6"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     </interface>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <serial type="pty">
Jan 26 08:44:50 compute-1 nova_compute[183083]:       <log file="/var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/console.log" append="off"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     </serial>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <video>
Jan 26 08:44:50 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     </video>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <input type="tablet" bus="usb"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <rng model="virtio">
Jan 26 08:44:50 compute-1 nova_compute[183083]:       <backend model="random">/dev/urandom</backend>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     </rng>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <controller type="usb" index="0"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     <memballoon model="virtio">
Jan 26 08:44:50 compute-1 nova_compute[183083]:       <stats period="10"/>
Jan 26 08:44:50 compute-1 nova_compute[183083]:     </memballoon>
Jan 26 08:44:50 compute-1 nova_compute[183083]:   </devices>
Jan 26 08:44:50 compute-1 nova_compute[183083]: </domain>
Jan 26 08:44:50 compute-1 nova_compute[183083]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.427 183087 DEBUG nova.compute.manager [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Preparing to wait for external event network-vif-plugged-fe874242-d6f2-4922-8e79-b6545c0e8446 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.427 183087 DEBUG oslo_concurrency.lockutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Acquiring lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.428 183087 DEBUG oslo_concurrency.lockutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.428 183087 DEBUG oslo_concurrency.lockutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.429 183087 DEBUG nova.virt.libvirt.vif [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:44:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1487231681',display_name='tempest-server-test-1487231681',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1487231681',id=9,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKwpz9wNHHnbJAOOiUppbeyhm9nehnF1Htd8OXU0NdYnRfrosih4iG9UOQxrdGQsm8346olWW9k6G/UQO8gqKcXblbCmZkMf68uZMMkWh2h2m7GZ6H2smfZpLCUdxJAWPg==',key_name='tempest-keypair-test-84578535',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='605ef5b310d9405faa10f9c8f78d897f',ramdisk_id='',reservation_id='r-e8sz2804',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NetworkBasicTest-1834146433',owner_user_name='tempest-NetworkBasicTest-1834146433-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:44:44Z,user_data=None,user_id='4ba65f88f5c349ff8443da8191c3da2b',uuid=52d0b676-cf9c-4840-8b66-74ca8b13e2af,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fe874242-d6f2-4922-8e79-b6545c0e8446", "address": "fa:16:3e:48:29:f7", "network": {"id": "fe7e4448-8407-46f2-95a0-344b2f6ecfd7", "bridge": "br-int", "label": "tempest-test-network--1557689353", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "605ef5b310d9405faa10f9c8f78d897f", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe874242-d6", "ovs_interfaceid": "fe874242-d6f2-4922-8e79-b6545c0e8446", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.429 183087 DEBUG nova.network.os_vif_util [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Converting VIF {"id": "fe874242-d6f2-4922-8e79-b6545c0e8446", "address": "fa:16:3e:48:29:f7", "network": {"id": "fe7e4448-8407-46f2-95a0-344b2f6ecfd7", "bridge": "br-int", "label": "tempest-test-network--1557689353", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "605ef5b310d9405faa10f9c8f78d897f", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe874242-d6", "ovs_interfaceid": "fe874242-d6f2-4922-8e79-b6545c0e8446", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.430 183087 DEBUG nova.network.os_vif_util [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:48:29:f7,bridge_name='br-int',has_traffic_filtering=True,id=fe874242-d6f2-4922-8e79-b6545c0e8446,network=Network(fe7e4448-8407-46f2-95a0-344b2f6ecfd7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe874242-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.431 183087 DEBUG os_vif [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:29:f7,bridge_name='br-int',has_traffic_filtering=True,id=fe874242-d6f2-4922-8e79-b6545c0e8446,network=Network(fe7e4448-8407-46f2-95a0-344b2f6ecfd7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe874242-d6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.431 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.432 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.432 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.435 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.435 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfe874242-d6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.436 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfe874242-d6, col_values=(('external_ids', {'iface-id': 'fe874242-d6f2-4922-8e79-b6545c0e8446', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:48:29:f7', 'vm-uuid': '52d0b676-cf9c-4840-8b66-74ca8b13e2af'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:44:50 compute-1 NetworkManager[55451]: <info>  [1769417090.4391] manager: (tapfe874242-d6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.440 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.447 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.450 183087 INFO os_vif [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:29:f7,bridge_name='br-int',has_traffic_filtering=True,id=fe874242-d6f2-4922-8e79-b6545c0e8446,network=Network(fe7e4448-8407-46f2-95a0-344b2f6ecfd7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe874242-d6')
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.514 183087 DEBUG nova.virt.libvirt.driver [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.515 183087 DEBUG nova.virt.libvirt.driver [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.516 183087 DEBUG nova.virt.libvirt.driver [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] No VIF found with MAC fa:16:3e:48:29:f7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 26 08:44:50 compute-1 nova_compute[183083]: 2026-01-26 08:44:50.517 183087 INFO nova.virt.libvirt.driver [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Using config drive
Jan 26 08:44:50 compute-1 podman[212395]: 2026-01-26 08:44:50.815420663 +0000 UTC m=+0.069586565 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 08:44:50 compute-1 podman[212394]: 2026-01-26 08:44:50.926255411 +0000 UTC m=+0.181877601 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 26 08:44:51 compute-1 nova_compute[183083]: 2026-01-26 08:44:51.173 183087 INFO nova.virt.libvirt.driver [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Creating config drive at /var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/disk.config
Jan 26 08:44:51 compute-1 nova_compute[183083]: 2026-01-26 08:44:51.182 183087 DEBUG oslo_concurrency.processutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvg4mu2ud execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:44:51 compute-1 nova_compute[183083]: 2026-01-26 08:44:51.322 183087 DEBUG oslo_concurrency.processutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvg4mu2ud" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:44:51 compute-1 kernel: tun: Universal TUN/TAP device driver, 1.6
Jan 26 08:44:51 compute-1 kernel: tapfe874242-d6: entered promiscuous mode
Jan 26 08:44:51 compute-1 NetworkManager[55451]: <info>  [1769417091.4366] manager: (tapfe874242-d6): new Tun device (/org/freedesktop/NetworkManager/Devices/21)
Jan 26 08:44:51 compute-1 ovn_controller[95352]: 2026-01-26T08:44:51Z|00038|binding|INFO|Claiming lport fe874242-d6f2-4922-8e79-b6545c0e8446 for this chassis.
Jan 26 08:44:51 compute-1 ovn_controller[95352]: 2026-01-26T08:44:51Z|00039|binding|INFO|fe874242-d6f2-4922-8e79-b6545c0e8446: Claiming fa:16:3e:48:29:f7 10.100.0.24
Jan 26 08:44:51 compute-1 nova_compute[183083]: 2026-01-26 08:44:51.438 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:51 compute-1 nova_compute[183083]: 2026-01-26 08:44:51.446 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:51 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:51.463 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:48:29:f7 10.100.0.24'], port_security=['fa:16:3e:48:29:f7 10.100.0.24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.24/28', 'neutron:device_id': '52d0b676-cf9c-4840-8b66-74ca8b13e2af', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fe7e4448-8407-46f2-95a0-344b2f6ecfd7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '605ef5b310d9405faa10f9c8f78d897f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '18ebfc50-0648-455b-8cfa-279844a56dcf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=70edddae-db7d-4a4d-8983-c3a06c961ec1, chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=fe874242-d6f2-4922-8e79-b6545c0e8446) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:44:51 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:51.465 104632 INFO neutron.agent.ovn.metadata.agent [-] Port fe874242-d6f2-4922-8e79-b6545c0e8446 in datapath fe7e4448-8407-46f2-95a0-344b2f6ecfd7 bound to our chassis
Jan 26 08:44:51 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:51.469 104632 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fe7e4448-8407-46f2-95a0-344b2f6ecfd7
Jan 26 08:44:51 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:51.471 104632 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpufi5_jz6/privsep.sock']
Jan 26 08:44:51 compute-1 systemd-udevd[212466]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 08:44:51 compute-1 NetworkManager[55451]: <info>  [1769417091.5114] device (tapfe874242-d6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 08:44:51 compute-1 NetworkManager[55451]: <info>  [1769417091.5126] device (tapfe874242-d6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 08:44:51 compute-1 systemd-machined[154360]: New machine qemu-1-instance-00000009.
Jan 26 08:44:51 compute-1 nova_compute[183083]: 2026-01-26 08:44:51.539 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:51 compute-1 ovn_controller[95352]: 2026-01-26T08:44:51Z|00040|binding|INFO|Setting lport fe874242-d6f2-4922-8e79-b6545c0e8446 ovn-installed in OVS
Jan 26 08:44:51 compute-1 ovn_controller[95352]: 2026-01-26T08:44:51Z|00041|binding|INFO|Setting lport fe874242-d6f2-4922-8e79-b6545c0e8446 up in Southbound
Jan 26 08:44:51 compute-1 nova_compute[183083]: 2026-01-26 08:44:51.545 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:51 compute-1 systemd[1]: Started Virtual Machine qemu-1-instance-00000009.
Jan 26 08:44:51 compute-1 nova_compute[183083]: 2026-01-26 08:44:51.895 183087 DEBUG nova.network.neutron [req-b5f3dd93-7715-4149-a60c-ff71b47c2a23 req-9e46e12a-6b52-4938-b21b-7a642da3f41c 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Updated VIF entry in instance network info cache for port fe874242-d6f2-4922-8e79-b6545c0e8446. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:44:51 compute-1 nova_compute[183083]: 2026-01-26 08:44:51.895 183087 DEBUG nova.network.neutron [req-b5f3dd93-7715-4149-a60c-ff71b47c2a23 req-9e46e12a-6b52-4938-b21b-7a642da3f41c 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Updating instance_info_cache with network_info: [{"id": "fe874242-d6f2-4922-8e79-b6545c0e8446", "address": "fa:16:3e:48:29:f7", "network": {"id": "fe7e4448-8407-46f2-95a0-344b2f6ecfd7", "bridge": "br-int", "label": "tempest-test-network--1557689353", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "605ef5b310d9405faa10f9c8f78d897f", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe874242-d6", "ovs_interfaceid": "fe874242-d6f2-4922-8e79-b6545c0e8446", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:44:51 compute-1 nova_compute[183083]: 2026-01-26 08:44:51.910 183087 DEBUG oslo_concurrency.lockutils [req-b5f3dd93-7715-4149-a60c-ff71b47c2a23 req-9e46e12a-6b52-4938-b21b-7a642da3f41c 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-52d0b676-cf9c-4840-8b66-74ca8b13e2af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:44:52 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:52.173 104632 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 26 08:44:52 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:52.175 104632 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpufi5_jz6/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 26 08:44:52 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:52.068 212483 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 26 08:44:52 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:52.073 212483 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 26 08:44:52 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:52.075 212483 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Jan 26 08:44:52 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:52.076 212483 INFO oslo.privsep.daemon [-] privsep daemon running as pid 212483
Jan 26 08:44:52 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:52.178 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[83a05d00-a8a6-4f4c-af20-2bf324fc8745]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:44:52 compute-1 nova_compute[183083]: 2026-01-26 08:44:52.370 183087 DEBUG nova.compute.manager [req-7e34ccfa-54cb-43ab-bde7-1fe0b4f46d0f req-b3bc206c-60f9-4aff-b7ae-24439e4fb6e3 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Received event network-vif-plugged-fe874242-d6f2-4922-8e79-b6545c0e8446 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:44:52 compute-1 nova_compute[183083]: 2026-01-26 08:44:52.370 183087 DEBUG oslo_concurrency.lockutils [req-7e34ccfa-54cb-43ab-bde7-1fe0b4f46d0f req-b3bc206c-60f9-4aff-b7ae-24439e4fb6e3 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:44:52 compute-1 nova_compute[183083]: 2026-01-26 08:44:52.371 183087 DEBUG oslo_concurrency.lockutils [req-7e34ccfa-54cb-43ab-bde7-1fe0b4f46d0f req-b3bc206c-60f9-4aff-b7ae-24439e4fb6e3 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:44:52 compute-1 nova_compute[183083]: 2026-01-26 08:44:52.372 183087 DEBUG oslo_concurrency.lockutils [req-7e34ccfa-54cb-43ab-bde7-1fe0b4f46d0f req-b3bc206c-60f9-4aff-b7ae-24439e4fb6e3 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:44:52 compute-1 nova_compute[183083]: 2026-01-26 08:44:52.372 183087 DEBUG nova.compute.manager [req-7e34ccfa-54cb-43ab-bde7-1fe0b4f46d0f req-b3bc206c-60f9-4aff-b7ae-24439e4fb6e3 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Processing event network-vif-plugged-fe874242-d6f2-4922-8e79-b6545c0e8446 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 26 08:44:52 compute-1 nova_compute[183083]: 2026-01-26 08:44:52.378 183087 DEBUG nova.compute.manager [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 26 08:44:52 compute-1 nova_compute[183083]: 2026-01-26 08:44:52.380 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417092.3782916, 52d0b676-cf9c-4840-8b66-74ca8b13e2af => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:44:52 compute-1 nova_compute[183083]: 2026-01-26 08:44:52.380 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] VM Started (Lifecycle Event)
Jan 26 08:44:52 compute-1 nova_compute[183083]: 2026-01-26 08:44:52.384 183087 DEBUG nova.virt.libvirt.driver [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 26 08:44:52 compute-1 nova_compute[183083]: 2026-01-26 08:44:52.388 183087 INFO nova.virt.libvirt.driver [-] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Instance spawned successfully.
Jan 26 08:44:52 compute-1 nova_compute[183083]: 2026-01-26 08:44:52.388 183087 DEBUG nova.virt.libvirt.driver [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 26 08:44:52 compute-1 nova_compute[183083]: 2026-01-26 08:44:52.416 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:44:52 compute-1 nova_compute[183083]: 2026-01-26 08:44:52.423 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 08:44:52 compute-1 nova_compute[183083]: 2026-01-26 08:44:52.430 183087 DEBUG nova.virt.libvirt.driver [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:44:52 compute-1 nova_compute[183083]: 2026-01-26 08:44:52.435 183087 DEBUG nova.virt.libvirt.driver [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:44:52 compute-1 nova_compute[183083]: 2026-01-26 08:44:52.437 183087 DEBUG nova.virt.libvirt.driver [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:44:52 compute-1 nova_compute[183083]: 2026-01-26 08:44:52.438 183087 DEBUG nova.virt.libvirt.driver [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:44:52 compute-1 nova_compute[183083]: 2026-01-26 08:44:52.439 183087 DEBUG nova.virt.libvirt.driver [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:44:52 compute-1 nova_compute[183083]: 2026-01-26 08:44:52.440 183087 DEBUG nova.virt.libvirt.driver [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:44:52 compute-1 nova_compute[183083]: 2026-01-26 08:44:52.453 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 08:44:52 compute-1 nova_compute[183083]: 2026-01-26 08:44:52.454 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417092.3784554, 52d0b676-cf9c-4840-8b66-74ca8b13e2af => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:44:52 compute-1 nova_compute[183083]: 2026-01-26 08:44:52.455 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] VM Paused (Lifecycle Event)
Jan 26 08:44:52 compute-1 nova_compute[183083]: 2026-01-26 08:44:52.472 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:44:52 compute-1 nova_compute[183083]: 2026-01-26 08:44:52.476 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417092.3839142, 52d0b676-cf9c-4840-8b66-74ca8b13e2af => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:44:52 compute-1 nova_compute[183083]: 2026-01-26 08:44:52.477 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] VM Resumed (Lifecycle Event)
Jan 26 08:44:52 compute-1 nova_compute[183083]: 2026-01-26 08:44:52.504 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:44:52 compute-1 nova_compute[183083]: 2026-01-26 08:44:52.507 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 08:44:52 compute-1 nova_compute[183083]: 2026-01-26 08:44:52.552 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 08:44:52 compute-1 nova_compute[183083]: 2026-01-26 08:44:52.562 183087 INFO nova.compute.manager [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Took 7.79 seconds to spawn the instance on the hypervisor.
Jan 26 08:44:52 compute-1 nova_compute[183083]: 2026-01-26 08:44:52.563 183087 DEBUG nova.compute.manager [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:44:52 compute-1 nova_compute[183083]: 2026-01-26 08:44:52.627 183087 INFO nova.compute.manager [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Took 8.32 seconds to build instance.
Jan 26 08:44:52 compute-1 nova_compute[183083]: 2026-01-26 08:44:52.651 183087 DEBUG oslo_concurrency.lockutils [None req-6ce86c09-e35b-41cf-a36c-41215e37d262 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.405s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:44:52 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:52.681 212483 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:44:52 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:52.681 212483 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:44:52 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:52.681 212483 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:44:52 compute-1 podman[212495]: 2026-01-26 08:44:52.824728523 +0000 UTC m=+0.068258820 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 26 08:44:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:53.207 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[d2567f97-17ba-43c0-b068-d0f051a87721]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:44:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:53.208 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfe7e4448-81 in ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 26 08:44:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:53.210 212483 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfe7e4448-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 26 08:44:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:53.211 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[f65e4414-bcce-4820-95a0-714f5cde5bd5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:44:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:53.214 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[29d46ebe-8807-4671-9e91-6f554bf1747b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:44:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:53.246 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[49a0266f-e21d-4821-bbfa-836403630487]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:44:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:53.272 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[a29e1d57-4b46-413e-859f-406c209c7a2f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:44:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:53.274 104632 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmprd2aagoo/privsep.sock']
Jan 26 08:44:54 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:54.152 104632 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 26 08:44:54 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:54.153 104632 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmprd2aagoo/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 26 08:44:54 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:53.968 212523 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 26 08:44:54 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:53.996 212523 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 26 08:44:54 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:53.999 212523 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Jan 26 08:44:54 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:53.999 212523 INFO oslo.privsep.daemon [-] privsep daemon running as pid 212523
Jan 26 08:44:54 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:54.157 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[5242ed5e-ca58-4930-bf53-51b78cf1cf3f]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:44:54 compute-1 nova_compute[183083]: 2026-01-26 08:44:54.366 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:54 compute-1 nova_compute[183083]: 2026-01-26 08:44:54.374 183087 INFO nova.compute.manager [None req-12c4d838-cb02-46fe-aea4-4876bd9920e2 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Get console output
Jan 26 08:44:54 compute-1 nova_compute[183083]: 2026-01-26 08:44:54.480 183087 DEBUG nova.compute.manager [req-ff97cfaa-be4e-47fe-bd8c-c2f173d25b35 req-25d392f2-c0c1-4521-b68c-d2eb4254bc4a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Received event network-vif-plugged-fe874242-d6f2-4922-8e79-b6545c0e8446 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:44:54 compute-1 nova_compute[183083]: 2026-01-26 08:44:54.481 183087 DEBUG oslo_concurrency.lockutils [req-ff97cfaa-be4e-47fe-bd8c-c2f173d25b35 req-25d392f2-c0c1-4521-b68c-d2eb4254bc4a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:44:54 compute-1 nova_compute[183083]: 2026-01-26 08:44:54.482 183087 DEBUG oslo_concurrency.lockutils [req-ff97cfaa-be4e-47fe-bd8c-c2f173d25b35 req-25d392f2-c0c1-4521-b68c-d2eb4254bc4a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:44:54 compute-1 nova_compute[183083]: 2026-01-26 08:44:54.482 183087 DEBUG oslo_concurrency.lockutils [req-ff97cfaa-be4e-47fe-bd8c-c2f173d25b35 req-25d392f2-c0c1-4521-b68c-d2eb4254bc4a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:44:54 compute-1 nova_compute[183083]: 2026-01-26 08:44:54.483 183087 DEBUG nova.compute.manager [req-ff97cfaa-be4e-47fe-bd8c-c2f173d25b35 req-25d392f2-c0c1-4521-b68c-d2eb4254bc4a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] No waiting events found dispatching network-vif-plugged-fe874242-d6f2-4922-8e79-b6545c0e8446 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:44:54 compute-1 nova_compute[183083]: 2026-01-26 08:44:54.483 183087 WARNING nova.compute.manager [req-ff97cfaa-be4e-47fe-bd8c-c2f173d25b35 req-25d392f2-c0c1-4521-b68c-d2eb4254bc4a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Received unexpected event network-vif-plugged-fe874242-d6f2-4922-8e79-b6545c0e8446 for instance with vm_state active and task_state None.
Jan 26 08:44:54 compute-1 nova_compute[183083]: 2026-01-26 08:44:54.499 212375 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 26 08:44:54 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:54.700 212523 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:44:54 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:54.700 212523 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:44:54 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:54.700 212523 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:55.346 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[646765bf-cd1e-4662-adf7-d0b70d6acd60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:44:55 compute-1 NetworkManager[55451]: <info>  [1769417095.3726] manager: (tapfe7e4448-80): new Veth device (/org/freedesktop/NetworkManager/Devices/22)
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:55.374 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[a73d0a51-d1f0-46ab-a7ad-f3db34bca8ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:55.424 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[2a35077b-c84d-4761-98eb-9d407496a75f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:44:55 compute-1 systemd-udevd[212535]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:55.429 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[b0939ee5-6f23-4d39-9b7c-d30a6560cbe8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:44:55 compute-1 NetworkManager[55451]: <info>  [1769417095.4722] device (tapfe7e4448-80): carrier: link connected
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:55.476 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[111ff1a0-b24f-4283-a4bf-fee68eaae59d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:44:55 compute-1 nova_compute[183083]: 2026-01-26 08:44:55.487 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:55.503 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[35b8a5f7-831b-48ec-9212-7eece22210ed]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfe7e4448-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:af:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 343607, 'reachable_time': 21720, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212553, 'error': None, 'target': 'ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:55.534 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[e8160f9a-ed01-45ff-a8d8-37823597bbf5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed1:aff4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 343607, 'tstamp': 343607}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212554, 'error': None, 'target': 'ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:55.554 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[47fb6791-b514-4327-8a89-280ed4fe9c2c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfe7e4448-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:af:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 343607, 'reachable_time': 21720, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212555, 'error': None, 'target': 'ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:55.593 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[97be6731-2478-4a50-b105-1d44e32e18f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:55.660 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[2001304c-c756-478c-9e39-58e816ee16fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:55.662 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe7e4448-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:55.663 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:55.663 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfe7e4448-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:44:55 compute-1 NetworkManager[55451]: <info>  [1769417095.6662] manager: (tapfe7e4448-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Jan 26 08:44:55 compute-1 nova_compute[183083]: 2026-01-26 08:44:55.665 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:55 compute-1 kernel: tapfe7e4448-80: entered promiscuous mode
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:55.670 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfe7e4448-80, col_values=(('external_ids', {'iface-id': 'cca059f9-68bd-4ebe-b4a1-3e95ff36d483'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:44:55 compute-1 nova_compute[183083]: 2026-01-26 08:44:55.669 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:55 compute-1 nova_compute[183083]: 2026-01-26 08:44:55.672 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:55 compute-1 ovn_controller[95352]: 2026-01-26T08:44:55Z|00042|binding|INFO|Releasing lport cca059f9-68bd-4ebe-b4a1-3e95ff36d483 from this chassis (sb_readonly=0)
Jan 26 08:44:55 compute-1 nova_compute[183083]: 2026-01-26 08:44:55.695 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:55.696 104632 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fe7e4448-8407-46f2-95a0-344b2f6ecfd7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fe7e4448-8407-46f2-95a0-344b2f6ecfd7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:55.697 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[261fec84-01c2-4df1-a7b6-7c7626fa2495]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:55.698 104632 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]: global
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]:     log         /dev/log local0 debug
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]:     log-tag     haproxy-metadata-proxy-fe7e4448-8407-46f2-95a0-344b2f6ecfd7
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]:     user        root
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]:     group       root
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]:     maxconn     1024
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]:     pidfile     /var/lib/neutron/external/pids/fe7e4448-8407-46f2-95a0-344b2f6ecfd7.pid.haproxy
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]:     daemon
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]: defaults
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]:     log global
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]:     mode http
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]:     option httplog
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]:     option dontlognull
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]:     option http-server-close
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]:     option forwardfor
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]:     retries                 3
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]:     timeout http-request    30s
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]:     timeout connect         30s
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]:     timeout client          32s
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]:     timeout server          32s
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]:     timeout http-keep-alive 30s
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]: listen listener
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]:     bind 169.254.169.254:80
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]:     http-request add-header X-OVN-Network-ID fe7e4448-8407-46f2-95a0-344b2f6ecfd7
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 26 08:44:55 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:55.699 104632 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7', 'env', 'PROCESS_TAG=haproxy-fe7e4448-8407-46f2-95a0-344b2f6ecfd7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fe7e4448-8407-46f2-95a0-344b2f6ecfd7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 26 08:44:55 compute-1 nova_compute[183083]: 2026-01-26 08:44:55.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:44:55 compute-1 nova_compute[183083]: 2026-01-26 08:44:55.962 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:44:56 compute-1 podman[212588]: 2026-01-26 08:44:56.205018249 +0000 UTC m=+0.073121355 container create 28388be22aa865f78d370f5502de54a7b3b9fe33ee81e457d8c62fae42fad38a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 26 08:44:56 compute-1 systemd[1]: Started libpod-conmon-28388be22aa865f78d370f5502de54a7b3b9fe33ee81e457d8c62fae42fad38a.scope.
Jan 26 08:44:56 compute-1 podman[212588]: 2026-01-26 08:44:56.16859436 +0000 UTC m=+0.036697456 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 08:44:56 compute-1 systemd[1]: Started libcrun container.
Jan 26 08:44:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c430780bddcb6da8f88ac4beb2a687f893fb67194cc7ddc359ac3a775a128955/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 08:44:56 compute-1 podman[212588]: 2026-01-26 08:44:56.30748451 +0000 UTC m=+0.175587656 container init 28388be22aa865f78d370f5502de54a7b3b9fe33ee81e457d8c62fae42fad38a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Jan 26 08:44:56 compute-1 podman[212588]: 2026-01-26 08:44:56.314009269 +0000 UTC m=+0.182112375 container start 28388be22aa865f78d370f5502de54a7b3b9fe33ee81e457d8c62fae42fad38a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 26 08:44:56 compute-1 neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7[212603]: [NOTICE]   (212607) : New worker (212609) forked
Jan 26 08:44:56 compute-1 neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7[212603]: [NOTICE]   (212607) : Loading success.
Jan 26 08:44:56 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:56.924 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:69:49:3d 192.168.0.2 2001::f816:3eff:fe69:493d'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.2/24 2001::f816:3eff:fe69:493d/64', 'neutron:device_id': 'ovnmeta-7abdae0f-d781-46cd-b99d-b879dccae5ad', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7abdae0f-d781-46cd-b99d-b879dccae5ad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71cced1777f24868932d789154ff04a0', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aef3833f-010d-4949-aeea-8a00ccfdc96d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b9b3d31f-00ba-452f-b887-73c2420cdc67) old=Port_Binding(mac=['fa:16:3e:69:49:3d 192.168.0.2'], external_ids={'neutron:cidrs': '192.168.0.2/24', 'neutron:device_id': 'ovnmeta-7abdae0f-d781-46cd-b99d-b879dccae5ad', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7abdae0f-d781-46cd-b99d-b879dccae5ad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71cced1777f24868932d789154ff04a0', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:44:56 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:56.925 104632 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b9b3d31f-00ba-452f-b887-73c2420cdc67 in datapath 7abdae0f-d781-46cd-b99d-b879dccae5ad updated
Jan 26 08:44:56 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:56.927 104632 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7abdae0f-d781-46cd-b99d-b879dccae5ad, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 26 08:44:56 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:44:56.928 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[d6cdafee-5423-491c-b487-d429a0bd5556]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:44:56 compute-1 nova_compute[183083]: 2026-01-26 08:44:56.961 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:44:57 compute-1 nova_compute[183083]: 2026-01-26 08:44:57.948 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:44:57 compute-1 nova_compute[183083]: 2026-01-26 08:44:57.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:44:58 compute-1 nova_compute[183083]: 2026-01-26 08:44:58.489 183087 DEBUG oslo_concurrency.lockutils [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Acquiring lock "99a22969-2a75-47c5-b7e5-70e7bbb9f1e7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:44:58 compute-1 nova_compute[183083]: 2026-01-26 08:44:58.490 183087 DEBUG oslo_concurrency.lockutils [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Lock "99a22969-2a75-47c5-b7e5-70e7bbb9f1e7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:44:58 compute-1 nova_compute[183083]: 2026-01-26 08:44:58.510 183087 DEBUG nova.compute.manager [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:44:58 compute-1 nova_compute[183083]: 2026-01-26 08:44:58.611 183087 DEBUG oslo_concurrency.lockutils [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:44:58 compute-1 nova_compute[183083]: 2026-01-26 08:44:58.612 183087 DEBUG oslo_concurrency.lockutils [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:44:58 compute-1 nova_compute[183083]: 2026-01-26 08:44:58.621 183087 DEBUG nova.virt.hardware [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:44:58 compute-1 nova_compute[183083]: 2026-01-26 08:44:58.621 183087 INFO nova.compute.claims [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:44:58 compute-1 nova_compute[183083]: 2026-01-26 08:44:58.770 183087 DEBUG nova.compute.provider_tree [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Updating inventory in ProviderTree for provider 5203935e-446c-4e03-93fa-4c60d651e045 with inventory: {'MEMORY_MB': {'total': 15731, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 119, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 26 08:44:58 compute-1 nova_compute[183083]: 2026-01-26 08:44:58.798 183087 ERROR nova.scheduler.client.report [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [req-e2fd24b3-64ad-431d-962a-aac71c7225ba] Failed to update inventory to [{'MEMORY_MB': {'total': 15731, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 119, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 5203935e-446c-4e03-93fa-4c60d651e045.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-e2fd24b3-64ad-431d-962a-aac71c7225ba"}]}
Jan 26 08:44:58 compute-1 nova_compute[183083]: 2026-01-26 08:44:58.815 183087 DEBUG nova.scheduler.client.report [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Refreshing inventories for resource provider 5203935e-446c-4e03-93fa-4c60d651e045 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 26 08:44:58 compute-1 nova_compute[183083]: 2026-01-26 08:44:58.836 183087 DEBUG nova.scheduler.client.report [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Updating ProviderTree inventory for provider 5203935e-446c-4e03-93fa-4c60d651e045 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 0, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 26 08:44:58 compute-1 nova_compute[183083]: 2026-01-26 08:44:58.836 183087 DEBUG nova.compute.provider_tree [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Updating inventory in ProviderTree for provider 5203935e-446c-4e03-93fa-4c60d651e045 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 0, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 26 08:44:58 compute-1 nova_compute[183083]: 2026-01-26 08:44:58.854 183087 DEBUG nova.scheduler.client.report [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Refreshing aggregate associations for resource provider 5203935e-446c-4e03-93fa-4c60d651e045, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 26 08:44:58 compute-1 nova_compute[183083]: 2026-01-26 08:44:58.883 183087 DEBUG nova.scheduler.client.report [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Refreshing trait associations for resource provider 5203935e-446c-4e03-93fa-4c60d651e045, traits: COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE4A,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SVM,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AVX,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX2,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_FMA3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,HW_CPU_X86_SSE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE41,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_BMI,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AESNI,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 26 08:44:58 compute-1 nova_compute[183083]: 2026-01-26 08:44:58.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:44:58 compute-1 nova_compute[183083]: 2026-01-26 08:44:58.951 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 08:44:58 compute-1 nova_compute[183083]: 2026-01-26 08:44:58.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 08:44:58 compute-1 nova_compute[183083]: 2026-01-26 08:44:58.959 183087 DEBUG nova.compute.provider_tree [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Updating inventory in ProviderTree for provider 5203935e-446c-4e03-93fa-4c60d651e045 with inventory: {'MEMORY_MB': {'total': 15731, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 119, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 26 08:44:58 compute-1 nova_compute[183083]: 2026-01-26 08:44:58.986 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 26 08:44:59 compute-1 nova_compute[183083]: 2026-01-26 08:44:59.020 183087 DEBUG nova.scheduler.client.report [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Updated inventory for provider 5203935e-446c-4e03-93fa-4c60d651e045 with generation 7 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 15731, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 119, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Jan 26 08:44:59 compute-1 nova_compute[183083]: 2026-01-26 08:44:59.021 183087 DEBUG nova.compute.provider_tree [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Updating resource provider 5203935e-446c-4e03-93fa-4c60d651e045 generation from 7 to 8 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 26 08:44:59 compute-1 nova_compute[183083]: 2026-01-26 08:44:59.021 183087 DEBUG nova.compute.provider_tree [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Updating inventory in ProviderTree for provider 5203935e-446c-4e03-93fa-4c60d651e045 with inventory: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 26 08:44:59 compute-1 nova_compute[183083]: 2026-01-26 08:44:59.047 183087 DEBUG oslo_concurrency.lockutils [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.435s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:44:59 compute-1 nova_compute[183083]: 2026-01-26 08:44:59.049 183087 DEBUG nova.compute.manager [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:44:59 compute-1 nova_compute[183083]: 2026-01-26 08:44:59.139 183087 DEBUG nova.compute.manager [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:44:59 compute-1 nova_compute[183083]: 2026-01-26 08:44:59.139 183087 DEBUG nova.network.neutron [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:44:59 compute-1 nova_compute[183083]: 2026-01-26 08:44:59.190 183087 INFO nova.virt.libvirt.driver [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:44:59 compute-1 nova_compute[183083]: 2026-01-26 08:44:59.218 183087 DEBUG nova.compute.manager [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:44:59 compute-1 nova_compute[183083]: 2026-01-26 08:44:59.300 183087 DEBUG nova.compute.manager [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:44:59 compute-1 nova_compute[183083]: 2026-01-26 08:44:59.302 183087 DEBUG nova.virt.libvirt.driver [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:44:59 compute-1 nova_compute[183083]: 2026-01-26 08:44:59.303 183087 INFO nova.virt.libvirt.driver [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Creating image(s)
Jan 26 08:44:59 compute-1 nova_compute[183083]: 2026-01-26 08:44:59.304 183087 DEBUG oslo_concurrency.lockutils [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Acquiring lock "/var/lib/nova/instances/99a22969-2a75-47c5-b7e5-70e7bbb9f1e7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:44:59 compute-1 nova_compute[183083]: 2026-01-26 08:44:59.304 183087 DEBUG oslo_concurrency.lockutils [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Lock "/var/lib/nova/instances/99a22969-2a75-47c5-b7e5-70e7bbb9f1e7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:44:59 compute-1 nova_compute[183083]: 2026-01-26 08:44:59.305 183087 DEBUG oslo_concurrency.lockutils [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Lock "/var/lib/nova/instances/99a22969-2a75-47c5-b7e5-70e7bbb9f1e7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:44:59 compute-1 nova_compute[183083]: 2026-01-26 08:44:59.306 183087 DEBUG oslo_concurrency.lockutils [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:44:59 compute-1 nova_compute[183083]: 2026-01-26 08:44:59.306 183087 DEBUG oslo_concurrency.lockutils [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:44:59 compute-1 nova_compute[183083]: 2026-01-26 08:44:59.321 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "refresh_cache-52d0b676-cf9c-4840-8b66-74ca8b13e2af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:44:59 compute-1 nova_compute[183083]: 2026-01-26 08:44:59.322 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquired lock "refresh_cache-52d0b676-cf9c-4840-8b66-74ca8b13e2af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:44:59 compute-1 nova_compute[183083]: 2026-01-26 08:44:59.322 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 26 08:44:59 compute-1 nova_compute[183083]: 2026-01-26 08:44:59.322 183087 DEBUG nova.objects.instance [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 52d0b676-cf9c-4840-8b66-74ca8b13e2af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:44:59 compute-1 nova_compute[183083]: 2026-01-26 08:44:59.369 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:44:59 compute-1 nova_compute[183083]: 2026-01-26 08:44:59.672 183087 INFO nova.compute.manager [None req-bc96fc64-beeb-4007-8338-7868b55aab94 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Get console output
Jan 26 08:44:59 compute-1 nova_compute[183083]: 2026-01-26 08:44:59.688 183087 DEBUG nova.policy [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '64bdc9f771e449a8930bafc62d000e64', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e482dc8c944c4dc1ba301e69d00ec101', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.465 183087 DEBUG oslo_concurrency.lockutils [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Instance failed to spawn: nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Traceback (most recent call last):
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 170, in do_image_deep_inspection
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7]     raise exception.ImageUnacceptable(
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image content does not match disk_format
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] 
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] During handling of the above exception, another exception occurred:
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] 
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Traceback (most recent call last):
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7]     yield resources
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7]     self.driver.spawn(context, instance, image_meta,
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4384, in spawn
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7]     created_instance_dir, created_disks = self._create_image(
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4786, in _create_image
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7]     created_disks = self._create_and_inject_local_root(
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4917, in _create_and_inject_local_root
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7]     self._try_fetch_image_cache(backend, fetch_func, context,
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 10963, in _try_fetch_image_cache
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7]     image.cache(fetch_func=fetch_func,
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 288, in cache
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7]     self.create_image(fetch_func_sync, base, size,
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 663, in create_image
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7]     prepare_template(target=base, *args, **kwargs)
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7]   File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7]     return f(*args, **kwargs)
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 285, in fetch_func_sync
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7]     fetch_func(target=target, *args, **kwargs)
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/utils.py", line 482, in fetch_image
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7]     images.fetch_to_raw(context, image_id, target, trusted_certs)
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 199, in fetch_to_raw
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7]     force_format = do_image_deep_inspection(img, image_href, path_tmp)
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 180, in do_image_deep_inspection
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7]     raise exception.ImageUnacceptable(
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.466 183087 ERROR nova.compute.manager [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] 
Jan 26 08:45:00 compute-1 nova_compute[183083]: 2026-01-26 08:45:00.490 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:02 compute-1 podman[212619]: 2026-01-26 08:45:02.857027532 +0000 UTC m=+0.099255800 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 26 08:45:03 compute-1 nova_compute[183083]: 2026-01-26 08:45:03.033 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Updating instance_info_cache with network_info: [{"id": "fe874242-d6f2-4922-8e79-b6545c0e8446", "address": "fa:16:3e:48:29:f7", "network": {"id": "fe7e4448-8407-46f2-95a0-344b2f6ecfd7", "bridge": "br-int", "label": "tempest-test-network--1557689353", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "605ef5b310d9405faa10f9c8f78d897f", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe874242-d6", "ovs_interfaceid": "fe874242-d6f2-4922-8e79-b6545c0e8446", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:45:03 compute-1 nova_compute[183083]: 2026-01-26 08:45:03.059 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Releasing lock "refresh_cache-52d0b676-cf9c-4840-8b66-74ca8b13e2af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:45:03 compute-1 nova_compute[183083]: 2026-01-26 08:45:03.060 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 26 08:45:03 compute-1 nova_compute[183083]: 2026-01-26 08:45:03.061 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:45:03 compute-1 nova_compute[183083]: 2026-01-26 08:45:03.061 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:45:03 compute-1 nova_compute[183083]: 2026-01-26 08:45:03.062 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 08:45:03 compute-1 nova_compute[183083]: 2026-01-26 08:45:03.062 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:45:03 compute-1 nova_compute[183083]: 2026-01-26 08:45:03.087 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:03 compute-1 nova_compute[183083]: 2026-01-26 08:45:03.087 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:03 compute-1 nova_compute[183083]: 2026-01-26 08:45:03.088 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:03 compute-1 nova_compute[183083]: 2026-01-26 08:45:03.088 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 08:45:03 compute-1 nova_compute[183083]: 2026-01-26 08:45:03.102 183087 DEBUG nova.network.neutron [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Successfully updated port: 97472a0b-2098-4d62-80c5-4c046df1dec0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:45:03 compute-1 nova_compute[183083]: 2026-01-26 08:45:03.115 183087 DEBUG oslo_concurrency.lockutils [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Acquiring lock "refresh_cache-99a22969-2a75-47c5-b7e5-70e7bbb9f1e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:45:03 compute-1 nova_compute[183083]: 2026-01-26 08:45:03.116 183087 DEBUG oslo_concurrency.lockutils [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Acquired lock "refresh_cache-99a22969-2a75-47c5-b7e5-70e7bbb9f1e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:45:03 compute-1 nova_compute[183083]: 2026-01-26 08:45:03.116 183087 DEBUG nova.network.neutron [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:45:03 compute-1 nova_compute[183083]: 2026-01-26 08:45:03.190 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:45:03 compute-1 nova_compute[183083]: 2026-01-26 08:45:03.265 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:45:03 compute-1 nova_compute[183083]: 2026-01-26 08:45:03.267 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:45:03 compute-1 nova_compute[183083]: 2026-01-26 08:45:03.291 183087 DEBUG nova.compute.manager [req-3493468a-013d-4654-ac1a-47fb6951bf2b req-08f29765-ea5d-4147-8cf9-9f519037d02b 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Received event network-changed-97472a0b-2098-4d62-80c5-4c046df1dec0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:45:03 compute-1 nova_compute[183083]: 2026-01-26 08:45:03.292 183087 DEBUG nova.compute.manager [req-3493468a-013d-4654-ac1a-47fb6951bf2b req-08f29765-ea5d-4147-8cf9-9f519037d02b 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Refreshing instance network info cache due to event network-changed-97472a0b-2098-4d62-80c5-4c046df1dec0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:45:03 compute-1 nova_compute[183083]: 2026-01-26 08:45:03.293 183087 DEBUG oslo_concurrency.lockutils [req-3493468a-013d-4654-ac1a-47fb6951bf2b req-08f29765-ea5d-4147-8cf9-9f519037d02b 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-99a22969-2a75-47c5-b7e5-70e7bbb9f1e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:45:03 compute-1 nova_compute[183083]: 2026-01-26 08:45:03.322 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:45:03 compute-1 nova_compute[183083]: 2026-01-26 08:45:03.374 183087 DEBUG nova.network.neutron [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:45:03 compute-1 nova_compute[183083]: 2026-01-26 08:45:03.514 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:45:03 compute-1 nova_compute[183083]: 2026-01-26 08:45:03.516 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13577MB free_disk=113.0742301940918GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 08:45:03 compute-1 nova_compute[183083]: 2026-01-26 08:45:03.516 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:03 compute-1 nova_compute[183083]: 2026-01-26 08:45:03.516 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:03 compute-1 nova_compute[183083]: 2026-01-26 08:45:03.623 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Instance 52d0b676-cf9c-4840-8b66-74ca8b13e2af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 08:45:03 compute-1 nova_compute[183083]: 2026-01-26 08:45:03.624 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Instance 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 08:45:03 compute-1 nova_compute[183083]: 2026-01-26 08:45:03.624 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 08:45:03 compute-1 nova_compute[183083]: 2026-01-26 08:45:03.625 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1664MB phys_disk=119GB used_disk=11GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 08:45:03 compute-1 ovn_controller[95352]: 2026-01-26T08:45:03Z|00043|pinctrl|WARN|Dropped 4079 log messages in last 60 seconds (most recently, 0 seconds ago) due to excessive rate
Jan 26 08:45:03 compute-1 ovn_controller[95352]: 2026-01-26T08:45:03Z|00044|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 08:45:03 compute-1 ovn_controller[95352]: 2026-01-26T08:45:03Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:48:29:f7 10.100.0.24
Jan 26 08:45:03 compute-1 ovn_controller[95352]: 2026-01-26T08:45:03Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:48:29:f7 10.100.0.24
Jan 26 08:45:03 compute-1 nova_compute[183083]: 2026-01-26 08:45:03.733 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:45:03 compute-1 nova_compute[183083]: 2026-01-26 08:45:03.755 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:45:03 compute-1 nova_compute[183083]: 2026-01-26 08:45:03.776 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 08:45:03 compute-1 nova_compute[183083]: 2026-01-26 08:45:03.777 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.261s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:04 compute-1 nova_compute[183083]: 2026-01-26 08:45:04.372 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:04 compute-1 nova_compute[183083]: 2026-01-26 08:45:04.773 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:45:04 compute-1 nova_compute[183083]: 2026-01-26 08:45:04.822 183087 INFO nova.compute.manager [None req-92e9741e-771b-4165-8bfc-ba9a53eae5f4 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Get console output
Jan 26 08:45:04 compute-1 nova_compute[183083]: 2026-01-26 08:45:04.830 212375 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 26 08:45:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:05.296 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:05.297 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:05.297 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.528 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.739 183087 DEBUG nova.network.neutron [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Updating instance_info_cache with network_info: [{"id": "97472a0b-2098-4d62-80c5-4c046df1dec0", "address": "fa:16:3e:1b:10:83", "network": {"id": "b8422cbe-c8cc-4fb1-a705-7effa0ccfe0e", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1086481355-test_extra_dhcp_opts_ipv4_ipv6_stateful", "subnets": [{"cidr": "2001:3::/64", "dns": [], "gateway": {"address": "2001:3::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:3::1ff", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateful"}}, {"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e482dc8c944c4dc1ba301e69d00ec101", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97472a0b-20", "ovs_interfaceid": "97472a0b-2098-4d62-80c5-4c046df1dec0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.758 183087 DEBUG oslo_concurrency.lockutils [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Releasing lock "refresh_cache-99a22969-2a75-47c5-b7e5-70e7bbb9f1e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.758 183087 DEBUG nova.compute.manager [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Instance network_info: |[{"id": "97472a0b-2098-4d62-80c5-4c046df1dec0", "address": "fa:16:3e:1b:10:83", "network": {"id": "b8422cbe-c8cc-4fb1-a705-7effa0ccfe0e", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1086481355-test_extra_dhcp_opts_ipv4_ipv6_stateful", "subnets": [{"cidr": "2001:3::/64", "dns": [], "gateway": {"address": "2001:3::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:3::1ff", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateful"}}, {"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e482dc8c944c4dc1ba301e69d00ec101", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97472a0b-20", "ovs_interfaceid": "97472a0b-2098-4d62-80c5-4c046df1dec0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.759 183087 DEBUG oslo_concurrency.lockutils [req-3493468a-013d-4654-ac1a-47fb6951bf2b req-08f29765-ea5d-4147-8cf9-9f519037d02b 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-99a22969-2a75-47c5-b7e5-70e7bbb9f1e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.759 183087 DEBUG nova.network.neutron [req-3493468a-013d-4654-ac1a-47fb6951bf2b req-08f29765-ea5d-4147-8cf9-9f519037d02b 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Refreshing network info cache for port 97472a0b-2098-4d62-80c5-4c046df1dec0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.761 183087 INFO nova.compute.manager [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Terminating instance
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.763 183087 DEBUG nova.compute.manager [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.768 183087 DEBUG nova.virt.libvirt.driver [-] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.768 183087 INFO nova.virt.libvirt.driver [-] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Instance destroyed successfully.
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.770 183087 DEBUG nova.virt.libvirt.vif [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:44:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ExtraDhcpOptionsTest-1086481355-test_extra_dhcp_opts_ipv4_ipv6_stateful',display_name='tempest-ExtraDhcpOptionsTest-1086481355-test_extra_dhcp_opts_ipv4_ipv6_stateful',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-extradhcpoptionstest-1086481355-test-extra-dhcp-opts-ip',id=10,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDbcjoHylAQtqg1Vl94jB8Y6+G/LV2rXvLmP1GPnqiVf11BnABlWcWIGa5wBxNzQ5L6qFsQZodIPzUEOuiX2g/H28q0JdYWcV+hRwWSFfpY7UxCYzUgdJhIE18mhMw36ag==',key_name='tempest-ExtraDhcpOptionsTest-1086481355',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e482dc8c944c4dc1ba301e69d00ec101',ramdisk_id='',reservation_id='r-fic21xrj',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ExtraDhcpOptionsTest-992702786',owner_user_name='tempest-ExtraDhcpOptionsTest-992702786-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:44:59Z,user_data=None,user_id='64bdc9f771e449a8930bafc62d000e64',uuid=99a22969-2a75-47c5-b7e5-70e7bbb9f1e7,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "97472a0b-2098-4d62-80c5-4c046df1dec0", "address": "fa:16:3e:1b:10:83", "network": {"id": "b8422cbe-c8cc-4fb1-a705-7effa0ccfe0e", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1086481355-test_extra_dhcp_opts_ipv4_ipv6_stateful", "subnets": [{"cidr": "2001:3::/64", "dns": [], "gateway": {"address": "2001:3::1", "type": "gateway", 
"version": 6, "meta": {}}, "ips": [{"address": "2001:3::1ff", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateful"}}, {"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e482dc8c944c4dc1ba301e69d00ec101", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97472a0b-20", "ovs_interfaceid": "97472a0b-2098-4d62-80c5-4c046df1dec0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.770 183087 DEBUG nova.network.os_vif_util [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Converting VIF {"id": "97472a0b-2098-4d62-80c5-4c046df1dec0", "address": "fa:16:3e:1b:10:83", "network": {"id": "b8422cbe-c8cc-4fb1-a705-7effa0ccfe0e", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1086481355-test_extra_dhcp_opts_ipv4_ipv6_stateful", "subnets": [{"cidr": "2001:3::/64", "dns": [], "gateway": {"address": "2001:3::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:3::1ff", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateful"}}, {"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e482dc8c944c4dc1ba301e69d00ec101", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97472a0b-20", "ovs_interfaceid": "97472a0b-2098-4d62-80c5-4c046df1dec0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.772 183087 DEBUG nova.network.os_vif_util [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1b:10:83,bridge_name='br-int',has_traffic_filtering=True,id=97472a0b-2098-4d62-80c5-4c046df1dec0,network=Network(b8422cbe-c8cc-4fb1-a705-7effa0ccfe0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap97472a0b-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.772 183087 DEBUG os_vif [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:10:83,bridge_name='br-int',has_traffic_filtering=True,id=97472a0b-2098-4d62-80c5-4c046df1dec0,network=Network(b8422cbe-c8cc-4fb1-a705-7effa0ccfe0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap97472a0b-20') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.776 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.777 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap97472a0b-20, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.777 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.785 183087 INFO os_vif [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:10:83,bridge_name='br-int',has_traffic_filtering=True,id=97472a0b-2098-4d62-80c5-4c046df1dec0,network=Network(b8422cbe-c8cc-4fb1-a705-7effa0ccfe0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap97472a0b-20')
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.786 183087 INFO nova.virt.libvirt.driver [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Deleting instance files /var/lib/nova/instances/99a22969-2a75-47c5-b7e5-70e7bbb9f1e7_del
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.787 183087 INFO nova.virt.libvirt.driver [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Deletion of /var/lib/nova/instances/99a22969-2a75-47c5-b7e5-70e7bbb9f1e7_del complete
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.835 183087 INFO nova.compute.manager [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Took 0.07 seconds to destroy the instance on the hypervisor.
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.837 183087 DEBUG nova.compute.claims [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Aborting claim: <nova.compute.claims.Claim object at 0x7f6c98471ee0> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.838 183087 DEBUG oslo_concurrency.lockutils [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.838 183087 DEBUG oslo_concurrency.lockutils [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.950 183087 DEBUG nova.compute.provider_tree [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.968 183087 DEBUG nova.scheduler.client.report [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.989 183087 DEBUG oslo_concurrency.lockutils [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.989 183087 DEBUG nova.compute.utils [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.990 183087 ERROR nova.compute.manager [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Build of instance 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format: nova.exception.BuildAbortException: Build of instance 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.991 183087 DEBUG nova.compute.manager [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.992 183087 DEBUG nova.virt.libvirt.vif [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-26T08:44:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ExtraDhcpOptionsTest-1086481355-test_extra_dhcp_opts_ipv4_ipv6_stateful',display_name='tempest-ExtraDhcpOptionsTest-1086481355-test_extra_dhcp_opts_ipv4_ipv6_stateful',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host=None,hostname='tempest-extradhcpoptionstest-1086481355-test-extra-dhcp-opts-ip',id=10,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDbcjoHylAQtqg1Vl94jB8Y6+G/LV2rXvLmP1GPnqiVf11BnABlWcWIGa5wBxNzQ5L6qFsQZodIPzUEOuiX2g/H28q0JdYWcV+hRwWSFfpY7UxCYzUgdJhIE18mhMw36ag==',key_name='tempest-ExtraDhcpOptionsTest-1086481355',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node=None,numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e482dc8c944c4dc1ba301e69d00ec101',ramdisk_id='',reservation_id='r-fic21xrj',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ExtraDhcpOptionsTest-992702786',owner_user_name='tempest-ExtraDhcpOptionsTest-992702786-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:45:05Z,user_data=None,user_id='64bdc9f771e449a8930bafc62d000e64',uuid=99a22969-2a75-47c5-b7e5-70e7bbb9f1e7,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "97472a0b-2098-4d62-80c5-4c046df1dec0", "address": "fa:16:3e:1b:10:83", "network": {"id": "b8422cbe-c8cc-4fb1-a705-7effa0ccfe0e", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1086481355-test_extra_dhcp_opts_ipv4_ipv6_stateful", "subnets": [{"cidr": "2001:3::/64", "dns": [], "gateway": {"address": "2001:3::1", "type": "gateway", "version": 
6, "meta": {}}, "ips": [{"address": "2001:3::1ff", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateful"}}, {"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e482dc8c944c4dc1ba301e69d00ec101", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97472a0b-20", "ovs_interfaceid": "97472a0b-2098-4d62-80c5-4c046df1dec0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.992 183087 DEBUG nova.network.os_vif_util [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Converting VIF {"id": "97472a0b-2098-4d62-80c5-4c046df1dec0", "address": "fa:16:3e:1b:10:83", "network": {"id": "b8422cbe-c8cc-4fb1-a705-7effa0ccfe0e", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1086481355-test_extra_dhcp_opts_ipv4_ipv6_stateful", "subnets": [{"cidr": "2001:3::/64", "dns": [], "gateway": {"address": "2001:3::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:3::1ff", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateful"}}, {"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e482dc8c944c4dc1ba301e69d00ec101", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97472a0b-20", "ovs_interfaceid": "97472a0b-2098-4d62-80c5-4c046df1dec0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.993 183087 DEBUG nova.network.os_vif_util [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1b:10:83,bridge_name='br-int',has_traffic_filtering=True,id=97472a0b-2098-4d62-80c5-4c046df1dec0,network=Network(b8422cbe-c8cc-4fb1-a705-7effa0ccfe0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap97472a0b-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.993 183087 DEBUG os_vif [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:10:83,bridge_name='br-int',has_traffic_filtering=True,id=97472a0b-2098-4d62-80c5-4c046df1dec0,network=Network(b8422cbe-c8cc-4fb1-a705-7effa0ccfe0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap97472a0b-20') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.995 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.995 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap97472a0b-20, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.995 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.997 183087 INFO os_vif [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:10:83,bridge_name='br-int',has_traffic_filtering=True,id=97472a0b-2098-4d62-80c5-4c046df1dec0,network=Network(b8422cbe-c8cc-4fb1-a705-7effa0ccfe0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap97472a0b-20')
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.998 183087 DEBUG nova.compute.manager [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.998 183087 DEBUG nova.compute.manager [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:45:05 compute-1 nova_compute[183083]: 2026-01-26 08:45:05.998 183087 DEBUG nova.network.neutron [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:45:07 compute-1 nova_compute[183083]: 2026-01-26 08:45:07.561 183087 DEBUG nova.network.neutron [req-3493468a-013d-4654-ac1a-47fb6951bf2b req-08f29765-ea5d-4147-8cf9-9f519037d02b 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Updated VIF entry in instance network info cache for port 97472a0b-2098-4d62-80c5-4c046df1dec0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:45:07 compute-1 nova_compute[183083]: 2026-01-26 08:45:07.562 183087 DEBUG nova.network.neutron [req-3493468a-013d-4654-ac1a-47fb6951bf2b req-08f29765-ea5d-4147-8cf9-9f519037d02b 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Updating instance_info_cache with network_info: [{"id": "97472a0b-2098-4d62-80c5-4c046df1dec0", "address": "fa:16:3e:1b:10:83", "network": {"id": "b8422cbe-c8cc-4fb1-a705-7effa0ccfe0e", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1086481355-test_extra_dhcp_opts_ipv4_ipv6_stateful", "subnets": [{"cidr": "2001:3::/64", "dns": [], "gateway": {"address": "2001:3::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:3::1ff", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateful"}}, {"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e482dc8c944c4dc1ba301e69d00ec101", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97472a0b-20", "ovs_interfaceid": "97472a0b-2098-4d62-80c5-4c046df1dec0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:45:07 compute-1 nova_compute[183083]: 2026-01-26 08:45:07.576 183087 DEBUG oslo_concurrency.lockutils [req-3493468a-013d-4654-ac1a-47fb6951bf2b req-08f29765-ea5d-4147-8cf9-9f519037d02b 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-99a22969-2a75-47c5-b7e5-70e7bbb9f1e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:45:09 compute-1 nova_compute[183083]: 2026-01-26 08:45:09.248 183087 DEBUG nova.network.neutron [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:45:09 compute-1 nova_compute[183083]: 2026-01-26 08:45:09.270 183087 INFO nova.compute.manager [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7] Took 3.27 seconds to deallocate network for instance.
Jan 26 08:45:09 compute-1 nova_compute[183083]: 2026-01-26 08:45:09.420 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:09 compute-1 nova_compute[183083]: 2026-01-26 08:45:09.462 183087 INFO nova.scheduler.client.report [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Deleted allocations for instance 99a22969-2a75-47c5-b7e5-70e7bbb9f1e7
Jan 26 08:45:09 compute-1 nova_compute[183083]: 2026-01-26 08:45:09.463 183087 DEBUG oslo_concurrency.lockutils [None req-03106bf6-8f82-4a07-ab70-7a6fa888c5ed 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Lock "99a22969-2a75-47c5-b7e5-70e7bbb9f1e7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.973s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:09 compute-1 nova_compute[183083]: 2026-01-26 08:45:09.496 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:09 compute-1 NetworkManager[55451]: <info>  [1769417109.4983] manager: (patch-br-int-to-provnet-149e76db-406a-40c9-b6a7-879b1da420de): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/24)
Jan 26 08:45:09 compute-1 NetworkManager[55451]: <info>  [1769417109.4995] device (patch-br-int-to-provnet-149e76db-406a-40c9-b6a7-879b1da420de)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 08:45:09 compute-1 NetworkManager[55451]: <warn>  [1769417109.4997] device (patch-br-int-to-provnet-149e76db-406a-40c9-b6a7-879b1da420de)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 08:45:09 compute-1 NetworkManager[55451]: <info>  [1769417109.5014] manager: (patch-provnet-149e76db-406a-40c9-b6a7-879b1da420de-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/25)
Jan 26 08:45:09 compute-1 NetworkManager[55451]: <info>  [1769417109.5024] device (patch-provnet-149e76db-406a-40c9-b6a7-879b1da420de-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 08:45:09 compute-1 NetworkManager[55451]: <warn>  [1769417109.5025] device (patch-provnet-149e76db-406a-40c9-b6a7-879b1da420de-to-br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 08:45:09 compute-1 NetworkManager[55451]: <info>  [1769417109.5040] manager: (patch-br-int-to-provnet-149e76db-406a-40c9-b6a7-879b1da420de): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Jan 26 08:45:09 compute-1 NetworkManager[55451]: <info>  [1769417109.5052] manager: (patch-provnet-149e76db-406a-40c9-b6a7-879b1da420de-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Jan 26 08:45:09 compute-1 NetworkManager[55451]: <info>  [1769417109.5060] device (patch-br-int-to-provnet-149e76db-406a-40c9-b6a7-879b1da420de)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 26 08:45:09 compute-1 NetworkManager[55451]: <info>  [1769417109.5066] device (patch-provnet-149e76db-406a-40c9-b6a7-879b1da420de-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 26 08:45:09 compute-1 nova_compute[183083]: 2026-01-26 08:45:09.822 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:09 compute-1 ovn_controller[95352]: 2026-01-26T08:45:09Z|00045|binding|INFO|Releasing lport cca059f9-68bd-4ebe-b4a1-3e95ff36d483 from this chassis (sb_readonly=0)
Jan 26 08:45:09 compute-1 nova_compute[183083]: 2026-01-26 08:45:09.867 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:10 compute-1 nova_compute[183083]: 2026-01-26 08:45:10.242 183087 DEBUG nova.compute.manager [req-063da58d-2529-463b-9228-e1da2ebf3b49 req-72b7e09a-4d7e-4970-a4c4-d3a5c61d14bc 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Received event network-changed-fe874242-d6f2-4922-8e79-b6545c0e8446 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:45:10 compute-1 nova_compute[183083]: 2026-01-26 08:45:10.243 183087 DEBUG nova.compute.manager [req-063da58d-2529-463b-9228-e1da2ebf3b49 req-72b7e09a-4d7e-4970-a4c4-d3a5c61d14bc 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Refreshing instance network info cache due to event network-changed-fe874242-d6f2-4922-8e79-b6545c0e8446. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:45:10 compute-1 nova_compute[183083]: 2026-01-26 08:45:10.243 183087 DEBUG oslo_concurrency.lockutils [req-063da58d-2529-463b-9228-e1da2ebf3b49 req-72b7e09a-4d7e-4970-a4c4-d3a5c61d14bc 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-52d0b676-cf9c-4840-8b66-74ca8b13e2af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:45:10 compute-1 nova_compute[183083]: 2026-01-26 08:45:10.244 183087 DEBUG oslo_concurrency.lockutils [req-063da58d-2529-463b-9228-e1da2ebf3b49 req-72b7e09a-4d7e-4970-a4c4-d3a5c61d14bc 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-52d0b676-cf9c-4840-8b66-74ca8b13e2af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:45:10 compute-1 nova_compute[183083]: 2026-01-26 08:45:10.244 183087 DEBUG nova.network.neutron [req-063da58d-2529-463b-9228-e1da2ebf3b49 req-72b7e09a-4d7e-4970-a4c4-d3a5c61d14bc 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Refreshing network info cache for port fe874242-d6f2-4922-8e79-b6545c0e8446 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:45:10 compute-1 nova_compute[183083]: 2026-01-26 08:45:10.419 183087 DEBUG oslo_concurrency.lockutils [None req-07aa5dc5-5a8d-4827-89b6-acbc9d2e7e24 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Acquiring lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:10 compute-1 nova_compute[183083]: 2026-01-26 08:45:10.420 183087 DEBUG oslo_concurrency.lockutils [None req-07aa5dc5-5a8d-4827-89b6-acbc9d2e7e24 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:10 compute-1 nova_compute[183083]: 2026-01-26 08:45:10.421 183087 INFO nova.compute.manager [None req-07aa5dc5-5a8d-4827-89b6-acbc9d2e7e24 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Rebooting instance
Jan 26 08:45:10 compute-1 nova_compute[183083]: 2026-01-26 08:45:10.519 183087 DEBUG oslo_concurrency.lockutils [None req-07aa5dc5-5a8d-4827-89b6-acbc9d2e7e24 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Acquiring lock "refresh_cache-52d0b676-cf9c-4840-8b66-74ca8b13e2af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:45:10 compute-1 nova_compute[183083]: 2026-01-26 08:45:10.529 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:12 compute-1 nova_compute[183083]: 2026-01-26 08:45:12.327 183087 DEBUG nova.network.neutron [req-063da58d-2529-463b-9228-e1da2ebf3b49 req-72b7e09a-4d7e-4970-a4c4-d3a5c61d14bc 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Updated VIF entry in instance network info cache for port fe874242-d6f2-4922-8e79-b6545c0e8446. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:45:12 compute-1 nova_compute[183083]: 2026-01-26 08:45:12.328 183087 DEBUG nova.network.neutron [req-063da58d-2529-463b-9228-e1da2ebf3b49 req-72b7e09a-4d7e-4970-a4c4-d3a5c61d14bc 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Updating instance_info_cache with network_info: [{"id": "fe874242-d6f2-4922-8e79-b6545c0e8446", "address": "fa:16:3e:48:29:f7", "network": {"id": "fe7e4448-8407-46f2-95a0-344b2f6ecfd7", "bridge": "br-int", "label": "tempest-test-network--1557689353", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "605ef5b310d9405faa10f9c8f78d897f", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe874242-d6", "ovs_interfaceid": "fe874242-d6f2-4922-8e79-b6545c0e8446", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:45:12 compute-1 nova_compute[183083]: 2026-01-26 08:45:12.350 183087 DEBUG oslo_concurrency.lockutils [req-063da58d-2529-463b-9228-e1da2ebf3b49 req-72b7e09a-4d7e-4970-a4c4-d3a5c61d14bc 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-52d0b676-cf9c-4840-8b66-74ca8b13e2af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:45:12 compute-1 nova_compute[183083]: 2026-01-26 08:45:12.352 183087 DEBUG oslo_concurrency.lockutils [None req-07aa5dc5-5a8d-4827-89b6-acbc9d2e7e24 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Acquired lock "refresh_cache-52d0b676-cf9c-4840-8b66-74ca8b13e2af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:45:12 compute-1 nova_compute[183083]: 2026-01-26 08:45:12.352 183087 DEBUG nova.network.neutron [None req-07aa5dc5-5a8d-4827-89b6-acbc9d2e7e24 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:45:13 compute-1 nova_compute[183083]: 2026-01-26 08:45:13.921 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:14 compute-1 nova_compute[183083]: 2026-01-26 08:45:14.423 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:14 compute-1 podman[212675]: 2026-01-26 08:45:14.816002094 +0000 UTC m=+0.071422553 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute)
Jan 26 08:45:14 compute-1 podman[212676]: 2026-01-26 08:45:14.844345914 +0000 UTC m=+0.081808850 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, version=9.6, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, io.buildah.version=1.33.7, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41)
Jan 26 08:45:15 compute-1 nova_compute[183083]: 2026-01-26 08:45:15.029 183087 DEBUG nova.network.neutron [None req-07aa5dc5-5a8d-4827-89b6-acbc9d2e7e24 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Updating instance_info_cache with network_info: [{"id": "fe874242-d6f2-4922-8e79-b6545c0e8446", "address": "fa:16:3e:48:29:f7", "network": {"id": "fe7e4448-8407-46f2-95a0-344b2f6ecfd7", "bridge": "br-int", "label": "tempest-test-network--1557689353", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "605ef5b310d9405faa10f9c8f78d897f", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe874242-d6", "ovs_interfaceid": "fe874242-d6f2-4922-8e79-b6545c0e8446", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:45:15 compute-1 nova_compute[183083]: 2026-01-26 08:45:15.058 183087 DEBUG oslo_concurrency.lockutils [None req-07aa5dc5-5a8d-4827-89b6-acbc9d2e7e24 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Releasing lock "refresh_cache-52d0b676-cf9c-4840-8b66-74ca8b13e2af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:45:15 compute-1 nova_compute[183083]: 2026-01-26 08:45:15.059 183087 DEBUG nova.compute.manager [None req-07aa5dc5-5a8d-4827-89b6-acbc9d2e7e24 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:45:15 compute-1 nova_compute[183083]: 2026-01-26 08:45:15.565 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:16 compute-1 nova_compute[183083]: 2026-01-26 08:45:16.079 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:16 compute-1 nova_compute[183083]: 2026-01-26 08:45:16.213 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:16 compute-1 nova_compute[183083]: 2026-01-26 08:45:16.729 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:17 compute-1 kernel: tapfe874242-d6 (unregistering): left promiscuous mode
Jan 26 08:45:17 compute-1 NetworkManager[55451]: <info>  [1769417117.4044] device (tapfe874242-d6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 08:45:17 compute-1 ovn_controller[95352]: 2026-01-26T08:45:17Z|00046|binding|INFO|Releasing lport fe874242-d6f2-4922-8e79-b6545c0e8446 from this chassis (sb_readonly=0)
Jan 26 08:45:17 compute-1 ovn_controller[95352]: 2026-01-26T08:45:17Z|00047|binding|INFO|Setting lport fe874242-d6f2-4922-8e79-b6545c0e8446 down in Southbound
Jan 26 08:45:17 compute-1 nova_compute[183083]: 2026-01-26 08:45:17.416 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:17 compute-1 ovn_controller[95352]: 2026-01-26T08:45:17Z|00048|binding|INFO|Removing iface tapfe874242-d6 ovn-installed in OVS
Jan 26 08:45:17 compute-1 nova_compute[183083]: 2026-01-26 08:45:17.422 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:17.426 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:48:29:f7 10.100.0.24'], port_security=['fa:16:3e:48:29:f7 10.100.0.24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.24/28', 'neutron:device_id': '52d0b676-cf9c-4840-8b66-74ca8b13e2af', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fe7e4448-8407-46f2-95a0-344b2f6ecfd7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '605ef5b310d9405faa10f9c8f78d897f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '18ebfc50-0648-455b-8cfa-279844a56dcf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.173'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=70edddae-db7d-4a4d-8983-c3a06c961ec1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=fe874242-d6f2-4922-8e79-b6545c0e8446) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:45:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:17.428 104632 INFO neutron.agent.ovn.metadata.agent [-] Port fe874242-d6f2-4922-8e79-b6545c0e8446 in datapath fe7e4448-8407-46f2-95a0-344b2f6ecfd7 unbound from our chassis
Jan 26 08:45:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:17.432 104632 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fe7e4448-8407-46f2-95a0-344b2f6ecfd7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 26 08:45:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:17.433 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[ae5220e4-f326-498e-923b-894dcd51dc0a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:17.434 104632 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7 namespace which is not needed anymore
Jan 26 08:45:17 compute-1 nova_compute[183083]: 2026-01-26 08:45:17.440 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:17 compute-1 nova_compute[183083]: 2026-01-26 08:45:17.493 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:17 compute-1 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000009.scope: Deactivated successfully.
Jan 26 08:45:17 compute-1 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000009.scope: Consumed 13.112s CPU time.
Jan 26 08:45:17 compute-1 systemd-machined[154360]: Machine qemu-1-instance-00000009 terminated.
Jan 26 08:45:17 compute-1 neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7[212603]: [NOTICE]   (212607) : haproxy version is 2.8.14-c23fe91
Jan 26 08:45:17 compute-1 neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7[212603]: [NOTICE]   (212607) : path to executable is /usr/sbin/haproxy
Jan 26 08:45:17 compute-1 neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7[212603]: [WARNING]  (212607) : Exiting Master process...
Jan 26 08:45:17 compute-1 neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7[212603]: [ALERT]    (212607) : Current worker (212609) exited with code 143 (Terminated)
Jan 26 08:45:17 compute-1 neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7[212603]: [WARNING]  (212607) : All workers exited. Exiting... (0)
Jan 26 08:45:17 compute-1 systemd[1]: libpod-28388be22aa865f78d370f5502de54a7b3b9fe33ee81e457d8c62fae42fad38a.scope: Deactivated successfully.
Jan 26 08:45:17 compute-1 podman[212741]: 2026-01-26 08:45:17.595929462 +0000 UTC m=+0.049707413 container died 28388be22aa865f78d370f5502de54a7b3b9fe33ee81e457d8c62fae42fad38a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 26 08:45:17 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-28388be22aa865f78d370f5502de54a7b3b9fe33ee81e457d8c62fae42fad38a-userdata-shm.mount: Deactivated successfully.
Jan 26 08:45:17 compute-1 systemd[1]: var-lib-containers-storage-overlay-c430780bddcb6da8f88ac4beb2a687f893fb67194cc7ddc359ac3a775a128955-merged.mount: Deactivated successfully.
Jan 26 08:45:17 compute-1 podman[212741]: 2026-01-26 08:45:17.650403945 +0000 UTC m=+0.104181906 container cleanup 28388be22aa865f78d370f5502de54a7b3b9fe33ee81e457d8c62fae42fad38a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 26 08:45:17 compute-1 systemd[1]: libpod-conmon-28388be22aa865f78d370f5502de54a7b3b9fe33ee81e457d8c62fae42fad38a.scope: Deactivated successfully.
Jan 26 08:45:17 compute-1 podman[212785]: 2026-01-26 08:45:17.730388737 +0000 UTC m=+0.050248686 container remove 28388be22aa865f78d370f5502de54a7b3b9fe33ee81e457d8c62fae42fad38a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 26 08:45:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:17.739 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[a5a9ae86-0b47-4154-8dfe-c5f480a2bffb]: (4, ('Mon Jan 26 08:45:17 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7 (28388be22aa865f78d370f5502de54a7b3b9fe33ee81e457d8c62fae42fad38a)\n28388be22aa865f78d370f5502de54a7b3b9fe33ee81e457d8c62fae42fad38a\nMon Jan 26 08:45:17 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7 (28388be22aa865f78d370f5502de54a7b3b9fe33ee81e457d8c62fae42fad38a)\n28388be22aa865f78d370f5502de54a7b3b9fe33ee81e457d8c62fae42fad38a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:17.742 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[b3ce7b97-9864-45e4-b28d-8cdf0c951f1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:17.743 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe7e4448-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:45:17 compute-1 nova_compute[183083]: 2026-01-26 08:45:17.746 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:17 compute-1 kernel: tapfe7e4448-80: left promiscuous mode
Jan 26 08:45:17 compute-1 nova_compute[183083]: 2026-01-26 08:45:17.766 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:17.770 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[b597684a-7872-4345-8371-e7da42da3836]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:17.784 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[84be9787-cd47-4121-8520-2f5b5a8cfd52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:17.786 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[9b4590fb-b835-4a09-a3e9-5ab6d2a2baf3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:17.806 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[a481507d-0caf-456e-b51f-af93c2de7143]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 343594, 'reachable_time': 17162, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212806, 'error': None, 'target': 'ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:17.818 105024 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 26 08:45:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:17.819 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[c5f91dda-40c7-4f79-a4ac-4a9a308d4557]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:17 compute-1 systemd[1]: run-netns-ovnmeta\x2dfe7e4448\x2d8407\x2d46f2\x2d95a0\x2d344b2f6ecfd7.mount: Deactivated successfully.
Jan 26 08:45:18 compute-1 nova_compute[183083]: 2026-01-26 08:45:18.205 183087 INFO nova.virt.libvirt.driver [None req-07aa5dc5-5a8d-4827-89b6-acbc9d2e7e24 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Instance shutdown successfully.
Jan 26 08:45:18 compute-1 kernel: tapfe874242-d6: entered promiscuous mode
Jan 26 08:45:18 compute-1 systemd-udevd[212720]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 08:45:18 compute-1 NetworkManager[55451]: <info>  [1769417118.2887] manager: (tapfe874242-d6): new Tun device (/org/freedesktop/NetworkManager/Devices/28)
Jan 26 08:45:18 compute-1 ovn_controller[95352]: 2026-01-26T08:45:18Z|00049|binding|INFO|Claiming lport fe874242-d6f2-4922-8e79-b6545c0e8446 for this chassis.
Jan 26 08:45:18 compute-1 ovn_controller[95352]: 2026-01-26T08:45:18Z|00050|binding|INFO|fe874242-d6f2-4922-8e79-b6545c0e8446: Claiming fa:16:3e:48:29:f7 10.100.0.24
Jan 26 08:45:18 compute-1 nova_compute[183083]: 2026-01-26 08:45:18.289 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:18.299 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:48:29:f7 10.100.0.24'], port_security=['fa:16:3e:48:29:f7 10.100.0.24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.24/28', 'neutron:device_id': '52d0b676-cf9c-4840-8b66-74ca8b13e2af', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fe7e4448-8407-46f2-95a0-344b2f6ecfd7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '605ef5b310d9405faa10f9c8f78d897f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '18ebfc50-0648-455b-8cfa-279844a56dcf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.173'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=70edddae-db7d-4a4d-8983-c3a06c961ec1, chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=fe874242-d6f2-4922-8e79-b6545c0e8446) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:18.301 104632 INFO neutron.agent.ovn.metadata.agent [-] Port fe874242-d6f2-4922-8e79-b6545c0e8446 in datapath fe7e4448-8407-46f2-95a0-344b2f6ecfd7 bound to our chassis
Jan 26 08:45:18 compute-1 NetworkManager[55451]: <info>  [1769417118.3065] device (tapfe874242-d6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:18.308 104632 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fe7e4448-8407-46f2-95a0-344b2f6ecfd7
Jan 26 08:45:18 compute-1 NetworkManager[55451]: <info>  [1769417118.3088] device (tapfe874242-d6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 08:45:18 compute-1 ovn_controller[95352]: 2026-01-26T08:45:18Z|00051|binding|INFO|Setting lport fe874242-d6f2-4922-8e79-b6545c0e8446 ovn-installed in OVS
Jan 26 08:45:18 compute-1 ovn_controller[95352]: 2026-01-26T08:45:18Z|00052|binding|INFO|Setting lport fe874242-d6f2-4922-8e79-b6545c0e8446 up in Southbound
Jan 26 08:45:18 compute-1 nova_compute[183083]: 2026-01-26 08:45:18.319 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:18.327 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[42a56d6c-e6a8-4dd6-8d4d-18ee2f04b0d9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:18.328 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfe7e4448-81 in ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:18.330 212483 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfe7e4448-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:18.330 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[c5740dae-fb79-4aec-8826-ef77f3cd0999]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:18.331 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[c2905dcb-fddc-42e7-8d75-2b39d1d87331]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:18.350 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[f08384a3-f61e-451c-a93b-b08a724a2d8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:18 compute-1 systemd-machined[154360]: New machine qemu-2-instance-00000009.
Jan 26 08:45:18 compute-1 systemd[1]: Started Virtual Machine qemu-2-instance-00000009.
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:18.383 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[f0f8cd99-4f36-4f24-83f6-93047fcb5519]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:18.422 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[75b87c7d-8534-484a-ad81-4c9933275876]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:18 compute-1 NetworkManager[55451]: <info>  [1769417118.4311] manager: (tapfe7e4448-80): new Veth device (/org/freedesktop/NetworkManager/Devices/29)
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:18.430 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[9b52e6fe-3145-4e3e-9d7e-5214864723dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:18.477 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[c1e956f9-b27f-463d-8cf5-1c92f6efaa2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:18.481 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[f079d11e-2e4e-4e3d-8db9-3fbdf33741e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:18 compute-1 NetworkManager[55451]: <info>  [1769417118.5159] device (tapfe7e4448-80): carrier: link connected
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:18.524 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[05401c0c-843d-49bd-9091-b65dd77af989]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:18.552 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[a530405d-2764-43e7-82cd-a2bbdb796def]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfe7e4448-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:af:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 345912, 'reachable_time': 23644, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212853, 'error': None, 'target': 'ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:18.575 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[25e75007-b646-43bd-a28b-902225072a3f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed1:aff4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 345912, 'tstamp': 345912}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212854, 'error': None, 'target': 'ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:18.600 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[71dbbe42-605f-413d-b877-600ca69b791d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfe7e4448-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:af:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 345912, 'reachable_time': 23644, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212855, 'error': None, 'target': 'ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:18.642 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[91f9ce26-6188-4f94-ae4e-6fe0e7cb9860]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:18.710 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[665da459-1c29-4315-8d54-5a97adf73d7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:18.712 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe7e4448-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:18.712 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:18.713 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfe7e4448-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:45:18 compute-1 kernel: tapfe7e4448-80: entered promiscuous mode
Jan 26 08:45:18 compute-1 nova_compute[183083]: 2026-01-26 08:45:18.715 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:18 compute-1 NetworkManager[55451]: <info>  [1769417118.7194] manager: (tapfe7e4448-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:18.720 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfe7e4448-80, col_values=(('external_ids', {'iface-id': 'cca059f9-68bd-4ebe-b4a1-3e95ff36d483'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:45:18 compute-1 nova_compute[183083]: 2026-01-26 08:45:18.720 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:18 compute-1 ovn_controller[95352]: 2026-01-26T08:45:18Z|00053|binding|INFO|Releasing lport cca059f9-68bd-4ebe-b4a1-3e95ff36d483 from this chassis (sb_readonly=0)
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:18.746 104632 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fe7e4448-8407-46f2-95a0-344b2f6ecfd7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fe7e4448-8407-46f2-95a0-344b2f6ecfd7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 26 08:45:18 compute-1 nova_compute[183083]: 2026-01-26 08:45:18.745 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:18 compute-1 nova_compute[183083]: 2026-01-26 08:45:18.746 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:18.747 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[b604c039-c0bf-4307-850b-4aa99f5d8cf3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:18.748 104632 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]: global
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]:     log         /dev/log local0 debug
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]:     log-tag     haproxy-metadata-proxy-fe7e4448-8407-46f2-95a0-344b2f6ecfd7
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]:     user        root
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]:     group       root
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]:     maxconn     1024
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]:     pidfile     /var/lib/neutron/external/pids/fe7e4448-8407-46f2-95a0-344b2f6ecfd7.pid.haproxy
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]:     daemon
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]: defaults
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]:     log global
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]:     mode http
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]:     option httplog
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]:     option dontlognull
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]:     option http-server-close
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]:     option forwardfor
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]:     retries                 3
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]:     timeout http-request    30s
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]:     timeout connect         30s
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]:     timeout client          32s
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]:     timeout server          32s
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]:     timeout http-keep-alive 30s
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]: listen listener
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]:     bind 169.254.169.254:80
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]:     http-request add-header X-OVN-Network-ID fe7e4448-8407-46f2-95a0-344b2f6ecfd7
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 26 08:45:18 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:18.749 104632 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7', 'env', 'PROCESS_TAG=haproxy-fe7e4448-8407-46f2-95a0-344b2f6ecfd7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fe7e4448-8407-46f2-95a0-344b2f6ecfd7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 26 08:45:18 compute-1 nova_compute[183083]: 2026-01-26 08:45:18.926 183087 DEBUG nova.virt.libvirt.host [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Removed pending event for 52d0b676-cf9c-4840-8b66-74ca8b13e2af due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 26 08:45:18 compute-1 nova_compute[183083]: 2026-01-26 08:45:18.927 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417118.9251072, 52d0b676-cf9c-4840-8b66-74ca8b13e2af => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:45:18 compute-1 nova_compute[183083]: 2026-01-26 08:45:18.927 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] VM Resumed (Lifecycle Event)
Jan 26 08:45:18 compute-1 nova_compute[183083]: 2026-01-26 08:45:18.932 183087 INFO nova.virt.libvirt.driver [-] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Instance running successfully.
Jan 26 08:45:18 compute-1 nova_compute[183083]: 2026-01-26 08:45:18.932 183087 INFO nova.virt.libvirt.driver [None req-07aa5dc5-5a8d-4827-89b6-acbc9d2e7e24 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Instance soft rebooted successfully.
Jan 26 08:45:18 compute-1 nova_compute[183083]: 2026-01-26 08:45:18.933 183087 DEBUG nova.compute.manager [None req-07aa5dc5-5a8d-4827-89b6-acbc9d2e7e24 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:45:18 compute-1 ovn_controller[95352]: 2026-01-26T08:45:18Z|00054|memory|INFO|peak resident set size grew 71% in last 1010.4 seconds, from 16384 kB to 28032 kB
Jan 26 08:45:18 compute-1 ovn_controller[95352]: 2026-01-26T08:45:18Z|00055|memory|INFO|idl-cells-OVN_Southbound:16066 idl-cells-Open_vSwitch:756 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:591 lflow-cache-entries-cache-matches:381 lflow-cache-size-KB:2639 local_datapath_usage-KB:4 ofctrl_desired_flow_usage-KB:1011 ofctrl_installed_flow_usage-KB:738 ofctrl_sb_flow_ref_usage-KB:375
Jan 26 08:45:18 compute-1 nova_compute[183083]: 2026-01-26 08:45:18.953 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:45:18 compute-1 nova_compute[183083]: 2026-01-26 08:45:18.956 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 08:45:18 compute-1 nova_compute[183083]: 2026-01-26 08:45:18.982 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] During sync_power_state the instance has a pending task (reboot_started). Skip.
Jan 26 08:45:18 compute-1 nova_compute[183083]: 2026-01-26 08:45:18.983 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417118.926704, 52d0b676-cf9c-4840-8b66-74ca8b13e2af => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:45:18 compute-1 nova_compute[183083]: 2026-01-26 08:45:18.983 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] VM Started (Lifecycle Event)
Jan 26 08:45:19 compute-1 nova_compute[183083]: 2026-01-26 08:45:19.012 183087 DEBUG oslo_concurrency.lockutils [None req-07aa5dc5-5a8d-4827-89b6-acbc9d2e7e24 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 8.592s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:19 compute-1 nova_compute[183083]: 2026-01-26 08:45:19.016 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:45:19 compute-1 nova_compute[183083]: 2026-01-26 08:45:19.020 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 08:45:19 compute-1 podman[212894]: 2026-01-26 08:45:19.197247014 +0000 UTC m=+0.084205932 container create 2014e17c7ed264286fc301b0e51e3bf8ea8e5da907f9c0924a9b2494c212b51a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 26 08:45:19 compute-1 podman[212894]: 2026-01-26 08:45:19.135306847 +0000 UTC m=+0.022265775 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 08:45:19 compute-1 systemd[1]: Started libpod-conmon-2014e17c7ed264286fc301b0e51e3bf8ea8e5da907f9c0924a9b2494c212b51a.scope.
Jan 26 08:45:19 compute-1 systemd[1]: Started libcrun container.
Jan 26 08:45:19 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0891875b93cad0324299f665c0f2d125b0ecf26cf71db91966141bb911ab4dea/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 08:45:19 compute-1 podman[212894]: 2026-01-26 08:45:19.289470442 +0000 UTC m=+0.176429400 container init 2014e17c7ed264286fc301b0e51e3bf8ea8e5da907f9c0924a9b2494c212b51a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 08:45:19 compute-1 podman[212894]: 2026-01-26 08:45:19.296308388 +0000 UTC m=+0.183267296 container start 2014e17c7ed264286fc301b0e51e3bf8ea8e5da907f9c0924a9b2494c212b51a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 26 08:45:19 compute-1 neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7[212910]: [NOTICE]   (212914) : New worker (212916) forked
Jan 26 08:45:19 compute-1 neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7[212910]: [NOTICE]   (212914) : Loading success.
Jan 26 08:45:19 compute-1 nova_compute[183083]: 2026-01-26 08:45:19.427 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:20 compute-1 nova_compute[183083]: 2026-01-26 08:45:20.614 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:20 compute-1 nova_compute[183083]: 2026-01-26 08:45:20.756 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:21 compute-1 podman[212927]: 2026-01-26 08:45:21.812301091 +0000 UTC m=+0.076096223 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 08:45:21 compute-1 podman[212926]: 2026-01-26 08:45:21.917963605 +0000 UTC m=+0.177401014 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 26 08:45:21 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:21.964 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:06:22 192.168.5.2 2001:5::f816:3eff:feb9:622'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.5.2/24 2001:5::f816:3eff:feb9:622/64', 'neutron:device_id': 'ovnmeta-1231cec5-dd41-4753-9cd0-6832d14fc7ea', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1231cec5-dd41-4753-9cd0-6832d14fc7ea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e482dc8c944c4dc1ba301e69d00ec101', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7d603658-0302-4651-8142-bd6373c09411, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=7febefd9-dcee-4245-a201-0c59d1ec2312) old=Port_Binding(mac=['fa:16:3e:b9:06:22 192.168.5.2'], external_ids={'neutron:cidrs': '192.168.5.2/24', 'neutron:device_id': 'ovnmeta-1231cec5-dd41-4753-9cd0-6832d14fc7ea', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1231cec5-dd41-4753-9cd0-6832d14fc7ea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e482dc8c944c4dc1ba301e69d00ec101', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:45:21 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:21.965 104632 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 7febefd9-dcee-4245-a201-0c59d1ec2312 in datapath 1231cec5-dd41-4753-9cd0-6832d14fc7ea updated
Jan 26 08:45:21 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:21.969 104632 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1231cec5-dd41-4753-9cd0-6832d14fc7ea, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 26 08:45:21 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:21.972 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[7c5a2146-9c80-4c9d-8ef1-db5c14107867]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:23 compute-1 podman[212974]: 2026-01-26 08:45:23.807745386 +0000 UTC m=+0.067783709 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 26 08:45:24 compute-1 nova_compute[183083]: 2026-01-26 08:45:24.157 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:24 compute-1 nova_compute[183083]: 2026-01-26 08:45:24.197 183087 DEBUG nova.compute.manager [req-c0b3bd9f-d748-4941-b574-0e7904bd8595 req-e415cab1-5c34-4a3f-bca0-1d83da77cf8b 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Received event network-vif-unplugged-fe874242-d6f2-4922-8e79-b6545c0e8446 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:45:24 compute-1 nova_compute[183083]: 2026-01-26 08:45:24.197 183087 DEBUG oslo_concurrency.lockutils [req-c0b3bd9f-d748-4941-b574-0e7904bd8595 req-e415cab1-5c34-4a3f-bca0-1d83da77cf8b 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:24 compute-1 nova_compute[183083]: 2026-01-26 08:45:24.198 183087 DEBUG oslo_concurrency.lockutils [req-c0b3bd9f-d748-4941-b574-0e7904bd8595 req-e415cab1-5c34-4a3f-bca0-1d83da77cf8b 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:24 compute-1 nova_compute[183083]: 2026-01-26 08:45:24.199 183087 DEBUG oslo_concurrency.lockutils [req-c0b3bd9f-d748-4941-b574-0e7904bd8595 req-e415cab1-5c34-4a3f-bca0-1d83da77cf8b 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:24 compute-1 nova_compute[183083]: 2026-01-26 08:45:24.199 183087 DEBUG nova.compute.manager [req-c0b3bd9f-d748-4941-b574-0e7904bd8595 req-e415cab1-5c34-4a3f-bca0-1d83da77cf8b 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] No waiting events found dispatching network-vif-unplugged-fe874242-d6f2-4922-8e79-b6545c0e8446 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:45:24 compute-1 nova_compute[183083]: 2026-01-26 08:45:24.200 183087 WARNING nova.compute.manager [req-c0b3bd9f-d748-4941-b574-0e7904bd8595 req-e415cab1-5c34-4a3f-bca0-1d83da77cf8b 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Received unexpected event network-vif-unplugged-fe874242-d6f2-4922-8e79-b6545c0e8446 for instance with vm_state active and task_state None.
Jan 26 08:45:24 compute-1 nova_compute[183083]: 2026-01-26 08:45:24.435 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:24 compute-1 nova_compute[183083]: 2026-01-26 08:45:24.443 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:25 compute-1 nova_compute[183083]: 2026-01-26 08:45:25.617 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:26 compute-1 nova_compute[183083]: 2026-01-26 08:45:26.353 183087 DEBUG nova.compute.manager [req-96fcd8b6-4c62-4a89-a9b6-d006b942b17f req-6b9416fd-eab5-4c86-bde9-ab716fa34415 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Received event network-vif-plugged-fe874242-d6f2-4922-8e79-b6545c0e8446 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:45:26 compute-1 nova_compute[183083]: 2026-01-26 08:45:26.355 183087 DEBUG oslo_concurrency.lockutils [req-96fcd8b6-4c62-4a89-a9b6-d006b942b17f req-6b9416fd-eab5-4c86-bde9-ab716fa34415 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:26 compute-1 nova_compute[183083]: 2026-01-26 08:45:26.356 183087 DEBUG oslo_concurrency.lockutils [req-96fcd8b6-4c62-4a89-a9b6-d006b942b17f req-6b9416fd-eab5-4c86-bde9-ab716fa34415 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:26 compute-1 nova_compute[183083]: 2026-01-26 08:45:26.357 183087 DEBUG oslo_concurrency.lockutils [req-96fcd8b6-4c62-4a89-a9b6-d006b942b17f req-6b9416fd-eab5-4c86-bde9-ab716fa34415 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:26 compute-1 nova_compute[183083]: 2026-01-26 08:45:26.357 183087 DEBUG nova.compute.manager [req-96fcd8b6-4c62-4a89-a9b6-d006b942b17f req-6b9416fd-eab5-4c86-bde9-ab716fa34415 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] No waiting events found dispatching network-vif-plugged-fe874242-d6f2-4922-8e79-b6545c0e8446 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:45:26 compute-1 nova_compute[183083]: 2026-01-26 08:45:26.358 183087 WARNING nova.compute.manager [req-96fcd8b6-4c62-4a89-a9b6-d006b942b17f req-6b9416fd-eab5-4c86-bde9-ab716fa34415 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Received unexpected event network-vif-plugged-fe874242-d6f2-4922-8e79-b6545c0e8446 for instance with vm_state active and task_state None.
Jan 26 08:45:26 compute-1 nova_compute[183083]: 2026-01-26 08:45:26.359 183087 DEBUG nova.compute.manager [req-96fcd8b6-4c62-4a89-a9b6-d006b942b17f req-6b9416fd-eab5-4c86-bde9-ab716fa34415 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Received event network-vif-plugged-fe874242-d6f2-4922-8e79-b6545c0e8446 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:45:26 compute-1 nova_compute[183083]: 2026-01-26 08:45:26.360 183087 DEBUG oslo_concurrency.lockutils [req-96fcd8b6-4c62-4a89-a9b6-d006b942b17f req-6b9416fd-eab5-4c86-bde9-ab716fa34415 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:26 compute-1 nova_compute[183083]: 2026-01-26 08:45:26.361 183087 DEBUG oslo_concurrency.lockutils [req-96fcd8b6-4c62-4a89-a9b6-d006b942b17f req-6b9416fd-eab5-4c86-bde9-ab716fa34415 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:26 compute-1 nova_compute[183083]: 2026-01-26 08:45:26.361 183087 DEBUG oslo_concurrency.lockutils [req-96fcd8b6-4c62-4a89-a9b6-d006b942b17f req-6b9416fd-eab5-4c86-bde9-ab716fa34415 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:26 compute-1 nova_compute[183083]: 2026-01-26 08:45:26.362 183087 DEBUG nova.compute.manager [req-96fcd8b6-4c62-4a89-a9b6-d006b942b17f req-6b9416fd-eab5-4c86-bde9-ab716fa34415 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] No waiting events found dispatching network-vif-plugged-fe874242-d6f2-4922-8e79-b6545c0e8446 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:45:26 compute-1 nova_compute[183083]: 2026-01-26 08:45:26.363 183087 WARNING nova.compute.manager [req-96fcd8b6-4c62-4a89-a9b6-d006b942b17f req-6b9416fd-eab5-4c86-bde9-ab716fa34415 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Received unexpected event network-vif-plugged-fe874242-d6f2-4922-8e79-b6545c0e8446 for instance with vm_state active and task_state None.
Jan 26 08:45:26 compute-1 nova_compute[183083]: 2026-01-26 08:45:26.364 183087 DEBUG nova.compute.manager [req-96fcd8b6-4c62-4a89-a9b6-d006b942b17f req-6b9416fd-eab5-4c86-bde9-ab716fa34415 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Received event network-vif-plugged-fe874242-d6f2-4922-8e79-b6545c0e8446 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:45:26 compute-1 nova_compute[183083]: 2026-01-26 08:45:26.365 183087 DEBUG oslo_concurrency.lockutils [req-96fcd8b6-4c62-4a89-a9b6-d006b942b17f req-6b9416fd-eab5-4c86-bde9-ab716fa34415 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:26 compute-1 nova_compute[183083]: 2026-01-26 08:45:26.365 183087 DEBUG oslo_concurrency.lockutils [req-96fcd8b6-4c62-4a89-a9b6-d006b942b17f req-6b9416fd-eab5-4c86-bde9-ab716fa34415 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:26 compute-1 nova_compute[183083]: 2026-01-26 08:45:26.366 183087 DEBUG oslo_concurrency.lockutils [req-96fcd8b6-4c62-4a89-a9b6-d006b942b17f req-6b9416fd-eab5-4c86-bde9-ab716fa34415 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:26 compute-1 nova_compute[183083]: 2026-01-26 08:45:26.367 183087 DEBUG nova.compute.manager [req-96fcd8b6-4c62-4a89-a9b6-d006b942b17f req-6b9416fd-eab5-4c86-bde9-ab716fa34415 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] No waiting events found dispatching network-vif-plugged-fe874242-d6f2-4922-8e79-b6545c0e8446 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:45:26 compute-1 nova_compute[183083]: 2026-01-26 08:45:26.367 183087 WARNING nova.compute.manager [req-96fcd8b6-4c62-4a89-a9b6-d006b942b17f req-6b9416fd-eab5-4c86-bde9-ab716fa34415 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Received unexpected event network-vif-plugged-fe874242-d6f2-4922-8e79-b6545c0e8446 for instance with vm_state active and task_state None.
Jan 26 08:45:27 compute-1 nova_compute[183083]: 2026-01-26 08:45:27.029 183087 DEBUG oslo_concurrency.lockutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:27 compute-1 nova_compute[183083]: 2026-01-26 08:45:27.030 183087 DEBUG oslo_concurrency.lockutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:27 compute-1 nova_compute[183083]: 2026-01-26 08:45:27.058 183087 DEBUG nova.compute.manager [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:45:27 compute-1 nova_compute[183083]: 2026-01-26 08:45:27.233 183087 DEBUG oslo_concurrency.lockutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:27 compute-1 nova_compute[183083]: 2026-01-26 08:45:27.234 183087 DEBUG oslo_concurrency.lockutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:27 compute-1 nova_compute[183083]: 2026-01-26 08:45:27.244 183087 DEBUG nova.virt.hardware [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:45:27 compute-1 nova_compute[183083]: 2026-01-26 08:45:27.245 183087 INFO nova.compute.claims [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:45:27 compute-1 nova_compute[183083]: 2026-01-26 08:45:27.402 183087 DEBUG nova.compute.provider_tree [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:45:27 compute-1 nova_compute[183083]: 2026-01-26 08:45:27.420 183087 DEBUG nova.scheduler.client.report [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:45:27 compute-1 nova_compute[183083]: 2026-01-26 08:45:27.442 183087 DEBUG oslo_concurrency.lockutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:27 compute-1 nova_compute[183083]: 2026-01-26 08:45:27.443 183087 DEBUG nova.compute.manager [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:45:27 compute-1 nova_compute[183083]: 2026-01-26 08:45:27.510 183087 DEBUG nova.compute.manager [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:45:27 compute-1 nova_compute[183083]: 2026-01-26 08:45:27.511 183087 DEBUG nova.network.neutron [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:45:27 compute-1 nova_compute[183083]: 2026-01-26 08:45:27.533 183087 INFO nova.virt.libvirt.driver [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:45:27 compute-1 nova_compute[183083]: 2026-01-26 08:45:27.551 183087 DEBUG nova.compute.manager [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:45:27 compute-1 nova_compute[183083]: 2026-01-26 08:45:27.758 183087 DEBUG nova.compute.manager [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:45:27 compute-1 nova_compute[183083]: 2026-01-26 08:45:27.760 183087 DEBUG nova.virt.libvirt.driver [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:45:27 compute-1 nova_compute[183083]: 2026-01-26 08:45:27.761 183087 INFO nova.virt.libvirt.driver [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Creating image(s)
Jan 26 08:45:27 compute-1 nova_compute[183083]: 2026-01-26 08:45:27.762 183087 DEBUG oslo_concurrency.lockutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "/var/lib/nova/instances/7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:27 compute-1 nova_compute[183083]: 2026-01-26 08:45:27.763 183087 DEBUG oslo_concurrency.lockutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "/var/lib/nova/instances/7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:27 compute-1 nova_compute[183083]: 2026-01-26 08:45:27.765 183087 DEBUG oslo_concurrency.lockutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "/var/lib/nova/instances/7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:27 compute-1 nova_compute[183083]: 2026-01-26 08:45:27.792 183087 DEBUG oslo_concurrency.processutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:45:27 compute-1 nova_compute[183083]: 2026-01-26 08:45:27.866 183087 DEBUG oslo_concurrency.processutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:45:27 compute-1 nova_compute[183083]: 2026-01-26 08:45:27.867 183087 DEBUG oslo_concurrency.lockutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:27 compute-1 nova_compute[183083]: 2026-01-26 08:45:27.868 183087 DEBUG oslo_concurrency.lockutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:27 compute-1 nova_compute[183083]: 2026-01-26 08:45:27.884 183087 DEBUG oslo_concurrency.processutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:45:27 compute-1 nova_compute[183083]: 2026-01-26 08:45:27.958 183087 DEBUG oslo_concurrency.processutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:45:27 compute-1 nova_compute[183083]: 2026-01-26 08:45:27.959 183087 DEBUG oslo_concurrency.processutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:45:28 compute-1 nova_compute[183083]: 2026-01-26 08:45:28.014 183087 DEBUG oslo_concurrency.processutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk 1073741824" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:45:28 compute-1 nova_compute[183083]: 2026-01-26 08:45:28.017 183087 DEBUG oslo_concurrency.lockutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:28 compute-1 nova_compute[183083]: 2026-01-26 08:45:28.017 183087 DEBUG oslo_concurrency.processutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:45:28 compute-1 nova_compute[183083]: 2026-01-26 08:45:28.097 183087 DEBUG oslo_concurrency.processutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:45:28 compute-1 nova_compute[183083]: 2026-01-26 08:45:28.098 183087 DEBUG nova.virt.disk.api [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Checking if we can resize image /var/lib/nova/instances/7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 26 08:45:28 compute-1 nova_compute[183083]: 2026-01-26 08:45:28.099 183087 DEBUG oslo_concurrency.processutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:45:28 compute-1 nova_compute[183083]: 2026-01-26 08:45:28.161 183087 DEBUG nova.policy [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:45:28 compute-1 nova_compute[183083]: 2026-01-26 08:45:28.182 183087 DEBUG oslo_concurrency.processutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:45:28 compute-1 nova_compute[183083]: 2026-01-26 08:45:28.183 183087 DEBUG nova.virt.disk.api [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Cannot resize image /var/lib/nova/instances/7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 26 08:45:28 compute-1 nova_compute[183083]: 2026-01-26 08:45:28.184 183087 DEBUG nova.objects.instance [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lazy-loading 'migration_context' on Instance uuid 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:45:28 compute-1 nova_compute[183083]: 2026-01-26 08:45:28.200 183087 DEBUG nova.virt.libvirt.driver [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 26 08:45:28 compute-1 nova_compute[183083]: 2026-01-26 08:45:28.200 183087 DEBUG nova.virt.libvirt.driver [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Ensure instance console log exists: /var/lib/nova/instances/7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 26 08:45:28 compute-1 nova_compute[183083]: 2026-01-26 08:45:28.201 183087 DEBUG oslo_concurrency.lockutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:28 compute-1 nova_compute[183083]: 2026-01-26 08:45:28.202 183087 DEBUG oslo_concurrency.lockutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:28 compute-1 nova_compute[183083]: 2026-01-26 08:45:28.203 183087 DEBUG oslo_concurrency.lockutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:29 compute-1 nova_compute[183083]: 2026-01-26 08:45:29.440 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:30 compute-1 nova_compute[183083]: 2026-01-26 08:45:30.036 183087 DEBUG nova.network.neutron [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Successfully updated port: 30b45c93-b3bb-44e6-8e4a-6903a631c773 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:45:30 compute-1 nova_compute[183083]: 2026-01-26 08:45:30.153 183087 DEBUG nova.compute.manager [req-242eb2b8-5c02-486a-9d3c-ec2a01655573 req-393b2718-1ee3-4c51-8b9d-51f67256f879 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Received event network-changed-30b45c93-b3bb-44e6-8e4a-6903a631c773 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:45:30 compute-1 nova_compute[183083]: 2026-01-26 08:45:30.153 183087 DEBUG nova.compute.manager [req-242eb2b8-5c02-486a-9d3c-ec2a01655573 req-393b2718-1ee3-4c51-8b9d-51f67256f879 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Refreshing instance network info cache due to event network-changed-30b45c93-b3bb-44e6-8e4a-6903a631c773. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:45:30 compute-1 nova_compute[183083]: 2026-01-26 08:45:30.154 183087 DEBUG oslo_concurrency.lockutils [req-242eb2b8-5c02-486a-9d3c-ec2a01655573 req-393b2718-1ee3-4c51-8b9d-51f67256f879 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-7fb5f66c-db87-49bd-8c08-1c21b7ea58e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:45:30 compute-1 nova_compute[183083]: 2026-01-26 08:45:30.154 183087 DEBUG oslo_concurrency.lockutils [req-242eb2b8-5c02-486a-9d3c-ec2a01655573 req-393b2718-1ee3-4c51-8b9d-51f67256f879 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-7fb5f66c-db87-49bd-8c08-1c21b7ea58e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:45:30 compute-1 nova_compute[183083]: 2026-01-26 08:45:30.154 183087 DEBUG nova.network.neutron [req-242eb2b8-5c02-486a-9d3c-ec2a01655573 req-393b2718-1ee3-4c51-8b9d-51f67256f879 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Refreshing network info cache for port 30b45c93-b3bb-44e6-8e4a-6903a631c773 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:45:30 compute-1 ovn_controller[95352]: 2026-01-26T08:45:30Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:48:29:f7 10.100.0.24
Jan 26 08:45:30 compute-1 nova_compute[183083]: 2026-01-26 08:45:30.477 183087 DEBUG nova.network.neutron [req-242eb2b8-5c02-486a-9d3c-ec2a01655573 req-393b2718-1ee3-4c51-8b9d-51f67256f879 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:45:30 compute-1 nova_compute[183083]: 2026-01-26 08:45:30.621 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:31 compute-1 nova_compute[183083]: 2026-01-26 08:45:31.027 183087 DEBUG nova.network.neutron [req-242eb2b8-5c02-486a-9d3c-ec2a01655573 req-393b2718-1ee3-4c51-8b9d-51f67256f879 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:45:31 compute-1 nova_compute[183083]: 2026-01-26 08:45:31.043 183087 DEBUG oslo_concurrency.lockutils [req-242eb2b8-5c02-486a-9d3c-ec2a01655573 req-393b2718-1ee3-4c51-8b9d-51f67256f879 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-7fb5f66c-db87-49bd-8c08-1c21b7ea58e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:45:31 compute-1 nova_compute[183083]: 2026-01-26 08:45:31.111 183087 DEBUG oslo_concurrency.lockutils [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Acquiring lock "c588f3ea-d2a2-45dc-a12b-77e812835ec3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:31 compute-1 nova_compute[183083]: 2026-01-26 08:45:31.112 183087 DEBUG oslo_concurrency.lockutils [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Lock "c588f3ea-d2a2-45dc-a12b-77e812835ec3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:31 compute-1 nova_compute[183083]: 2026-01-26 08:45:31.128 183087 DEBUG nova.compute.manager [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:45:31 compute-1 nova_compute[183083]: 2026-01-26 08:45:31.195 183087 DEBUG oslo_concurrency.lockutils [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:31 compute-1 nova_compute[183083]: 2026-01-26 08:45:31.196 183087 DEBUG oslo_concurrency.lockutils [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:31 compute-1 nova_compute[183083]: 2026-01-26 08:45:31.205 183087 DEBUG nova.virt.hardware [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:45:31 compute-1 nova_compute[183083]: 2026-01-26 08:45:31.205 183087 INFO nova.compute.claims [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:45:31 compute-1 nova_compute[183083]: 2026-01-26 08:45:31.370 183087 DEBUG nova.compute.provider_tree [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:45:31 compute-1 nova_compute[183083]: 2026-01-26 08:45:31.387 183087 DEBUG nova.scheduler.client.report [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:45:31 compute-1 nova_compute[183083]: 2026-01-26 08:45:31.419 183087 DEBUG oslo_concurrency.lockutils [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.223s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:31 compute-1 nova_compute[183083]: 2026-01-26 08:45:31.420 183087 DEBUG nova.compute.manager [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:45:31 compute-1 nova_compute[183083]: 2026-01-26 08:45:31.479 183087 DEBUG nova.compute.manager [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:45:31 compute-1 nova_compute[183083]: 2026-01-26 08:45:31.480 183087 DEBUG nova.network.neutron [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:45:31 compute-1 nova_compute[183083]: 2026-01-26 08:45:31.499 183087 INFO nova.virt.libvirt.driver [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:45:31 compute-1 nova_compute[183083]: 2026-01-26 08:45:31.518 183087 DEBUG nova.compute.manager [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:45:31 compute-1 nova_compute[183083]: 2026-01-26 08:45:31.680 183087 DEBUG nova.compute.manager [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:45:31 compute-1 nova_compute[183083]: 2026-01-26 08:45:31.682 183087 DEBUG nova.virt.libvirt.driver [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:45:31 compute-1 nova_compute[183083]: 2026-01-26 08:45:31.682 183087 INFO nova.virt.libvirt.driver [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Creating image(s)
Jan 26 08:45:31 compute-1 nova_compute[183083]: 2026-01-26 08:45:31.683 183087 DEBUG oslo_concurrency.lockutils [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Acquiring lock "/var/lib/nova/instances/c588f3ea-d2a2-45dc-a12b-77e812835ec3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:31 compute-1 nova_compute[183083]: 2026-01-26 08:45:31.683 183087 DEBUG oslo_concurrency.lockutils [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Lock "/var/lib/nova/instances/c588f3ea-d2a2-45dc-a12b-77e812835ec3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:31 compute-1 nova_compute[183083]: 2026-01-26 08:45:31.684 183087 DEBUG oslo_concurrency.lockutils [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Lock "/var/lib/nova/instances/c588f3ea-d2a2-45dc-a12b-77e812835ec3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:31 compute-1 nova_compute[183083]: 2026-01-26 08:45:31.684 183087 DEBUG oslo_concurrency.lockutils [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:31 compute-1 nova_compute[183083]: 2026-01-26 08:45:31.684 183087 DEBUG oslo_concurrency.lockutils [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:31 compute-1 nova_compute[183083]: 2026-01-26 08:45:31.689 183087 DEBUG nova.network.neutron [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Successfully updated port: 71bea873-e0d4-4d4e-b05e-ff7415434c11 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:45:31 compute-1 nova_compute[183083]: 2026-01-26 08:45:31.716 183087 DEBUG oslo_concurrency.lockutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "refresh_cache-7fb5f66c-db87-49bd-8c08-1c21b7ea58e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:45:31 compute-1 nova_compute[183083]: 2026-01-26 08:45:31.716 183087 DEBUG oslo_concurrency.lockutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquired lock "refresh_cache-7fb5f66c-db87-49bd-8c08-1c21b7ea58e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:45:31 compute-1 nova_compute[183083]: 2026-01-26 08:45:31.716 183087 DEBUG nova.network.neutron [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.014 183087 DEBUG nova.network.neutron [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.030 183087 DEBUG nova.policy [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '64bdc9f771e449a8930bafc62d000e64', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e482dc8c944c4dc1ba301e69d00ec101', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.314 183087 DEBUG nova.compute.manager [req-900ac54f-e1d4-4eb6-acd5-43fac047e94e req-b3461967-d356-41d5-a5ee-9ad3a66fd105 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Received event network-changed-71bea873-e0d4-4d4e-b05e-ff7415434c11 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.314 183087 DEBUG nova.compute.manager [req-900ac54f-e1d4-4eb6-acd5-43fac047e94e req-b3461967-d356-41d5-a5ee-9ad3a66fd105 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Refreshing instance network info cache due to event network-changed-71bea873-e0d4-4d4e-b05e-ff7415434c11. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.315 183087 DEBUG oslo_concurrency.lockutils [req-900ac54f-e1d4-4eb6-acd5-43fac047e94e req-b3461967-d356-41d5-a5ee-9ad3a66fd105 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-7fb5f66c-db87-49bd-8c08-1c21b7ea58e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.725 183087 DEBUG oslo_concurrency.lockutils [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.041s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Instance failed to spawn: nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Traceback (most recent call last):
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 170, in do_image_deep_inspection
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3]     raise exception.ImageUnacceptable(
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image content does not match disk_format
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] 
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] During handling of the above exception, another exception occurred:
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] 
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Traceback (most recent call last):
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3]     yield resources
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3]     self.driver.spawn(context, instance, image_meta,
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4384, in spawn
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3]     created_instance_dir, created_disks = self._create_image(
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4786, in _create_image
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3]     created_disks = self._create_and_inject_local_root(
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4917, in _create_and_inject_local_root
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3]     self._try_fetch_image_cache(backend, fetch_func, context,
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 10963, in _try_fetch_image_cache
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3]     image.cache(fetch_func=fetch_func,
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 288, in cache
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3]     self.create_image(fetch_func_sync, base, size,
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 663, in create_image
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3]     prepare_template(target=base, *args, **kwargs)
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3]   File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3]     return f(*args, **kwargs)
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 285, in fetch_func_sync
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3]     fetch_func(target=target, *args, **kwargs)
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/utils.py", line 482, in fetch_image
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3]     images.fetch_to_raw(context, image_id, target, trusted_certs)
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 199, in fetch_to_raw
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3]     force_format = do_image_deep_inspection(img, image_href, path_tmp)
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 180, in do_image_deep_inspection
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3]     raise exception.ImageUnacceptable(
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:45:32 compute-1 nova_compute[183083]: 2026-01-26 08:45:32.726 183087 ERROR nova.compute.manager [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] 
Jan 26 08:45:33 compute-1 nova_compute[183083]: 2026-01-26 08:45:33.320 183087 DEBUG oslo_concurrency.lockutils [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Acquiring lock "31c86c5f-fd35-45f8-af2c-165a04ff46dc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:33 compute-1 nova_compute[183083]: 2026-01-26 08:45:33.321 183087 DEBUG oslo_concurrency.lockutils [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Lock "31c86c5f-fd35-45f8-af2c-165a04ff46dc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:33 compute-1 nova_compute[183083]: 2026-01-26 08:45:33.350 183087 DEBUG nova.compute.manager [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:45:33 compute-1 nova_compute[183083]: 2026-01-26 08:45:33.450 183087 DEBUG oslo_concurrency.lockutils [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:33 compute-1 nova_compute[183083]: 2026-01-26 08:45:33.450 183087 DEBUG oslo_concurrency.lockutils [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:33 compute-1 nova_compute[183083]: 2026-01-26 08:45:33.458 183087 DEBUG nova.virt.hardware [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:45:33 compute-1 nova_compute[183083]: 2026-01-26 08:45:33.458 183087 INFO nova.compute.claims [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:45:33 compute-1 nova_compute[183083]: 2026-01-26 08:45:33.654 183087 DEBUG nova.compute.provider_tree [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:45:33 compute-1 nova_compute[183083]: 2026-01-26 08:45:33.673 183087 DEBUG nova.scheduler.client.report [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:45:33 compute-1 podman[213014]: 2026-01-26 08:45:33.797921409 +0000 UTC m=+0.067127332 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 08:45:33 compute-1 nova_compute[183083]: 2026-01-26 08:45:33.993 183087 DEBUG oslo_concurrency.lockutils [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.543s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:33 compute-1 nova_compute[183083]: 2026-01-26 08:45:33.994 183087 DEBUG nova.compute.manager [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:45:34 compute-1 nova_compute[183083]: 2026-01-26 08:45:34.066 183087 DEBUG nova.compute.manager [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:45:34 compute-1 nova_compute[183083]: 2026-01-26 08:45:34.066 183087 DEBUG nova.network.neutron [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:45:34 compute-1 nova_compute[183083]: 2026-01-26 08:45:34.093 183087 INFO nova.virt.libvirt.driver [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:45:34 compute-1 nova_compute[183083]: 2026-01-26 08:45:34.114 183087 DEBUG nova.compute.manager [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:45:34 compute-1 nova_compute[183083]: 2026-01-26 08:45:34.206 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:34 compute-1 nova_compute[183083]: 2026-01-26 08:45:34.258 183087 DEBUG nova.compute.manager [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:45:34 compute-1 nova_compute[183083]: 2026-01-26 08:45:34.260 183087 DEBUG nova.virt.libvirt.driver [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:45:34 compute-1 nova_compute[183083]: 2026-01-26 08:45:34.260 183087 INFO nova.virt.libvirt.driver [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Creating image(s)
Jan 26 08:45:34 compute-1 nova_compute[183083]: 2026-01-26 08:45:34.261 183087 DEBUG oslo_concurrency.lockutils [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Acquiring lock "/var/lib/nova/instances/31c86c5f-fd35-45f8-af2c-165a04ff46dc/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:34 compute-1 nova_compute[183083]: 2026-01-26 08:45:34.262 183087 DEBUG oslo_concurrency.lockutils [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Lock "/var/lib/nova/instances/31c86c5f-fd35-45f8-af2c-165a04ff46dc/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:34 compute-1 nova_compute[183083]: 2026-01-26 08:45:34.263 183087 DEBUG oslo_concurrency.lockutils [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Lock "/var/lib/nova/instances/31c86c5f-fd35-45f8-af2c-165a04ff46dc/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:34 compute-1 nova_compute[183083]: 2026-01-26 08:45:34.263 183087 DEBUG oslo_concurrency.lockutils [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:34 compute-1 nova_compute[183083]: 2026-01-26 08:45:34.266 183087 DEBUG oslo_concurrency.lockutils [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:34 compute-1 nova_compute[183083]: 2026-01-26 08:45:34.444 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.041 183087 DEBUG nova.policy [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '41a09f4d7f034b1c85f20c9512d33411', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '71cced1777f24868932d789154ff04a0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.197 183087 DEBUG nova.network.neutron [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Successfully updated port: eb474fb9-2733-48b3-b52a-cdbefbfaa0b4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.219 183087 DEBUG oslo_concurrency.lockutils [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Acquiring lock "refresh_cache-c588f3ea-d2a2-45dc-a12b-77e812835ec3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.219 183087 DEBUG oslo_concurrency.lockutils [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Acquired lock "refresh_cache-c588f3ea-d2a2-45dc-a12b-77e812835ec3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.220 183087 DEBUG nova.network.neutron [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 DEBUG oslo_concurrency.lockutils [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.049s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Instance failed to spawn: nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Traceback (most recent call last):
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 170, in do_image_deep_inspection
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc]     raise exception.ImageUnacceptable(
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image content does not match disk_format
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] 
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] During handling of the above exception, another exception occurred:
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] 
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Traceback (most recent call last):
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc]     yield resources
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc]     self.driver.spawn(context, instance, image_meta,
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4384, in spawn
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc]     created_instance_dir, created_disks = self._create_image(
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4786, in _create_image
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc]     created_disks = self._create_and_inject_local_root(
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4917, in _create_and_inject_local_root
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc]     self._try_fetch_image_cache(backend, fetch_func, context,
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 10963, in _try_fetch_image_cache
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc]     image.cache(fetch_func=fetch_func,
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 288, in cache
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc]     self.create_image(fetch_func_sync, base, size,
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 663, in create_image
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc]     prepare_template(target=base, *args, **kwargs)
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc]   File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc]     return f(*args, **kwargs)
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 285, in fetch_func_sync
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc]     fetch_func(target=target, *args, **kwargs)
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/utils.py", line 482, in fetch_image
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc]     images.fetch_to_raw(context, image_id, target, trusted_certs)
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 199, in fetch_to_raw
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc]     force_format = do_image_deep_inspection(img, image_href, path_tmp)
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 180, in do_image_deep_inspection
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc]     raise exception.ImageUnacceptable(
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.316 183087 ERROR nova.compute.manager [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] 
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.371 183087 DEBUG nova.network.neutron [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Updating instance_info_cache with network_info: [{"id": "30b45c93-b3bb-44e6-8e4a-6903a631c773", "address": "fa:16:3e:dd:ab:11", "network": {"id": "bad39ade-29c7-41d5-89dd-fc1845e5f3f2", "bridge": "br-int", "label": "tempest-test-network--1588377042", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.175", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30b45c93-b3", "ovs_interfaceid": "30b45c93-b3bb-44e6-8e4a-6903a631c773", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "71bea873-e0d4-4d4e-b05e-ff7415434c11", "address": "fa:16:3e:81:c4:17", "network": {"id": "410ad2c8-60c1-40d5-855c-7deeb749f0fe", "bridge": "br-int", "label": "tempest-test-network--1714243150", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "192.168.1.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": 
true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71bea873-e0", "ovs_interfaceid": "71bea873-e0d4-4d4e-b05e-ff7415434c11", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.399 183087 DEBUG oslo_concurrency.lockutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Releasing lock "refresh_cache-7fb5f66c-db87-49bd-8c08-1c21b7ea58e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.399 183087 DEBUG nova.compute.manager [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Instance network_info: |[{"id": "30b45c93-b3bb-44e6-8e4a-6903a631c773", "address": "fa:16:3e:dd:ab:11", "network": {"id": "bad39ade-29c7-41d5-89dd-fc1845e5f3f2", "bridge": "br-int", "label": "tempest-test-network--1588377042", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.175", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30b45c93-b3", "ovs_interfaceid": "30b45c93-b3bb-44e6-8e4a-6903a631c773", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "71bea873-e0d4-4d4e-b05e-ff7415434c11", "address": "fa:16:3e:81:c4:17", "network": {"id": "410ad2c8-60c1-40d5-855c-7deeb749f0fe", "bridge": "br-int", "label": "tempest-test-network--1714243150", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "192.168.1.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", 
"details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71bea873-e0", "ovs_interfaceid": "71bea873-e0d4-4d4e-b05e-ff7415434c11", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.399 183087 DEBUG oslo_concurrency.lockutils [req-900ac54f-e1d4-4eb6-acd5-43fac047e94e req-b3461967-d356-41d5-a5ee-9ad3a66fd105 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-7fb5f66c-db87-49bd-8c08-1c21b7ea58e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.400 183087 DEBUG nova.network.neutron [req-900ac54f-e1d4-4eb6-acd5-43fac047e94e req-b3461967-d356-41d5-a5ee-9ad3a66fd105 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Refreshing network info cache for port 71bea873-e0d4-4d4e-b05e-ff7415434c11 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.406 183087 DEBUG nova.virt.libvirt.driver [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Start _get_guest_xml network_info=[{"id": "30b45c93-b3bb-44e6-8e4a-6903a631c773", "address": "fa:16:3e:dd:ab:11", "network": {"id": "bad39ade-29c7-41d5-89dd-fc1845e5f3f2", "bridge": "br-int", "label": "tempest-test-network--1588377042", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.175", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30b45c93-b3", "ovs_interfaceid": "30b45c93-b3bb-44e6-8e4a-6903a631c773", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "71bea873-e0d4-4d4e-b05e-ff7415434c11", "address": "fa:16:3e:81:c4:17", "network": {"id": "410ad2c8-60c1-40d5-855c-7deeb749f0fe", "bridge": "br-int", "label": "tempest-test-network--1714243150", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "192.168.1.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, 
"type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71bea873-e0", "ovs_interfaceid": "71bea873-e0d4-4d4e-b05e-ff7415434c11", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T08:43:05Z,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aa88264c487a43b2855a58eb0dd042c9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T08:43:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encryption_options': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.415 183087 WARNING nova.virt.libvirt.driver [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.428 183087 DEBUG nova.virt.libvirt.host [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.429 183087 DEBUG nova.virt.libvirt.host [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.433 183087 DEBUG nova.virt.libvirt.host [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.434 183087 DEBUG nova.virt.libvirt.host [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.434 183087 DEBUG nova.virt.libvirt.driver [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.435 183087 DEBUG nova.virt.hardware [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T08:43:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a7017323-cd35-470d-a718-faa6c6e97277',id=2,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T08:43:05Z,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aa88264c487a43b2855a58eb0dd042c9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T08:43:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.435 183087 DEBUG nova.virt.hardware [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.435 183087 DEBUG nova.virt.hardware [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.436 183087 DEBUG nova.virt.hardware [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.436 183087 DEBUG nova.virt.hardware [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.436 183087 DEBUG nova.virt.hardware [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.437 183087 DEBUG nova.virt.hardware [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.437 183087 DEBUG nova.virt.hardware [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.437 183087 DEBUG nova.virt.hardware [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.438 183087 DEBUG nova.virt.hardware [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.438 183087 DEBUG nova.virt.hardware [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.443 183087 DEBUG nova.virt.libvirt.vif [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:45:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1154658508',display_name='tempest-server-test-1154658508',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1154658508',id=11,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBONV6jGPWIKrQk26JKZ8H9h2iNqZUCOpmao7Jq+9fEiq3iYmdJdZreCUr9V3PbbF1TPAOou07OOnfHkvbrEzcfNM5ieiMGZPqHbawuPIe3wilad9S814UZ1oxvh/DW+nZg==',key_name='tempest-keypair-test-867467517',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4a559c36b13649d98b2995c099340eb9',ramdisk_id='',reservation_id='r-lygcy6jy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='
virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortSecurityTest-508365101',owner_user_name='tempest-PortSecurityTest-508365101-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:45:27Z,user_data=None,user_id='52d582094c584036ba3e04c9da69ee02',uuid=7fb5f66c-db87-49bd-8c08-1c21b7ea58e8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "30b45c93-b3bb-44e6-8e4a-6903a631c773", "address": "fa:16:3e:dd:ab:11", "network": {"id": "bad39ade-29c7-41d5-89dd-fc1845e5f3f2", "bridge": "br-int", "label": "tempest-test-network--1588377042", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.175", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30b45c93-b3", "ovs_interfaceid": "30b45c93-b3bb-44e6-8e4a-6903a631c773", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.444 183087 DEBUG nova.network.os_vif_util [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converting VIF {"id": "30b45c93-b3bb-44e6-8e4a-6903a631c773", "address": "fa:16:3e:dd:ab:11", "network": {"id": "bad39ade-29c7-41d5-89dd-fc1845e5f3f2", "bridge": "br-int", "label": "tempest-test-network--1588377042", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.175", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30b45c93-b3", "ovs_interfaceid": "30b45c93-b3bb-44e6-8e4a-6903a631c773", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.444 183087 DEBUG nova.network.os_vif_util [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:ab:11,bridge_name='br-int',has_traffic_filtering=True,id=30b45c93-b3bb-44e6-8e4a-6903a631c773,network=Network(bad39ade-29c7-41d5-89dd-fc1845e5f3f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap30b45c93-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.446 183087 DEBUG nova.virt.libvirt.vif [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:45:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1154658508',display_name='tempest-server-test-1154658508',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1154658508',id=11,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBONV6jGPWIKrQk26JKZ8H9h2iNqZUCOpmao7Jq+9fEiq3iYmdJdZreCUr9V3PbbF1TPAOou07OOnfHkvbrEzcfNM5ieiMGZPqHbawuPIe3wilad9S814UZ1oxvh/DW+nZg==',key_name='tempest-keypair-test-867467517',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4a559c36b13649d98b2995c099340eb9',ramdisk_id='',reservation_id='r-lygcy6jy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortSecurityTest-508365101',owner_user_name='tempest-PortSecurityTest-508365101-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:45:27Z,user_data=None,user_id='52d582094c584036ba3e04c9da69ee02',uuid=7fb5f66c-db87-49bd-8c08-1c21b7ea58e8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "71bea873-e0d4-4d4e-b05e-ff7415434c11", "address": "fa:16:3e:81:c4:17", "network": {"id": "410ad2c8-60c1-40d5-855c-7deeb749f0fe", "bridge": "br-int", "label": "tempest-test-network--1714243150", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "192.168.1.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71bea873-e0", "ovs_interfaceid": "71bea873-e0d4-4d4e-b05e-ff7415434c11", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.446 183087 DEBUG nova.network.os_vif_util [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converting VIF {"id": "71bea873-e0d4-4d4e-b05e-ff7415434c11", "address": "fa:16:3e:81:c4:17", "network": {"id": "410ad2c8-60c1-40d5-855c-7deeb749f0fe", "bridge": "br-int", "label": "tempest-test-network--1714243150", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "192.168.1.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71bea873-e0", "ovs_interfaceid": "71bea873-e0d4-4d4e-b05e-ff7415434c11", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.447 183087 DEBUG nova.network.os_vif_util [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:81:c4:17,bridge_name='br-int',has_traffic_filtering=True,id=71bea873-e0d4-4d4e-b05e-ff7415434c11,network=Network(410ad2c8-60c1-40d5-855c-7deeb749f0fe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap71bea873-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.448 183087 DEBUG nova.objects.instance [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.462 183087 DEBUG nova.virt.libvirt.driver [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] End _get_guest_xml xml=<domain type="kvm">
Jan 26 08:45:35 compute-1 nova_compute[183083]:   <uuid>7fb5f66c-db87-49bd-8c08-1c21b7ea58e8</uuid>
Jan 26 08:45:35 compute-1 nova_compute[183083]:   <name>instance-0000000b</name>
Jan 26 08:45:35 compute-1 nova_compute[183083]:   <memory>131072</memory>
Jan 26 08:45:35 compute-1 nova_compute[183083]:   <vcpu>1</vcpu>
Jan 26 08:45:35 compute-1 nova_compute[183083]:   <metadata>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 08:45:35 compute-1 nova_compute[183083]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:       <nova:name>tempest-server-test-1154658508</nova:name>
Jan 26 08:45:35 compute-1 nova_compute[183083]:       <nova:creationTime>2026-01-26 08:45:35</nova:creationTime>
Jan 26 08:45:35 compute-1 nova_compute[183083]:       <nova:flavor name="m1.nano">
Jan 26 08:45:35 compute-1 nova_compute[183083]:         <nova:memory>128</nova:memory>
Jan 26 08:45:35 compute-1 nova_compute[183083]:         <nova:disk>1</nova:disk>
Jan 26 08:45:35 compute-1 nova_compute[183083]:         <nova:swap>0</nova:swap>
Jan 26 08:45:35 compute-1 nova_compute[183083]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 08:45:35 compute-1 nova_compute[183083]:         <nova:vcpus>1</nova:vcpus>
Jan 26 08:45:35 compute-1 nova_compute[183083]:       </nova:flavor>
Jan 26 08:45:35 compute-1 nova_compute[183083]:       <nova:owner>
Jan 26 08:45:35 compute-1 nova_compute[183083]:         <nova:user uuid="52d582094c584036ba3e04c9da69ee02">tempest-PortSecurityTest-508365101-project-member</nova:user>
Jan 26 08:45:35 compute-1 nova_compute[183083]:         <nova:project uuid="4a559c36b13649d98b2995c099340eb9">tempest-PortSecurityTest-508365101</nova:project>
Jan 26 08:45:35 compute-1 nova_compute[183083]:       </nova:owner>
Jan 26 08:45:35 compute-1 nova_compute[183083]:       <nova:root type="image" uuid="13d1a20a-8003-4f19-aba7-ccbd9eff9b82"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:       <nova:ports>
Jan 26 08:45:35 compute-1 nova_compute[183083]:         <nova:port uuid="30b45c93-b3bb-44e6-8e4a-6903a631c773">
Jan 26 08:45:35 compute-1 nova_compute[183083]:           <nova:ip type="fixed" address="192.168.0.175" ipVersion="4"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:         </nova:port>
Jan 26 08:45:35 compute-1 nova_compute[183083]:         <nova:port uuid="71bea873-e0d4-4d4e-b05e-ff7415434c11">
Jan 26 08:45:35 compute-1 nova_compute[183083]:           <nova:ip type="fixed" address="192.168.1.11" ipVersion="4"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:         </nova:port>
Jan 26 08:45:35 compute-1 nova_compute[183083]:       </nova:ports>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     </nova:instance>
Jan 26 08:45:35 compute-1 nova_compute[183083]:   </metadata>
Jan 26 08:45:35 compute-1 nova_compute[183083]:   <sysinfo type="smbios">
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <system>
Jan 26 08:45:35 compute-1 nova_compute[183083]:       <entry name="manufacturer">RDO</entry>
Jan 26 08:45:35 compute-1 nova_compute[183083]:       <entry name="product">OpenStack Compute</entry>
Jan 26 08:45:35 compute-1 nova_compute[183083]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 08:45:35 compute-1 nova_compute[183083]:       <entry name="serial">7fb5f66c-db87-49bd-8c08-1c21b7ea58e8</entry>
Jan 26 08:45:35 compute-1 nova_compute[183083]:       <entry name="uuid">7fb5f66c-db87-49bd-8c08-1c21b7ea58e8</entry>
Jan 26 08:45:35 compute-1 nova_compute[183083]:       <entry name="family">Virtual Machine</entry>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     </system>
Jan 26 08:45:35 compute-1 nova_compute[183083]:   </sysinfo>
Jan 26 08:45:35 compute-1 nova_compute[183083]:   <os>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <boot dev="hd"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <smbios mode="sysinfo"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:   </os>
Jan 26 08:45:35 compute-1 nova_compute[183083]:   <features>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <acpi/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <apic/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <vmcoreinfo/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:   </features>
Jan 26 08:45:35 compute-1 nova_compute[183083]:   <clock offset="utc">
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <timer name="hpet" present="no"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:   </clock>
Jan 26 08:45:35 compute-1 nova_compute[183083]:   <cpu mode="host-model" match="exact">
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:   </cpu>
Jan 26 08:45:35 compute-1 nova_compute[183083]:   <devices>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <disk type="file" device="disk">
Jan 26 08:45:35 compute-1 nova_compute[183083]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:       <target dev="vda" bus="virtio"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     </disk>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <disk type="file" device="cdrom">
Jan 26 08:45:35 compute-1 nova_compute[183083]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk.config"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:       <target dev="sda" bus="sata"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     </disk>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <interface type="ethernet">
Jan 26 08:45:35 compute-1 nova_compute[183083]:       <mac address="fa:16:3e:dd:ab:11"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:       <mtu size="1342"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:       <target dev="tap30b45c93-b3"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     </interface>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <interface type="ethernet">
Jan 26 08:45:35 compute-1 nova_compute[183083]:       <mac address="fa:16:3e:81:c4:17"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:       <mtu size="1342"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:       <target dev="tap71bea873-e0"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     </interface>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <serial type="pty">
Jan 26 08:45:35 compute-1 nova_compute[183083]:       <log file="/var/lib/nova/instances/7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/console.log" append="off"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     </serial>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <video>
Jan 26 08:45:35 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     </video>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <input type="tablet" bus="usb"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <rng model="virtio">
Jan 26 08:45:35 compute-1 nova_compute[183083]:       <backend model="random">/dev/urandom</backend>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     </rng>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <controller type="usb" index="0"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     <memballoon model="virtio">
Jan 26 08:45:35 compute-1 nova_compute[183083]:       <stats period="10"/>
Jan 26 08:45:35 compute-1 nova_compute[183083]:     </memballoon>
Jan 26 08:45:35 compute-1 nova_compute[183083]:   </devices>
Jan 26 08:45:35 compute-1 nova_compute[183083]: </domain>
Jan 26 08:45:35 compute-1 nova_compute[183083]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.463 183087 DEBUG nova.compute.manager [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Preparing to wait for external event network-vif-plugged-30b45c93-b3bb-44e6-8e4a-6903a631c773 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.463 183087 DEBUG oslo_concurrency.lockutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.463 183087 DEBUG oslo_concurrency.lockutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.464 183087 DEBUG oslo_concurrency.lockutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.464 183087 DEBUG nova.compute.manager [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Preparing to wait for external event network-vif-plugged-71bea873-e0d4-4d4e-b05e-ff7415434c11 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.464 183087 DEBUG oslo_concurrency.lockutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.464 183087 DEBUG oslo_concurrency.lockutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.465 183087 DEBUG oslo_concurrency.lockutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.466 183087 DEBUG nova.virt.libvirt.vif [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:45:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1154658508',display_name='tempest-server-test-1154658508',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1154658508',id=11,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBONV6jGPWIKrQk26JKZ8H9h2iNqZUCOpmao7Jq+9fEiq3iYmdJdZreCUr9V3PbbF1TPAOou07OOnfHkvbrEzcfNM5ieiMGZPqHbawuPIe3wilad9S814UZ1oxvh/DW+nZg==',key_name='tempest-keypair-test-867467517',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4a559c36b13649d98b2995c099340eb9',ramdisk_id='',reservation_id='r-lygcy6jy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortSecurityTest-508365101',owner_user_name='tempest-PortSecurityTest-508365101-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:45:27Z,user_data=None,user_id='52d582094c584036ba3e04c9da69ee02',uuid=7fb5f66c-db87-49bd-8c08-1c21b7ea58e8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "30b45c93-b3bb-44e6-8e4a-6903a631c773", "address": "fa:16:3e:dd:ab:11", "network": {"id": "bad39ade-29c7-41d5-89dd-fc1845e5f3f2", "bridge": "br-int", "label": "tempest-test-network--1588377042", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.175", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30b45c93-b3", "ovs_interfaceid": "30b45c93-b3bb-44e6-8e4a-6903a631c773", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.466 183087 DEBUG nova.network.os_vif_util [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converting VIF {"id": "30b45c93-b3bb-44e6-8e4a-6903a631c773", "address": "fa:16:3e:dd:ab:11", "network": {"id": "bad39ade-29c7-41d5-89dd-fc1845e5f3f2", "bridge": "br-int", "label": "tempest-test-network--1588377042", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.175", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30b45c93-b3", "ovs_interfaceid": "30b45c93-b3bb-44e6-8e4a-6903a631c773", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.467 183087 DEBUG nova.network.os_vif_util [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:ab:11,bridge_name='br-int',has_traffic_filtering=True,id=30b45c93-b3bb-44e6-8e4a-6903a631c773,network=Network(bad39ade-29c7-41d5-89dd-fc1845e5f3f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap30b45c93-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.467 183087 DEBUG os_vif [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:ab:11,bridge_name='br-int',has_traffic_filtering=True,id=30b45c93-b3bb-44e6-8e4a-6903a631c773,network=Network(bad39ade-29c7-41d5-89dd-fc1845e5f3f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap30b45c93-b3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.468 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.468 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.469 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.472 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.473 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap30b45c93-b3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.473 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap30b45c93-b3, col_values=(('external_ids', {'iface-id': '30b45c93-b3bb-44e6-8e4a-6903a631c773', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:dd:ab:11', 'vm-uuid': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.475 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.477 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 08:45:35 compute-1 NetworkManager[55451]: <info>  [1769417135.4779] manager: (tap30b45c93-b3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.487 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.488 183087 INFO os_vif [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:ab:11,bridge_name='br-int',has_traffic_filtering=True,id=30b45c93-b3bb-44e6-8e4a-6903a631c773,network=Network(bad39ade-29c7-41d5-89dd-fc1845e5f3f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap30b45c93-b3')
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.489 183087 DEBUG nova.virt.libvirt.vif [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:45:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1154658508',display_name='tempest-server-test-1154658508',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1154658508',id=11,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBONV6jGPWIKrQk26JKZ8H9h2iNqZUCOpmao7Jq+9fEiq3iYmdJdZreCUr9V3PbbF1TPAOou07OOnfHkvbrEzcfNM5ieiMGZPqHbawuPIe3wilad9S814UZ1oxvh/DW+nZg==',key_name='tempest-keypair-test-867467517',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4a559c36b13649d98b2995c099340eb9',ramdisk_id='',reservation_id='r-lygcy6jy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortSecurityTest-508365101',owner_user_name='tempest-PortSecurityTest-508365101-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:45:27Z,user_data=None,user_id='52d582094c584036ba3e04c9da69ee02',uuid=7fb5f66c-db87-49bd-8c08-1c21b7ea58e8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "71bea873-e0d4-4d4e-b05e-ff7415434c11", "address": "fa:16:3e:81:c4:17", "network": {"id": "410ad2c8-60c1-40d5-855c-7deeb749f0fe", "bridge": "br-int", "label": "tempest-test-network--1714243150", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "192.168.1.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71bea873-e0", "ovs_interfaceid": "71bea873-e0d4-4d4e-b05e-ff7415434c11", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.490 183087 DEBUG nova.network.os_vif_util [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converting VIF {"id": "71bea873-e0d4-4d4e-b05e-ff7415434c11", "address": "fa:16:3e:81:c4:17", "network": {"id": "410ad2c8-60c1-40d5-855c-7deeb749f0fe", "bridge": "br-int", "label": "tempest-test-network--1714243150", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "192.168.1.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71bea873-e0", "ovs_interfaceid": "71bea873-e0d4-4d4e-b05e-ff7415434c11", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.490 183087 DEBUG nova.network.os_vif_util [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:81:c4:17,bridge_name='br-int',has_traffic_filtering=True,id=71bea873-e0d4-4d4e-b05e-ff7415434c11,network=Network(410ad2c8-60c1-40d5-855c-7deeb749f0fe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap71bea873-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.491 183087 DEBUG os_vif [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:81:c4:17,bridge_name='br-int',has_traffic_filtering=True,id=71bea873-e0d4-4d4e-b05e-ff7415434c11,network=Network(410ad2c8-60c1-40d5-855c-7deeb749f0fe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap71bea873-e0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.492 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.492 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.492 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.494 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.495 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap71bea873-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.495 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap71bea873-e0, col_values=(('external_ids', {'iface-id': '71bea873-e0d4-4d4e-b05e-ff7415434c11', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:81:c4:17', 'vm-uuid': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.497 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:35 compute-1 NetworkManager[55451]: <info>  [1769417135.4989] manager: (tap71bea873-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.500 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.507 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.508 183087 INFO os_vif [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:81:c4:17,bridge_name='br-int',has_traffic_filtering=True,id=71bea873-e0d4-4d4e-b05e-ff7415434c11,network=Network(410ad2c8-60c1-40d5-855c-7deeb749f0fe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap71bea873-e0')
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.582 183087 DEBUG nova.virt.libvirt.driver [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.583 183087 DEBUG nova.virt.libvirt.driver [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.583 183087 DEBUG nova.virt.libvirt.driver [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] No VIF found with MAC fa:16:3e:dd:ab:11, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.584 183087 DEBUG nova.virt.libvirt.driver [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] No VIF found with MAC fa:16:3e:81:c4:17, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.584 183087 INFO nova.virt.libvirt.driver [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Using config drive
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.780 183087 DEBUG nova.network.neutron [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.830 183087 INFO nova.virt.libvirt.driver [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Creating config drive at /var/lib/nova/instances/7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk.config
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.839 183087 DEBUG oslo_concurrency.processutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg85cwds3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.937 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:35 compute-1 nova_compute[183083]: 2026-01-26 08:45:35.970 183087 DEBUG oslo_concurrency.processutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg85cwds3" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:45:36 compute-1 kernel: tap30b45c93-b3: entered promiscuous mode
Jan 26 08:45:36 compute-1 NetworkManager[55451]: <info>  [1769417136.0342] manager: (tap30b45c93-b3): new Tun device (/org/freedesktop/NetworkManager/Devices/33)
Jan 26 08:45:36 compute-1 nova_compute[183083]: 2026-01-26 08:45:36.040 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:36 compute-1 ovn_controller[95352]: 2026-01-26T08:45:36Z|00056|binding|INFO|Claiming lport 30b45c93-b3bb-44e6-8e4a-6903a631c773 for this chassis.
Jan 26 08:45:36 compute-1 ovn_controller[95352]: 2026-01-26T08:45:36Z|00057|binding|INFO|30b45c93-b3bb-44e6-8e4a-6903a631c773: Claiming fa:16:3e:dd:ab:11 192.168.0.175
Jan 26 08:45:36 compute-1 ovn_controller[95352]: 2026-01-26T08:45:36Z|00058|binding|INFO|30b45c93-b3bb-44e6-8e4a-6903a631c773: Claiming unknown
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:36.050 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:ab:11 192.168.0.175', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.175/24', 'neutron:device_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bad39ade-29c7-41d5-89dd-fc1845e5f3f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4a559c36b13649d98b2995c099340eb9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7264cb73-6ef7-4995-bc02-8c0dee738bd8, chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=30b45c93-b3bb-44e6-8e4a-6903a631c773) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:36.052 104632 INFO neutron.agent.ovn.metadata.agent [-] Port 30b45c93-b3bb-44e6-8e4a-6903a631c773 in datapath bad39ade-29c7-41d5-89dd-fc1845e5f3f2 bound to our chassis
Jan 26 08:45:36 compute-1 NetworkManager[55451]: <info>  [1769417136.0548] manager: (tap71bea873-e0): new Tun device (/org/freedesktop/NetworkManager/Devices/34)
Jan 26 08:45:36 compute-1 nova_compute[183083]: 2026-01-26 08:45:36.057 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:36 compute-1 kernel: tap71bea873-e0: entered promiscuous mode
Jan 26 08:45:36 compute-1 ovn_controller[95352]: 2026-01-26T08:45:36Z|00059|binding|INFO|Setting lport 30b45c93-b3bb-44e6-8e4a-6903a631c773 up in Southbound
Jan 26 08:45:36 compute-1 ovn_controller[95352]: 2026-01-26T08:45:36Z|00060|binding|INFO|Setting lport 30b45c93-b3bb-44e6-8e4a-6903a631c773 ovn-installed in OVS
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:36.057 104632 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bad39ade-29c7-41d5-89dd-fc1845e5f3f2
Jan 26 08:45:36 compute-1 nova_compute[183083]: 2026-01-26 08:45:36.058 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:36 compute-1 nova_compute[183083]: 2026-01-26 08:45:36.059 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:36 compute-1 ovn_controller[95352]: 2026-01-26T08:45:36Z|00061|binding|INFO|Claiming lport 71bea873-e0d4-4d4e-b05e-ff7415434c11 for this chassis.
Jan 26 08:45:36 compute-1 ovn_controller[95352]: 2026-01-26T08:45:36Z|00062|binding|INFO|71bea873-e0d4-4d4e-b05e-ff7415434c11: Claiming fa:16:3e:81:c4:17 192.168.1.11
Jan 26 08:45:36 compute-1 ovn_controller[95352]: 2026-01-26T08:45:36Z|00063|binding|INFO|71bea873-e0d4-4d4e-b05e-ff7415434c11: Claiming unknown
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:36.074 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[dbc69ccc-5176-4c91-94eb-79ffabc0ba40]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:36.075 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbad39ade-21 in ovnmeta-bad39ade-29c7-41d5-89dd-fc1845e5f3f2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 26 08:45:36 compute-1 systemd-udevd[213062]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 08:45:36 compute-1 systemd-udevd[213063]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:36.081 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:81:c4:17 192.168.1.11', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.1.11/24', 'neutron:device_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-410ad2c8-60c1-40d5-855c-7deeb749f0fe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4a559c36b13649d98b2995c099340eb9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db99b112-04a6-4be6-8e9e-7db1f7ce0209, chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=71bea873-e0d4-4d4e-b05e-ff7415434c11) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:36.079 212483 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbad39ade-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:36.079 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[65399c3a-238b-4d4c-97fb-830d3ce4247d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:36.086 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[ed941228-2c79-4752-9fc2-849c00d9df92]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:36 compute-1 NetworkManager[55451]: <info>  [1769417136.0900] device (tap71bea873-e0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 08:45:36 compute-1 NetworkManager[55451]: <info>  [1769417136.0911] device (tap30b45c93-b3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 08:45:36 compute-1 NetworkManager[55451]: <info>  [1769417136.0921] device (tap71bea873-e0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 08:45:36 compute-1 NetworkManager[55451]: <info>  [1769417136.0927] device (tap30b45c93-b3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:36.098 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[1820393c-5afc-457b-9d0b-c7430ced8b1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:36 compute-1 systemd-machined[154360]: New machine qemu-3-instance-0000000b.
Jan 26 08:45:36 compute-1 nova_compute[183083]: 2026-01-26 08:45:36.124 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:36.126 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[24bbaf48-b98b-492a-bc26-3ce213848394]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:36 compute-1 ovn_controller[95352]: 2026-01-26T08:45:36Z|00064|binding|INFO|Setting lport 71bea873-e0d4-4d4e-b05e-ff7415434c11 ovn-installed in OVS
Jan 26 08:45:36 compute-1 ovn_controller[95352]: 2026-01-26T08:45:36Z|00065|binding|INFO|Setting lport 71bea873-e0d4-4d4e-b05e-ff7415434c11 up in Southbound
Jan 26 08:45:36 compute-1 nova_compute[183083]: 2026-01-26 08:45:36.128 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:36 compute-1 systemd[1]: Started Virtual Machine qemu-3-instance-0000000b.
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:36.160 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[7e3c3c7f-8231-4694-9f9b-e7a1deea3895]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:36.167 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[a1ee1a7f-146e-4d93-8294-e8a0a629c8d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:36 compute-1 NetworkManager[55451]: <info>  [1769417136.1697] manager: (tapbad39ade-20): new Veth device (/org/freedesktop/NetworkManager/Devices/35)
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:36.212 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[b041ef00-aa50-405a-9e4a-5f3cc02c8132]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:36.214 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[0ecab548-c3d4-4bc7-b305-74fc33b4e248]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:36 compute-1 NetworkManager[55451]: <info>  [1769417136.2377] device (tapbad39ade-20): carrier: link connected
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:36.244 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[fac511ff-a1a9-4d70-b8e7-819862544fe2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:36.267 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[8d992be1-53b6-4d20-bebe-8605043e0a4f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbad39ade-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:84:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347684, 'reachable_time': 21006, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213101, 'error': None, 'target': 'ovnmeta-bad39ade-29c7-41d5-89dd-fc1845e5f3f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:36.287 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[b794c949-a3e8-4991-95e0-f25d3446d96c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0a:8447'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 347684, 'tstamp': 347684}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213102, 'error': None, 'target': 'ovnmeta-bad39ade-29c7-41d5-89dd-fc1845e5f3f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:36.310 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[2f77dabf-f3a5-4d8f-8382-d6fa82fd44c3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbad39ade-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:84:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347684, 'reachable_time': 21006, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213103, 'error': None, 'target': 'ovnmeta-bad39ade-29c7-41d5-89dd-fc1845e5f3f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:36 compute-1 nova_compute[183083]: 2026-01-26 08:45:36.344 183087 DEBUG nova.compute.manager [req-73069f2a-1aaa-44de-addb-e9bc228ca290 req-7a0301ff-7070-4d30-8307-559ffddf9ba8 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Received event network-changed-eb474fb9-2733-48b3-b52a-cdbefbfaa0b4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:45:36 compute-1 nova_compute[183083]: 2026-01-26 08:45:36.345 183087 DEBUG nova.compute.manager [req-73069f2a-1aaa-44de-addb-e9bc228ca290 req-7a0301ff-7070-4d30-8307-559ffddf9ba8 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Refreshing instance network info cache due to event network-changed-eb474fb9-2733-48b3-b52a-cdbefbfaa0b4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:45:36 compute-1 nova_compute[183083]: 2026-01-26 08:45:36.345 183087 DEBUG oslo_concurrency.lockutils [req-73069f2a-1aaa-44de-addb-e9bc228ca290 req-7a0301ff-7070-4d30-8307-559ffddf9ba8 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-c588f3ea-d2a2-45dc-a12b-77e812835ec3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:36.350 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[cdf8fe16-e884-4ccf-ac0b-2f273e776e61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:36.413 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[0ea76621-3231-4b6b-b160-903aba845fe7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:36.415 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbad39ade-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:36.415 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:36.416 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbad39ade-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:45:36 compute-1 kernel: tapbad39ade-20: entered promiscuous mode
Jan 26 08:45:36 compute-1 NetworkManager[55451]: <info>  [1769417136.4494] manager: (tapbad39ade-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Jan 26 08:45:36 compute-1 nova_compute[183083]: 2026-01-26 08:45:36.450 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:36.452 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbad39ade-20, col_values=(('external_ids', {'iface-id': '809259ab-8ea4-4909-92b4-4ee536a51482'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:45:36 compute-1 nova_compute[183083]: 2026-01-26 08:45:36.453 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:36 compute-1 ovn_controller[95352]: 2026-01-26T08:45:36Z|00066|binding|INFO|Releasing lport 809259ab-8ea4-4909-92b4-4ee536a51482 from this chassis (sb_readonly=0)
Jan 26 08:45:36 compute-1 nova_compute[183083]: 2026-01-26 08:45:36.454 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:36.455 104632 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bad39ade-29c7-41d5-89dd-fc1845e5f3f2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bad39ade-29c7-41d5-89dd-fc1845e5f3f2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:36.456 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[f6b22bab-d62a-4101-b1b2-b381210f2e88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:36.457 104632 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]: global
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]:     log         /dev/log local0 debug
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]:     log-tag     haproxy-metadata-proxy-bad39ade-29c7-41d5-89dd-fc1845e5f3f2
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]:     user        root
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]:     group       root
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]:     maxconn     1024
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]:     pidfile     /var/lib/neutron/external/pids/bad39ade-29c7-41d5-89dd-fc1845e5f3f2.pid.haproxy
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]:     daemon
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]: defaults
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]:     log global
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]:     mode http
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]:     option httplog
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]:     option dontlognull
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]:     option http-server-close
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]:     option forwardfor
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]:     retries                 3
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]:     timeout http-request    30s
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]:     timeout connect         30s
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]:     timeout client          32s
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]:     timeout server          32s
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]:     timeout http-keep-alive 30s
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]: listen listener
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]:     bind 169.254.169.254:80
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]:     http-request add-header X-OVN-Network-ID bad39ade-29c7-41d5-89dd-fc1845e5f3f2
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 26 08:45:36 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:36.458 104632 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-bad39ade-29c7-41d5-89dd-fc1845e5f3f2', 'env', 'PROCESS_TAG=haproxy-bad39ade-29c7-41d5-89dd-fc1845e5f3f2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/bad39ade-29c7-41d5-89dd-fc1845e5f3f2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 26 08:45:36 compute-1 nova_compute[183083]: 2026-01-26 08:45:36.464 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:36 compute-1 podman[213136]: 2026-01-26 08:45:36.872954844 +0000 UTC m=+0.066299480 container create 43d5fb1d3c43f170d0f894c9c62be394998e190bd972dedaad238e2ede3e095c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bad39ade-29c7-41d5-89dd-fc1845e5f3f2, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 08:45:36 compute-1 nova_compute[183083]: 2026-01-26 08:45:36.927 183087 DEBUG oslo_concurrency.lockutils [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Acquiring lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:36 compute-1 nova_compute[183083]: 2026-01-26 08:45:36.927 183087 DEBUG oslo_concurrency.lockutils [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:36 compute-1 nova_compute[183083]: 2026-01-26 08:45:36.928 183087 INFO nova.compute.manager [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Rebooting instance
Jan 26 08:45:36 compute-1 podman[213136]: 2026-01-26 08:45:36.833667082 +0000 UTC m=+0.027011718 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 08:45:36 compute-1 systemd[1]: Started libpod-conmon-43d5fb1d3c43f170d0f894c9c62be394998e190bd972dedaad238e2ede3e095c.scope.
Jan 26 08:45:36 compute-1 nova_compute[183083]: 2026-01-26 08:45:36.948 183087 DEBUG oslo_concurrency.lockutils [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Acquiring lock "refresh_cache-52d0b676-cf9c-4840-8b66-74ca8b13e2af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:45:36 compute-1 nova_compute[183083]: 2026-01-26 08:45:36.949 183087 DEBUG oslo_concurrency.lockutils [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Acquired lock "refresh_cache-52d0b676-cf9c-4840-8b66-74ca8b13e2af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:45:36 compute-1 nova_compute[183083]: 2026-01-26 08:45:36.949 183087 DEBUG nova.network.neutron [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:45:36 compute-1 systemd[1]: Started libcrun container.
Jan 26 08:45:36 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ff6c543088ae657c42fba9e9a5abc3bede918a7d8f9abb8b05ec5694f65febd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 08:45:36 compute-1 podman[213136]: 2026-01-26 08:45:36.983535975 +0000 UTC m=+0.176880651 container init 43d5fb1d3c43f170d0f894c9c62be394998e190bd972dedaad238e2ede3e095c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bad39ade-29c7-41d5-89dd-fc1845e5f3f2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 26 08:45:36 compute-1 podman[213136]: 2026-01-26 08:45:36.98954507 +0000 UTC m=+0.182889706 container start 43d5fb1d3c43f170d0f894c9c62be394998e190bd972dedaad238e2ede3e095c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bad39ade-29c7-41d5-89dd-fc1845e5f3f2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 26 08:45:37 compute-1 neutron-haproxy-ovnmeta-bad39ade-29c7-41d5-89dd-fc1845e5f3f2[213151]: [NOTICE]   (213155) : New worker (213157) forked
Jan 26 08:45:37 compute-1 neutron-haproxy-ovnmeta-bad39ade-29c7-41d5-89dd-fc1845e5f3f2[213151]: [NOTICE]   (213155) : Loading success.
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:37.063 104632 INFO neutron.agent.ovn.metadata.agent [-] Port 71bea873-e0d4-4d4e-b05e-ff7415434c11 in datapath 410ad2c8-60c1-40d5-855c-7deeb749f0fe unbound from our chassis
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:37.070 104632 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 410ad2c8-60c1-40d5-855c-7deeb749f0fe
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:37.084 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[ec98c820-8bd3-4918-a17e-f19299dd6aed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:37.085 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap410ad2c8-61 in ovnmeta-410ad2c8-60c1-40d5-855c-7deeb749f0fe namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:37.087 212483 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap410ad2c8-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:37.088 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[fc212202-654e-4803-ae75-0cafb4f240da]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:37.089 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[71726d79-9384-46ba-b3e2-3fcf76ac64e9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:37.105 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[abac814a-50e5-4798-afd1-763fb3d349d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:37.121 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[0a555ac9-d764-4ec2-ae7d-5b9260254c38]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:37 compute-1 nova_compute[183083]: 2026-01-26 08:45:37.143 183087 DEBUG nova.network.neutron [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Successfully updated port: bde3b3f7-f517-429f-a05a-4c2dd76dc0ae _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:37.165 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[a18a01c9-7aa5-434e-a8cc-62a358469f8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:37 compute-1 nova_compute[183083]: 2026-01-26 08:45:37.169 183087 DEBUG oslo_concurrency.lockutils [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Acquiring lock "refresh_cache-31c86c5f-fd35-45f8-af2c-165a04ff46dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:45:37 compute-1 nova_compute[183083]: 2026-01-26 08:45:37.170 183087 DEBUG oslo_concurrency.lockutils [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Acquired lock "refresh_cache-31c86c5f-fd35-45f8-af2c-165a04ff46dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:45:37 compute-1 nova_compute[183083]: 2026-01-26 08:45:37.170 183087 DEBUG nova.network.neutron [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:45:37 compute-1 NetworkManager[55451]: <info>  [1769417137.1749] manager: (tap410ad2c8-60): new Veth device (/org/freedesktop/NetworkManager/Devices/37)
Jan 26 08:45:37 compute-1 systemd-udevd[213087]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:37.173 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[f352c5fe-3ca3-4868-be1f-1efc36c80426]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:37.227 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[1bdff7c1-3e99-4194-b201-c46b95ba4b99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:37.232 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[63935fc4-717d-4b8b-9a5e-5d7d260eb0ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:37 compute-1 NetworkManager[55451]: <info>  [1769417137.2691] device (tap410ad2c8-60): carrier: link connected
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:37.278 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[87a9e3df-0212-45ae-af21-ed793ee50e10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:37.304 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[7a6be07a-223b-45f5-8f6b-fde88eb0749e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap410ad2c8-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b0:bf:54'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347787, 'reachable_time': 26714, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213176, 'error': None, 'target': 'ovnmeta-410ad2c8-60c1-40d5-855c-7deeb749f0fe', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:37.326 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[cb365269-3bb6-4e74-8425-7979f676e4dc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb0:bf54'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 347787, 'tstamp': 347787}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213177, 'error': None, 'target': 'ovnmeta-410ad2c8-60c1-40d5-855c-7deeb749f0fe', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:37.351 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[5392778a-f704-4405-9130-f7b08f1dac46]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap410ad2c8-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b0:bf:54'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347787, 'reachable_time': 26714, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213178, 'error': None, 'target': 'ovnmeta-410ad2c8-60c1-40d5-855c-7deeb749f0fe', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:37.397 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[33ec807e-739d-4ed5-b52c-6e35ebd28822]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:37.491 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[764bf78c-30b8-4f79-a2a4-22877ec6d743]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:37.493 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap410ad2c8-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:37.493 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:37.494 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap410ad2c8-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:45:37 compute-1 nova_compute[183083]: 2026-01-26 08:45:37.496 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:37 compute-1 NetworkManager[55451]: <info>  [1769417137.4974] manager: (tap410ad2c8-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Jan 26 08:45:37 compute-1 kernel: tap410ad2c8-60: entered promiscuous mode
Jan 26 08:45:37 compute-1 nova_compute[183083]: 2026-01-26 08:45:37.499 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:37.500 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap410ad2c8-60, col_values=(('external_ids', {'iface-id': 'bfb9744a-58f0-4145-9e28-9c13225b3407'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:45:37 compute-1 ovn_controller[95352]: 2026-01-26T08:45:37Z|00067|binding|INFO|Releasing lport bfb9744a-58f0-4145-9e28-9c13225b3407 from this chassis (sb_readonly=0)
Jan 26 08:45:37 compute-1 nova_compute[183083]: 2026-01-26 08:45:37.502 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:37 compute-1 nova_compute[183083]: 2026-01-26 08:45:37.525 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:37.527 104632 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/410ad2c8-60c1-40d5-855c-7deeb749f0fe.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/410ad2c8-60c1-40d5-855c-7deeb749f0fe.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:37.528 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[b02e095b-fa7c-4072-8806-4b6e8a90eac4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:37.529 104632 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]: global
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]:     log         /dev/log local0 debug
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]:     log-tag     haproxy-metadata-proxy-410ad2c8-60c1-40d5-855c-7deeb749f0fe
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]:     user        root
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]:     group       root
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]:     maxconn     1024
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]:     pidfile     /var/lib/neutron/external/pids/410ad2c8-60c1-40d5-855c-7deeb749f0fe.pid.haproxy
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]:     daemon
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]: defaults
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]:     log global
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]:     mode http
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]:     option httplog
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]:     option dontlognull
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]:     option http-server-close
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]:     option forwardfor
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]:     retries                 3
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]:     timeout http-request    30s
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]:     timeout connect         30s
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]:     timeout client          32s
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]:     timeout server          32s
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]:     timeout http-keep-alive 30s
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]: listen listener
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]:     bind 169.254.169.254:80
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]:     http-request add-header X-OVN-Network-ID 410ad2c8-60c1-40d5-855c-7deeb749f0fe
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 26 08:45:37 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:37.530 104632 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-410ad2c8-60c1-40d5-855c-7deeb749f0fe', 'env', 'PROCESS_TAG=haproxy-410ad2c8-60c1-40d5-855c-7deeb749f0fe', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/410ad2c8-60c1-40d5-855c-7deeb749f0fe.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 26 08:45:37 compute-1 nova_compute[183083]: 2026-01-26 08:45:37.855 183087 DEBUG nova.network.neutron [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Updating instance_info_cache with network_info: [{"id": "eb474fb9-2733-48b3-b52a-cdbefbfaa0b4", "address": "fa:16:3e:03:a4:99", "network": {"id": "1231cec5-dd41-4753-9cd0-6832d14fc7ea", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1086481355-test_extra_dhcp_opts_ipv4_ipv6_stateless", "subnets": [{"cidr": "2001:5::/64", "dns": [], "gateway": {"address": "2001:5::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:5::f816:3eff:fe03:a499", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}, {"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.212", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e482dc8c944c4dc1ba301e69d00ec101", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb474fb9-27", "ovs_interfaceid": "eb474fb9-2733-48b3-b52a-cdbefbfaa0b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:45:37 compute-1 nova_compute[183083]: 2026-01-26 08:45:37.869 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417137.8686283, 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:45:37 compute-1 nova_compute[183083]: 2026-01-26 08:45:37.869 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] VM Started (Lifecycle Event)
Jan 26 08:45:37 compute-1 nova_compute[183083]: 2026-01-26 08:45:37.881 183087 DEBUG oslo_concurrency.lockutils [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Releasing lock "refresh_cache-c588f3ea-d2a2-45dc-a12b-77e812835ec3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:45:37 compute-1 nova_compute[183083]: 2026-01-26 08:45:37.882 183087 DEBUG nova.compute.manager [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Instance network_info: |[{"id": "eb474fb9-2733-48b3-b52a-cdbefbfaa0b4", "address": "fa:16:3e:03:a4:99", "network": {"id": "1231cec5-dd41-4753-9cd0-6832d14fc7ea", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1086481355-test_extra_dhcp_opts_ipv4_ipv6_stateless", "subnets": [{"cidr": "2001:5::/64", "dns": [], "gateway": {"address": "2001:5::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:5::f816:3eff:fe03:a499", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}, {"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.212", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e482dc8c944c4dc1ba301e69d00ec101", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb474fb9-27", "ovs_interfaceid": "eb474fb9-2733-48b3-b52a-cdbefbfaa0b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:45:37 compute-1 nova_compute[183083]: 2026-01-26 08:45:37.883 183087 DEBUG oslo_concurrency.lockutils [req-73069f2a-1aaa-44de-addb-e9bc228ca290 req-7a0301ff-7070-4d30-8307-559ffddf9ba8 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-c588f3ea-d2a2-45dc-a12b-77e812835ec3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:45:37 compute-1 nova_compute[183083]: 2026-01-26 08:45:37.883 183087 DEBUG nova.network.neutron [req-73069f2a-1aaa-44de-addb-e9bc228ca290 req-7a0301ff-7070-4d30-8307-559ffddf9ba8 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Refreshing network info cache for port eb474fb9-2733-48b3-b52a-cdbefbfaa0b4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:45:37 compute-1 nova_compute[183083]: 2026-01-26 08:45:37.886 183087 INFO nova.compute.manager [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Terminating instance
Jan 26 08:45:37 compute-1 nova_compute[183083]: 2026-01-26 08:45:37.889 183087 DEBUG nova.compute.manager [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:45:37 compute-1 nova_compute[183083]: 2026-01-26 08:45:37.894 183087 DEBUG nova.virt.libvirt.driver [-] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527
Jan 26 08:45:37 compute-1 nova_compute[183083]: 2026-01-26 08:45:37.895 183087 INFO nova.virt.libvirt.driver [-] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Instance destroyed successfully.
Jan 26 08:45:37 compute-1 nova_compute[183083]: 2026-01-26 08:45:37.896 183087 DEBUG nova.virt.libvirt.vif [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:45:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ExtraDhcpOptionsTest-1086481355-test_extra_dhcp_opts_ipv4_ipv6_stateless',display_name='tempest-ExtraDhcpOptionsTest-1086481355-test_extra_dhcp_opts_ipv4_ipv6_stateless',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-extradhcpoptionstest-1086481355-test-extra-dhcp-opts-ip',id=12,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDbcjoHylAQtqg1Vl94jB8Y6+G/LV2rXvLmP1GPnqiVf11BnABlWcWIGa5wBxNzQ5L6qFsQZodIPzUEOuiX2g/H28q0JdYWcV+hRwWSFfpY7UxCYzUgdJhIE18mhMw36ag==',key_name='tempest-ExtraDhcpOptionsTest-1086481355',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e482dc8c944c4dc1ba301e69d00ec101',ramdisk_id='',reservation_id='r-0db0q5tm',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ExtraDhcpOptionsTest-992702786',owner_user_name='tempest-ExtraDhcpOptionsTest-992702786-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:45:31Z,user_data=None,user_id='64bdc9f771e449a8930bafc62d000e64',uuid=c588f3ea-d2a2-45dc-a12b-77e812835ec3,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eb474fb9-2733-48b3-b52a-cdbefbfaa0b4", "address": "fa:16:3e:03:a4:99", "network": {"id": "1231cec5-dd41-4753-9cd0-6832d14fc7ea", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1086481355-test_extra_dhcp_opts_ipv4_ipv6_stateless", "subnets": [{"cidr": "2001:5::/64", "dns": [], "gateway": {"address": "2001:5::1", "type": "gateway", 
"version": 6, "meta": {}}, "ips": [{"address": "2001:5::f816:3eff:fe03:a499", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}, {"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.212", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e482dc8c944c4dc1ba301e69d00ec101", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb474fb9-27", "ovs_interfaceid": "eb474fb9-2733-48b3-b52a-cdbefbfaa0b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:45:37 compute-1 nova_compute[183083]: 2026-01-26 08:45:37.897 183087 DEBUG nova.network.os_vif_util [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Converting VIF {"id": "eb474fb9-2733-48b3-b52a-cdbefbfaa0b4", "address": "fa:16:3e:03:a4:99", "network": {"id": "1231cec5-dd41-4753-9cd0-6832d14fc7ea", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1086481355-test_extra_dhcp_opts_ipv4_ipv6_stateless", "subnets": [{"cidr": "2001:5::/64", "dns": [], "gateway": {"address": "2001:5::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:5::f816:3eff:fe03:a499", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}, {"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.212", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e482dc8c944c4dc1ba301e69d00ec101", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb474fb9-27", "ovs_interfaceid": "eb474fb9-2733-48b3-b52a-cdbefbfaa0b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:45:37 compute-1 nova_compute[183083]: 2026-01-26 08:45:37.899 183087 DEBUG nova.network.os_vif_util [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:a4:99,bridge_name='br-int',has_traffic_filtering=True,id=eb474fb9-2733-48b3-b52a-cdbefbfaa0b4,network=Network(1231cec5-dd41-4753-9cd0-6832d14fc7ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapeb474fb9-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:45:37 compute-1 nova_compute[183083]: 2026-01-26 08:45:37.899 183087 DEBUG os_vif [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:a4:99,bridge_name='br-int',has_traffic_filtering=True,id=eb474fb9-2733-48b3-b52a-cdbefbfaa0b4,network=Network(1231cec5-dd41-4753-9cd0-6832d14fc7ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapeb474fb9-27') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:45:37 compute-1 nova_compute[183083]: 2026-01-26 08:45:37.902 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:37 compute-1 nova_compute[183083]: 2026-01-26 08:45:37.903 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeb474fb9-27, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:45:37 compute-1 nova_compute[183083]: 2026-01-26 08:45:37.903 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:45:37 compute-1 nova_compute[183083]: 2026-01-26 08:45:37.906 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:45:37 compute-1 nova_compute[183083]: 2026-01-26 08:45:37.910 183087 INFO os_vif [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:a4:99,bridge_name='br-int',has_traffic_filtering=True,id=eb474fb9-2733-48b3-b52a-cdbefbfaa0b4,network=Network(1231cec5-dd41-4753-9cd0-6832d14fc7ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapeb474fb9-27')
Jan 26 08:45:37 compute-1 nova_compute[183083]: 2026-01-26 08:45:37.911 183087 INFO nova.virt.libvirt.driver [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Deleting instance files /var/lib/nova/instances/c588f3ea-d2a2-45dc-a12b-77e812835ec3_del
Jan 26 08:45:37 compute-1 nova_compute[183083]: 2026-01-26 08:45:37.911 183087 INFO nova.virt.libvirt.driver [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Deletion of /var/lib/nova/instances/c588f3ea-d2a2-45dc-a12b-77e812835ec3_del complete
Jan 26 08:45:37 compute-1 nova_compute[183083]: 2026-01-26 08:45:37.927 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417137.868813, 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:45:37 compute-1 nova_compute[183083]: 2026-01-26 08:45:37.927 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] VM Paused (Lifecycle Event)
Jan 26 08:45:37 compute-1 nova_compute[183083]: 2026-01-26 08:45:37.971 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:45:37 compute-1 nova_compute[183083]: 2026-01-26 08:45:37.976 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 08:45:38 compute-1 podman[213219]: 2026-01-26 08:45:38.003341957 +0000 UTC m=+0.074745898 container create 9898a7d0a553e098aa126af9aa87911661805ac1072fac374f8918ebe28d79c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-410ad2c8-60c1-40d5-855c-7deeb749f0fe, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.034 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.040 183087 DEBUG nova.network.neutron [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:45:38 compute-1 systemd[1]: Started libpod-conmon-9898a7d0a553e098aa126af9aa87911661805ac1072fac374f8918ebe28d79c4.scope.
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.048 183087 INFO nova.compute.manager [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Took 0.16 seconds to destroy the instance on the hypervisor.
Jan 26 08:45:38 compute-1 podman[213219]: 2026-01-26 08:45:37.960263076 +0000 UTC m=+0.031667017 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.052 183087 DEBUG nova.compute.claims [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Aborting claim: <nova.compute.claims.Claim object at 0x7f6cc81641f0> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.053 183087 DEBUG oslo_concurrency.lockutils [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.054 183087 DEBUG oslo_concurrency.lockutils [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:38 compute-1 systemd[1]: Started libcrun container.
Jan 26 08:45:38 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7bf5f2fe0ea35302d45407473bfbbe722823e2594571d7ebf1abdfb33ec8332f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 08:45:38 compute-1 podman[213219]: 2026-01-26 08:45:38.102165874 +0000 UTC m=+0.173569855 container init 9898a7d0a553e098aa126af9aa87911661805ac1072fac374f8918ebe28d79c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-410ad2c8-60c1-40d5-855c-7deeb749f0fe, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.103 183087 DEBUG nova.network.neutron [req-900ac54f-e1d4-4eb6-acd5-43fac047e94e req-b3461967-d356-41d5-a5ee-9ad3a66fd105 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Updated VIF entry in instance network info cache for port 71bea873-e0d4-4d4e-b05e-ff7415434c11. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.104 183087 DEBUG nova.network.neutron [req-900ac54f-e1d4-4eb6-acd5-43fac047e94e req-b3461967-d356-41d5-a5ee-9ad3a66fd105 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Updating instance_info_cache with network_info: [{"id": "30b45c93-b3bb-44e6-8e4a-6903a631c773", "address": "fa:16:3e:dd:ab:11", "network": {"id": "bad39ade-29c7-41d5-89dd-fc1845e5f3f2", "bridge": "br-int", "label": "tempest-test-network--1588377042", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.175", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30b45c93-b3", "ovs_interfaceid": "30b45c93-b3bb-44e6-8e4a-6903a631c773", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "71bea873-e0d4-4d4e-b05e-ff7415434c11", "address": "fa:16:3e:81:c4:17", "network": {"id": "410ad2c8-60c1-40d5-855c-7deeb749f0fe", "bridge": "br-int", "label": "tempest-test-network--1714243150", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "192.168.1.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71bea873-e0", "ovs_interfaceid": "71bea873-e0d4-4d4e-b05e-ff7415434c11", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:45:38 compute-1 podman[213219]: 2026-01-26 08:45:38.113177318 +0000 UTC m=+0.184581259 container start 9898a7d0a553e098aa126af9aa87911661805ac1072fac374f8918ebe28d79c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-410ad2c8-60c1-40d5-855c-7deeb749f0fe, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.128 183087 DEBUG oslo_concurrency.lockutils [req-900ac54f-e1d4-4eb6-acd5-43fac047e94e req-b3461967-d356-41d5-a5ee-9ad3a66fd105 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-7fb5f66c-db87-49bd-8c08-1c21b7ea58e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:45:38 compute-1 neutron-haproxy-ovnmeta-410ad2c8-60c1-40d5-855c-7deeb749f0fe[213234]: [NOTICE]   (213238) : New worker (213240) forked
Jan 26 08:45:38 compute-1 neutron-haproxy-ovnmeta-410ad2c8-60c1-40d5-855c-7deeb749f0fe[213234]: [NOTICE]   (213238) : Loading success.
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.258 183087 DEBUG nova.compute.provider_tree [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.283 183087 DEBUG nova.scheduler.client.report [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.310 183087 DEBUG oslo_concurrency.lockutils [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.256s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.311 183087 DEBUG nova.compute.utils [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.312 183087 ERROR nova.compute.manager [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Build of instance c588f3ea-d2a2-45dc-a12b-77e812835ec3 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format: nova.exception.BuildAbortException: Build of instance c588f3ea-d2a2-45dc-a12b-77e812835ec3 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.313 183087 DEBUG nova.compute.manager [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.314 183087 DEBUG nova.virt.libvirt.vif [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-26T08:45:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ExtraDhcpOptionsTest-1086481355-test_extra_dhcp_opts_ipv4_ipv6_stateless',display_name='tempest-ExtraDhcpOptionsTest-1086481355-test_extra_dhcp_opts_ipv4_ipv6_stateless',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host=None,hostname='tempest-extradhcpoptionstest-1086481355-test-extra-dhcp-opts-ip',id=12,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDbcjoHylAQtqg1Vl94jB8Y6+G/LV2rXvLmP1GPnqiVf11BnABlWcWIGa5wBxNzQ5L6qFsQZodIPzUEOuiX2g/H28q0JdYWcV+hRwWSFfpY7UxCYzUgdJhIE18mhMw36ag==',key_name='tempest-ExtraDhcpOptionsTest-1086481355',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node=None,numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e482dc8c944c4dc1ba301e69d00ec101',ramdisk_id='',reservation_id='r-0db0q5tm',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ExtraDhcpOptionsTest-992702786',owner_user_name='tempest-ExtraDhcpOptionsTest-992702786-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:45:38Z,user_data=None,user_id='64bdc9f771e449a8930bafc62d000e64',uuid=c588f3ea-d2a2-45dc-a12b-77e812835ec3,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eb474fb9-2733-48b3-b52a-cdbefbfaa0b4", "address": "fa:16:3e:03:a4:99", "network": {"id": "1231cec5-dd41-4753-9cd0-6832d14fc7ea", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1086481355-test_extra_dhcp_opts_ipv4_ipv6_stateless", "subnets": [{"cidr": "2001:5::/64", "dns": [], "gateway": {"address": "2001:5::1", "type": "gateway", 
"version": 6, "meta": {}}, "ips": [{"address": "2001:5::f816:3eff:fe03:a499", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}, {"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.212", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e482dc8c944c4dc1ba301e69d00ec101", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb474fb9-27", "ovs_interfaceid": "eb474fb9-2733-48b3-b52a-cdbefbfaa0b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.314 183087 DEBUG nova.network.os_vif_util [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Converting VIF {"id": "eb474fb9-2733-48b3-b52a-cdbefbfaa0b4", "address": "fa:16:3e:03:a4:99", "network": {"id": "1231cec5-dd41-4753-9cd0-6832d14fc7ea", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1086481355-test_extra_dhcp_opts_ipv4_ipv6_stateless", "subnets": [{"cidr": "2001:5::/64", "dns": [], "gateway": {"address": "2001:5::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:5::f816:3eff:fe03:a499", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}, {"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.212", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e482dc8c944c4dc1ba301e69d00ec101", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb474fb9-27", "ovs_interfaceid": "eb474fb9-2733-48b3-b52a-cdbefbfaa0b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.315 183087 DEBUG nova.network.os_vif_util [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:a4:99,bridge_name='br-int',has_traffic_filtering=True,id=eb474fb9-2733-48b3-b52a-cdbefbfaa0b4,network=Network(1231cec5-dd41-4753-9cd0-6832d14fc7ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapeb474fb9-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.316 183087 DEBUG os_vif [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:a4:99,bridge_name='br-int',has_traffic_filtering=True,id=eb474fb9-2733-48b3-b52a-cdbefbfaa0b4,network=Network(1231cec5-dd41-4753-9cd0-6832d14fc7ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapeb474fb9-27') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.317 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.318 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeb474fb9-27, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.318 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.321 183087 INFO os_vif [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:a4:99,bridge_name='br-int',has_traffic_filtering=True,id=eb474fb9-2733-48b3-b52a-cdbefbfaa0b4,network=Network(1231cec5-dd41-4753-9cd0-6832d14fc7ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapeb474fb9-27')
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.322 183087 DEBUG nova.compute.manager [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.322 183087 DEBUG nova.compute.manager [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.323 183087 DEBUG nova.network.neutron [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.545 183087 DEBUG oslo_concurrency.lockutils [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Acquiring lock "1d002ced-fc02-4688-aa6b-6d11514e01ca" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.547 183087 DEBUG oslo_concurrency.lockutils [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "1d002ced-fc02-4688-aa6b-6d11514e01ca" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.598 183087 DEBUG nova.compute.manager [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.690 183087 DEBUG oslo_concurrency.lockutils [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.691 183087 DEBUG oslo_concurrency.lockutils [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.704 183087 DEBUG nova.virt.hardware [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.705 183087 INFO nova.compute.claims [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.889 183087 DEBUG nova.compute.manager [req-9a48d4f1-cf92-4e0d-a0e1-be231b159ec7 req-566eed99-c870-45d5-a123-58ffa06ef6d7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Received event network-changed-bde3b3f7-f517-429f-a05a-4c2dd76dc0ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.891 183087 DEBUG nova.compute.manager [req-9a48d4f1-cf92-4e0d-a0e1-be231b159ec7 req-566eed99-c870-45d5-a123-58ffa06ef6d7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Refreshing instance network info cache due to event network-changed-bde3b3f7-f517-429f-a05a-4c2dd76dc0ae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.892 183087 DEBUG oslo_concurrency.lockutils [req-9a48d4f1-cf92-4e0d-a0e1-be231b159ec7 req-566eed99-c870-45d5-a123-58ffa06ef6d7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-31c86c5f-fd35-45f8-af2c-165a04ff46dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:45:38 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:38.921 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:45:38 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:38.922 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.922 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.950 183087 DEBUG nova.compute.provider_tree [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:45:38 compute-1 nova_compute[183083]: 2026-01-26 08:45:38.984 183087 DEBUG nova.scheduler.client.report [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:45:39 compute-1 nova_compute[183083]: 2026-01-26 08:45:39.024 183087 DEBUG oslo_concurrency.lockutils [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.334s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:39 compute-1 nova_compute[183083]: 2026-01-26 08:45:39.026 183087 DEBUG nova.compute.manager [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:45:39 compute-1 nova_compute[183083]: 2026-01-26 08:45:39.080 183087 DEBUG nova.compute.manager [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:45:39 compute-1 nova_compute[183083]: 2026-01-26 08:45:39.081 183087 DEBUG nova.network.neutron [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:45:39 compute-1 nova_compute[183083]: 2026-01-26 08:45:39.099 183087 INFO nova.virt.libvirt.driver [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:45:39 compute-1 nova_compute[183083]: 2026-01-26 08:45:39.114 183087 DEBUG nova.compute.manager [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:45:39 compute-1 nova_compute[183083]: 2026-01-26 08:45:39.244 183087 DEBUG nova.compute.manager [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:45:39 compute-1 nova_compute[183083]: 2026-01-26 08:45:39.246 183087 DEBUG nova.virt.libvirt.driver [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:45:39 compute-1 nova_compute[183083]: 2026-01-26 08:45:39.246 183087 INFO nova.virt.libvirt.driver [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Creating image(s)
Jan 26 08:45:39 compute-1 nova_compute[183083]: 2026-01-26 08:45:39.248 183087 DEBUG oslo_concurrency.lockutils [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Acquiring lock "/var/lib/nova/instances/1d002ced-fc02-4688-aa6b-6d11514e01ca/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:39 compute-1 nova_compute[183083]: 2026-01-26 08:45:39.248 183087 DEBUG oslo_concurrency.lockutils [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "/var/lib/nova/instances/1d002ced-fc02-4688-aa6b-6d11514e01ca/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:39 compute-1 nova_compute[183083]: 2026-01-26 08:45:39.249 183087 DEBUG oslo_concurrency.lockutils [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "/var/lib/nova/instances/1d002ced-fc02-4688-aa6b-6d11514e01ca/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:39 compute-1 nova_compute[183083]: 2026-01-26 08:45:39.250 183087 DEBUG oslo_concurrency.lockutils [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:39 compute-1 nova_compute[183083]: 2026-01-26 08:45:39.251 183087 DEBUG oslo_concurrency.lockutils [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:39 compute-1 sshd-session[213249]: Connection closed by authenticating user root 159.223.236.81 port 60022 [preauth]
Jan 26 08:45:39 compute-1 nova_compute[183083]: 2026-01-26 08:45:39.446 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:39 compute-1 nova_compute[183083]: 2026-01-26 08:45:39.512 183087 DEBUG nova.policy [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a7abeebb4e4d469c91e6cee77f6be1c3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b71ae2b9d2fd454b8b3b9aa1a0e5c7e4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:45:39 compute-1 nova_compute[183083]: 2026-01-26 08:45:39.987 183087 DEBUG nova.network.neutron [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Updating instance_info_cache with network_info: [{"id": "bde3b3f7-f517-429f-a05a-4c2dd76dc0ae", "address": "fa:16:3e:8f:42:3a", "network": {"id": "f7e94474-b188-4696-a974-b6f64972f94c", "bridge": "br-int", "label": "tempest-OvnExtraDhcpOptionsTest-154566894-test_extra_dhcp_opts_disabled_dhcp6", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.166", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:2::/64", "dns": [], "gateway": {"address": "2001:2::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:2::81", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "71cced1777f24868932d789154ff04a0", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde3b3f7-f5", "ovs_interfaceid": "bde3b3f7-f517-429f-a05a-4c2dd76dc0ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.011 183087 DEBUG oslo_concurrency.lockutils [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Releasing lock "refresh_cache-31c86c5f-fd35-45f8-af2c-165a04ff46dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.011 183087 DEBUG nova.compute.manager [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Instance network_info: |[{"id": "bde3b3f7-f517-429f-a05a-4c2dd76dc0ae", "address": "fa:16:3e:8f:42:3a", "network": {"id": "f7e94474-b188-4696-a974-b6f64972f94c", "bridge": "br-int", "label": "tempest-OvnExtraDhcpOptionsTest-154566894-test_extra_dhcp_opts_disabled_dhcp6", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.166", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:2::/64", "dns": [], "gateway": {"address": "2001:2::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:2::81", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "71cced1777f24868932d789154ff04a0", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde3b3f7-f5", "ovs_interfaceid": "bde3b3f7-f517-429f-a05a-4c2dd76dc0ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.011 183087 DEBUG oslo_concurrency.lockutils [req-9a48d4f1-cf92-4e0d-a0e1-be231b159ec7 req-566eed99-c870-45d5-a123-58ffa06ef6d7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-31c86c5f-fd35-45f8-af2c-165a04ff46dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.012 183087 DEBUG nova.network.neutron [req-9a48d4f1-cf92-4e0d-a0e1-be231b159ec7 req-566eed99-c870-45d5-a123-58ffa06ef6d7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Refreshing network info cache for port bde3b3f7-f517-429f-a05a-4c2dd76dc0ae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.013 183087 INFO nova.compute.manager [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Terminating instance
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.014 183087 DEBUG nova.compute.manager [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.019 183087 DEBUG nova.virt.libvirt.driver [-] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.019 183087 INFO nova.virt.libvirt.driver [-] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Instance destroyed successfully.
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.021 183087 DEBUG nova.virt.libvirt.vif [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:45:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-OvnExtraDhcpOptionsTest-154566894-test_extra_dhcp_opts_disabled_dhcp6',display_name='tempest-OvnExtraDhcpOptionsTest-154566894-test_extra_dhcp_opts_disabled_dhcp6',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-ovnextradhcpoptionstest-154566894-test-extra-dhcp-opts',id=13,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPlGW+ytSlZhYgiyJ2VQXbYJFNaJudegvaO8arsRLbrIovURYus5bHVwpbhRN5uNXuMSQSnvVfbQff2hzPqkCc7D0963u0+MNy7sXLIRMgXOJArypDGQ7Qj3yNlotCHO0A==',key_name='tempest-OvnExtraDhcpOptionsTest-154566894',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71cced1777f24868932d789154ff04a0',ramdisk_id='',reservation_id='r-eyps23p6',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-OvnExtraDhcpOptionsTest-1195747792',owner_user_name='tempest-OvnExtraDhcpOptionsTest-1195747792-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:45:34Z,user_data=None,user_id='41a09f4d7f034b1c85f20c9512d33411',uuid=31c86c5f-fd35-45f8-af2c-165a04ff46dc,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bde3b3f7-f517-429f-a05a-4c2dd76dc0ae", "address": "fa:16:3e:8f:42:3a", "network": {"id": "f7e94474-b188-4696-a974-b6f64972f94c", "bridge": "br-int", "label": "tempest-OvnExtraDhcpOptionsTest-154566894-test_extra_dhcp_opts_disabled_dhcp6", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": 
"gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.166", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:2::/64", "dns": [], "gateway": {"address": "2001:2::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:2::81", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "71cced1777f24868932d789154ff04a0", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde3b3f7-f5", "ovs_interfaceid": "bde3b3f7-f517-429f-a05a-4c2dd76dc0ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.021 183087 DEBUG nova.network.os_vif_util [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Converting VIF {"id": "bde3b3f7-f517-429f-a05a-4c2dd76dc0ae", "address": "fa:16:3e:8f:42:3a", "network": {"id": "f7e94474-b188-4696-a974-b6f64972f94c", "bridge": "br-int", "label": "tempest-OvnExtraDhcpOptionsTest-154566894-test_extra_dhcp_opts_disabled_dhcp6", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.166", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:2::/64", "dns": [], "gateway": {"address": "2001:2::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:2::81", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "71cced1777f24868932d789154ff04a0", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde3b3f7-f5", "ovs_interfaceid": "bde3b3f7-f517-429f-a05a-4c2dd76dc0ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.022 183087 DEBUG nova.network.os_vif_util [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:42:3a,bridge_name='br-int',has_traffic_filtering=True,id=bde3b3f7-f517-429f-a05a-4c2dd76dc0ae,network=Network(f7e94474-b188-4696-a974-b6f64972f94c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbde3b3f7-f5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.023 183087 DEBUG os_vif [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:42:3a,bridge_name='br-int',has_traffic_filtering=True,id=bde3b3f7-f517-429f-a05a-4c2dd76dc0ae,network=Network(f7e94474-b188-4696-a974-b6f64972f94c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbde3b3f7-f5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.026 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.026 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbde3b3f7-f5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.026 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.033 183087 INFO os_vif [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:42:3a,bridge_name='br-int',has_traffic_filtering=True,id=bde3b3f7-f517-429f-a05a-4c2dd76dc0ae,network=Network(f7e94474-b188-4696-a974-b6f64972f94c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbde3b3f7-f5')
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.033 183087 INFO nova.virt.libvirt.driver [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Deleting instance files /var/lib/nova/instances/31c86c5f-fd35-45f8-af2c-165a04ff46dc_del
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.034 183087 INFO nova.virt.libvirt.driver [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Deletion of /var/lib/nova/instances/31c86c5f-fd35-45f8-af2c-165a04ff46dc_del complete
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.107 183087 INFO nova.compute.manager [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Took 0.09 seconds to destroy the instance on the hypervisor.
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.109 183087 DEBUG nova.compute.claims [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Aborting claim: <nova.compute.claims.Claim object at 0x7f6c98423250> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.110 183087 DEBUG oslo_concurrency.lockutils [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.110 183087 DEBUG oslo_concurrency.lockutils [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.231 183087 DEBUG nova.network.neutron [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Updating instance_info_cache with network_info: [{"id": "fe874242-d6f2-4922-8e79-b6545c0e8446", "address": "fa:16:3e:48:29:f7", "network": {"id": "fe7e4448-8407-46f2-95a0-344b2f6ecfd7", "bridge": "br-int", "label": "tempest-test-network--1557689353", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "605ef5b310d9405faa10f9c8f78d897f", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe874242-d6", "ovs_interfaceid": "fe874242-d6f2-4922-8e79-b6545c0e8446", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.248 183087 DEBUG oslo_concurrency.lockutils [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Releasing lock "refresh_cache-52d0b676-cf9c-4840-8b66-74ca8b13e2af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.250 183087 DEBUG nova.compute.manager [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.353 183087 DEBUG nova.compute.provider_tree [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.384 183087 DEBUG nova.scheduler.client.report [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:45:40 compute-1 kernel: tapfe874242-d6 (unregistering): left promiscuous mode
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.414 183087 DEBUG oslo_concurrency.lockutils [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.303s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.416 183087 DEBUG nova.compute.utils [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.417 183087 ERROR nova.compute.manager [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Build of instance 31c86c5f-fd35-45f8-af2c-165a04ff46dc aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format: nova.exception.BuildAbortException: Build of instance 31c86c5f-fd35-45f8-af2c-165a04ff46dc aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.418 183087 DEBUG nova.compute.manager [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.419 183087 DEBUG nova.virt.libvirt.vif [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-26T08:45:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-OvnExtraDhcpOptionsTest-154566894-test_extra_dhcp_opts_disabled_dhcp6',display_name='tempest-OvnExtraDhcpOptionsTest-154566894-test_extra_dhcp_opts_disabled_dhcp6',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host=None,hostname='tempest-ovnextradhcpoptionstest-154566894-test-extra-dhcp-opts',id=13,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPlGW+ytSlZhYgiyJ2VQXbYJFNaJudegvaO8arsRLbrIovURYus5bHVwpbhRN5uNXuMSQSnvVfbQff2hzPqkCc7D0963u0+MNy7sXLIRMgXOJArypDGQ7Qj3yNlotCHO0A==',key_name='tempest-OvnExtraDhcpOptionsTest-154566894',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node=None,numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71cced1777f24868932d789154ff04a0',ramdisk_id='',reservation_id='r-eyps23p6',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-OvnExtraDhcpOptionsTest-1195747792',owner_user_name='tempest-OvnExtraDhcpOptionsTest-1195747792-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:45:40Z,user_data=None,user_id='41a09f4d7f034b1c85f20c9512d33411',uuid=31c86c5f-fd35-45f8-af2c-165a04ff46dc,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bde3b3f7-f517-429f-a05a-4c2dd76dc0ae", "address": "fa:16:3e:8f:42:3a", "network": {"id": "f7e94474-b188-4696-a974-b6f64972f94c", "bridge": "br-int", "label": "tempest-OvnExtraDhcpOptionsTest-154566894-test_extra_dhcp_opts_disabled_dhcp6", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": 
"gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.166", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:2::/64", "dns": [], "gateway": {"address": "2001:2::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:2::81", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "71cced1777f24868932d789154ff04a0", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde3b3f7-f5", "ovs_interfaceid": "bde3b3f7-f517-429f-a05a-4c2dd76dc0ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:45:40 compute-1 NetworkManager[55451]: <info>  [1769417140.4218] device (tapfe874242-d6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.421 183087 DEBUG nova.network.os_vif_util [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Converting VIF {"id": "bde3b3f7-f517-429f-a05a-4c2dd76dc0ae", "address": "fa:16:3e:8f:42:3a", "network": {"id": "f7e94474-b188-4696-a974-b6f64972f94c", "bridge": "br-int", "label": "tempest-OvnExtraDhcpOptionsTest-154566894-test_extra_dhcp_opts_disabled_dhcp6", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.166", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:2::/64", "dns": [], "gateway": {"address": "2001:2::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:2::81", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "71cced1777f24868932d789154ff04a0", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde3b3f7-f5", "ovs_interfaceid": "bde3b3f7-f517-429f-a05a-4c2dd76dc0ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.424 183087 DEBUG nova.network.os_vif_util [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:42:3a,bridge_name='br-int',has_traffic_filtering=True,id=bde3b3f7-f517-429f-a05a-4c2dd76dc0ae,network=Network(f7e94474-b188-4696-a974-b6f64972f94c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbde3b3f7-f5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.424 183087 DEBUG os_vif [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:42:3a,bridge_name='br-int',has_traffic_filtering=True,id=bde3b3f7-f517-429f-a05a-4c2dd76dc0ae,network=Network(f7e94474-b188-4696-a974-b6f64972f94c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbde3b3f7-f5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.426 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.428 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbde3b3f7-f5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.428 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.429 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.431 183087 INFO os_vif [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:42:3a,bridge_name='br-int',has_traffic_filtering=True,id=bde3b3f7-f517-429f-a05a-4c2dd76dc0ae,network=Network(f7e94474-b188-4696-a974-b6f64972f94c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbde3b3f7-f5')
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.432 183087 DEBUG nova.compute.manager [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.432 183087 DEBUG nova.compute.manager [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.432 183087 DEBUG nova.network.neutron [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.439 183087 DEBUG oslo_concurrency.lockutils [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Instance failed to spawn: nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Traceback (most recent call last):
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 170, in do_image_deep_inspection
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca]     raise exception.ImageUnacceptable(
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image content does not match disk_format
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] 
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] During handling of the above exception, another exception occurred:
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] 
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Traceback (most recent call last):
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca]     yield resources
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca]     self.driver.spawn(context, instance, image_meta,
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4384, in spawn
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca]     created_instance_dir, created_disks = self._create_image(
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4786, in _create_image
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca]     created_disks = self._create_and_inject_local_root(
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4917, in _create_and_inject_local_root
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca]     self._try_fetch_image_cache(backend, fetch_func, context,
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 10963, in _try_fetch_image_cache
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca]     image.cache(fetch_func=fetch_func,
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 288, in cache
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca]     self.create_image(fetch_func_sync, base, size,
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 663, in create_image
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca]     prepare_template(target=base, *args, **kwargs)
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca]   File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca]     return f(*args, **kwargs)
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 285, in fetch_func_sync
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca]     fetch_func(target=target, *args, **kwargs)
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/utils.py", line 482, in fetch_image
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca]     images.fetch_to_raw(context, image_id, target, trusted_certs)
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 199, in fetch_to_raw
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca]     force_format = do_image_deep_inspection(img, image_href, path_tmp)
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 180, in do_image_deep_inspection
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca]     raise exception.ImageUnacceptable(
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.440 183087 ERROR nova.compute.manager [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] 
Jan 26 08:45:40 compute-1 ovn_controller[95352]: 2026-01-26T08:45:40Z|00068|binding|INFO|Releasing lport fe874242-d6f2-4922-8e79-b6545c0e8446 from this chassis (sb_readonly=0)
Jan 26 08:45:40 compute-1 ovn_controller[95352]: 2026-01-26T08:45:40Z|00069|binding|INFO|Setting lport fe874242-d6f2-4922-8e79-b6545c0e8446 down in Southbound
Jan 26 08:45:40 compute-1 ovn_controller[95352]: 2026-01-26T08:45:40Z|00070|binding|INFO|Removing iface tapfe874242-d6 ovn-installed in OVS
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.465 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.470 183087 DEBUG nova.network.neutron [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:45:40 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:40.474 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:48:29:f7 10.100.0.24'], port_security=['fa:16:3e:48:29:f7 10.100.0.24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.24/28', 'neutron:device_id': '52d0b676-cf9c-4840-8b66-74ca8b13e2af', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fe7e4448-8407-46f2-95a0-344b2f6ecfd7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '605ef5b310d9405faa10f9c8f78d897f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '18ebfc50-0648-455b-8cfa-279844a56dcf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.173'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=70edddae-db7d-4a4d-8983-c3a06c961ec1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=fe874242-d6f2-4922-8e79-b6545c0e8446) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:45:40 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:40.476 104632 INFO neutron.agent.ovn.metadata.agent [-] Port fe874242-d6f2-4922-8e79-b6545c0e8446 in datapath fe7e4448-8407-46f2-95a0-344b2f6ecfd7 unbound from our chassis
Jan 26 08:45:40 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:40.480 104632 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fe7e4448-8407-46f2-95a0-344b2f6ecfd7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 26 08:45:40 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:40.481 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[969cf16e-205a-4311-95aa-4d43170ec32b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:40 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:40.482 104632 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7 namespace which is not needed anymore
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.496 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.503 183087 INFO nova.compute.manager [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Took 2.18 seconds to deallocate network for instance.
Jan 26 08:45:40 compute-1 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000009.scope: Deactivated successfully.
Jan 26 08:45:40 compute-1 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000009.scope: Consumed 12.374s CPU time.
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.527 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:40 compute-1 systemd-machined[154360]: Machine qemu-2-instance-00000009 terminated.
Jan 26 08:45:40 compute-1 neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7[212910]: [NOTICE]   (212914) : haproxy version is 2.8.14-c23fe91
Jan 26 08:45:40 compute-1 neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7[212910]: [NOTICE]   (212914) : path to executable is /usr/sbin/haproxy
Jan 26 08:45:40 compute-1 neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7[212910]: [WARNING]  (212914) : Exiting Master process...
Jan 26 08:45:40 compute-1 neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7[212910]: [WARNING]  (212914) : Exiting Master process...
Jan 26 08:45:40 compute-1 neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7[212910]: [ALERT]    (212914) : Current worker (212916) exited with code 143 (Terminated)
Jan 26 08:45:40 compute-1 neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7[212910]: [WARNING]  (212914) : All workers exited. Exiting... (0)
Jan 26 08:45:40 compute-1 systemd[1]: libpod-2014e17c7ed264286fc301b0e51e3bf8ea8e5da907f9c0924a9b2494c212b51a.scope: Deactivated successfully.
Jan 26 08:45:40 compute-1 podman[213275]: 2026-01-26 08:45:40.659643918 +0000 UTC m=+0.067303597 container died 2014e17c7ed264286fc301b0e51e3bf8ea8e5da907f9c0924a9b2494c212b51a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.669 183087 INFO nova.virt.libvirt.driver [-] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Instance destroyed successfully.
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.670 183087 DEBUG nova.objects.instance [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Lazy-loading 'resources' on Instance uuid 52d0b676-cf9c-4840-8b66-74ca8b13e2af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.687 183087 INFO nova.scheduler.client.report [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Deleted allocations for instance c588f3ea-d2a2-45dc-a12b-77e812835ec3
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.688 183087 DEBUG oslo_concurrency.lockutils [None req-f4e1af33-7ac5-42ae-aec4-499145896425 64bdc9f771e449a8930bafc62d000e64 e482dc8c944c4dc1ba301e69d00ec101 - - default default] Lock "c588f3ea-d2a2-45dc-a12b-77e812835ec3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.576s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:40 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2014e17c7ed264286fc301b0e51e3bf8ea8e5da907f9c0924a9b2494c212b51a-userdata-shm.mount: Deactivated successfully.
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.689 183087 DEBUG nova.virt.libvirt.vif [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T08:44:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1487231681',display_name='tempest-server-test-1487231681',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1487231681',id=9,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKwpz9wNHHnbJAOOiUppbeyhm9nehnF1Htd8OXU0NdYnRfrosih4iG9UOQxrdGQsm8346olWW9k6G/UQO8gqKcXblbCmZkMf68uZMMkWh2h2m7GZ6H2smfZpLCUdxJAWPg==',key_name='tempest-keypair-test-84578535',keypairs=<?>,launch_index=0,launched_at=2026-01-26T08:44:52Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='605ef5b310d9405faa10f9c8f78d897f',ramdisk_id='',reservation_id='r-e8sz2804',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-NetworkBasicTest-1834146433',owner_user_name='tempest-NetworkBasicTest-1834146433-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T08:45:40Z,user_data=None,user_id='4ba65f88f5c349ff8443da8191c3da2b',uuid=52d0b676-cf9c-4840-8b66-74ca8b13e2af,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fe874242-d6f2-4922-8e79-b6545c0e8446", "address": "fa:16:3e:48:29:f7", "network": {"id": "fe7e4448-8407-46f2-95a0-344b2f6ecfd7", "bridge": "br-int", "label": "tempest-test-network--1557689353", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "605ef5b310d9405faa10f9c8f78d897f", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe874242-d6", "ovs_interfaceid": "fe874242-d6f2-4922-8e79-b6545c0e8446", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.690 183087 DEBUG nova.network.os_vif_util [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Converting VIF {"id": "fe874242-d6f2-4922-8e79-b6545c0e8446", "address": "fa:16:3e:48:29:f7", "network": {"id": "fe7e4448-8407-46f2-95a0-344b2f6ecfd7", "bridge": "br-int", "label": "tempest-test-network--1557689353", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "605ef5b310d9405faa10f9c8f78d897f", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe874242-d6", "ovs_interfaceid": "fe874242-d6f2-4922-8e79-b6545c0e8446", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.690 183087 DEBUG nova.network.os_vif_util [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:48:29:f7,bridge_name='br-int',has_traffic_filtering=True,id=fe874242-d6f2-4922-8e79-b6545c0e8446,network=Network(fe7e4448-8407-46f2-95a0-344b2f6ecfd7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe874242-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.691 183087 DEBUG os_vif [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:48:29:f7,bridge_name='br-int',has_traffic_filtering=True,id=fe874242-d6f2-4922-8e79-b6545c0e8446,network=Network(fe7e4448-8407-46f2-95a0-344b2f6ecfd7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe874242-d6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:45:40 compute-1 systemd[1]: var-lib-containers-storage-overlay-0891875b93cad0324299f665c0f2d125b0ecf26cf71db91966141bb911ab4dea-merged.mount: Deactivated successfully.
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.692 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.692 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe874242-d6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.694 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.697 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 08:45:40 compute-1 podman[213275]: 2026-01-26 08:45:40.69851489 +0000 UTC m=+0.106174569 container cleanup 2014e17c7ed264286fc301b0e51e3bf8ea8e5da907f9c0924a9b2494c212b51a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.699 183087 INFO os_vif [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:48:29:f7,bridge_name='br-int',has_traffic_filtering=True,id=fe874242-d6f2-4922-8e79-b6545c0e8446,network=Network(fe7e4448-8407-46f2-95a0-344b2f6ecfd7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe874242-d6')
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.701 183087 DEBUG nova.virt.libvirt.host [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.701 183087 INFO nova.virt.libvirt.host [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] UEFI support detected
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.707 183087 DEBUG nova.virt.libvirt.driver [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Start _get_guest_xml network_info=[{"id": "fe874242-d6f2-4922-8e79-b6545c0e8446", "address": "fa:16:3e:48:29:f7", "network": {"id": "fe7e4448-8407-46f2-95a0-344b2f6ecfd7", "bridge": "br-int", "label": "tempest-test-network--1557689353", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "605ef5b310d9405faa10f9c8f78d897f", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe874242-d6", "ovs_interfaceid": "fe874242-d6f2-4922-8e79-b6545c0e8446", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encryption_options': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 26 08:45:40 compute-1 systemd[1]: libpod-conmon-2014e17c7ed264286fc301b0e51e3bf8ea8e5da907f9c0924a9b2494c212b51a.scope: Deactivated successfully.
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.712 183087 WARNING nova.virt.libvirt.driver [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.716 183087 DEBUG nova.virt.libvirt.host [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.717 183087 DEBUG nova.virt.libvirt.host [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.719 183087 DEBUG nova.virt.libvirt.host [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.720 183087 DEBUG nova.virt.libvirt.host [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.720 183087 DEBUG nova.virt.libvirt.driver [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.721 183087 DEBUG nova.virt.hardware [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T08:43:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a7017323-cd35-470d-a718-faa6c6e97277',id=2,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.721 183087 DEBUG nova.virt.hardware [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.722 183087 DEBUG nova.virt.hardware [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.722 183087 DEBUG nova.virt.hardware [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.723 183087 DEBUG nova.virt.hardware [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.723 183087 DEBUG nova.virt.hardware [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.723 183087 DEBUG nova.virt.hardware [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.724 183087 DEBUG nova.virt.hardware [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.724 183087 DEBUG nova.virt.hardware [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.724 183087 DEBUG nova.virt.hardware [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.725 183087 DEBUG nova.virt.hardware [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.725 183087 DEBUG nova.objects.instance [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Lazy-loading 'vcpu_model' on Instance uuid 52d0b676-cf9c-4840-8b66-74ca8b13e2af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.740 183087 DEBUG oslo_concurrency.processutils [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:45:40 compute-1 podman[213320]: 2026-01-26 08:45:40.786919839 +0000 UTC m=+0.057521934 container remove 2014e17c7ed264286fc301b0e51e3bf8ea8e5da907f9c0924a9b2494c212b51a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 26 08:45:40 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:40.794 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[834c4681-8789-43a7-b12e-14a5466e18ea]: (4, ('Mon Jan 26 08:45:40 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7 (2014e17c7ed264286fc301b0e51e3bf8ea8e5da907f9c0924a9b2494c212b51a)\n2014e17c7ed264286fc301b0e51e3bf8ea8e5da907f9c0924a9b2494c212b51a\nMon Jan 26 08:45:40 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7 (2014e17c7ed264286fc301b0e51e3bf8ea8e5da907f9c0924a9b2494c212b51a)\n2014e17c7ed264286fc301b0e51e3bf8ea8e5da907f9c0924a9b2494c212b51a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:40 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:40.796 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[1d1260c4-d90c-40f6-a8a7-1c2d0e9a01bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:40 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:40.797 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe7e4448-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.821 183087 DEBUG oslo_concurrency.processutils [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/disk.config --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.822 183087 DEBUG oslo_concurrency.lockutils [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Acquiring lock "/var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.823 183087 DEBUG oslo_concurrency.lockutils [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Lock "/var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.824 183087 DEBUG oslo_concurrency.lockutils [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Lock "/var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.826 183087 DEBUG nova.virt.libvirt.vif [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T08:44:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1487231681',display_name='tempest-server-test-1487231681',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1487231681',id=9,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKwpz9wNHHnbJAOOiUppbeyhm9nehnF1Htd8OXU0NdYnRfrosih4iG9UOQxrdGQsm8346olWW9k6G/UQO8gqKcXblbCmZkMf68uZMMkWh2h2m7GZ6H2smfZpLCUdxJAWPg==',key_name='tempest-keypair-test-84578535',keypairs=<?>,launch_index=0,launched_at=2026-01-26T08:44:52Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='605ef5b310d9405faa10f9c8f78d897f',ramdisk_id='',reservation_id='r-e8sz2804',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_
hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-NetworkBasicTest-1834146433',owner_user_name='tempest-NetworkBasicTest-1834146433-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T08:45:40Z,user_data=None,user_id='4ba65f88f5c349ff8443da8191c3da2b',uuid=52d0b676-cf9c-4840-8b66-74ca8b13e2af,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fe874242-d6f2-4922-8e79-b6545c0e8446", "address": "fa:16:3e:48:29:f7", "network": {"id": "fe7e4448-8407-46f2-95a0-344b2f6ecfd7", "bridge": "br-int", "label": "tempest-test-network--1557689353", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "605ef5b310d9405faa10f9c8f78d897f", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe874242-d6", "ovs_interfaceid": "fe874242-d6f2-4922-8e79-b6545c0e8446", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.826 183087 DEBUG nova.network.os_vif_util [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Converting VIF {"id": "fe874242-d6f2-4922-8e79-b6545c0e8446", "address": "fa:16:3e:48:29:f7", "network": {"id": "fe7e4448-8407-46f2-95a0-344b2f6ecfd7", "bridge": "br-int", "label": "tempest-test-network--1557689353", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "605ef5b310d9405faa10f9c8f78d897f", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe874242-d6", "ovs_interfaceid": "fe874242-d6f2-4922-8e79-b6545c0e8446", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.828 183087 DEBUG nova.network.os_vif_util [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:48:29:f7,bridge_name='br-int',has_traffic_filtering=True,id=fe874242-d6f2-4922-8e79-b6545c0e8446,network=Network(fe7e4448-8407-46f2-95a0-344b2f6ecfd7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe874242-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.829 183087 DEBUG nova.objects.instance [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Lazy-loading 'pci_devices' on Instance uuid 52d0b676-cf9c-4840-8b66-74ca8b13e2af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:45:40 compute-1 kernel: tapfe7e4448-80: left promiscuous mode
Jan 26 08:45:40 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:40.854 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[2707f7b9-9eb8-4020-8b76-746729f0f37f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.854 183087 DEBUG nova.virt.libvirt.driver [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] End _get_guest_xml xml=<domain type="kvm">
Jan 26 08:45:40 compute-1 nova_compute[183083]:   <uuid>52d0b676-cf9c-4840-8b66-74ca8b13e2af</uuid>
Jan 26 08:45:40 compute-1 nova_compute[183083]:   <name>instance-00000009</name>
Jan 26 08:45:40 compute-1 nova_compute[183083]:   <memory>131072</memory>
Jan 26 08:45:40 compute-1 nova_compute[183083]:   <vcpu>1</vcpu>
Jan 26 08:45:40 compute-1 nova_compute[183083]:   <metadata>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 08:45:40 compute-1 nova_compute[183083]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:       <nova:name>tempest-server-test-1487231681</nova:name>
Jan 26 08:45:40 compute-1 nova_compute[183083]:       <nova:creationTime>2026-01-26 08:45:40</nova:creationTime>
Jan 26 08:45:40 compute-1 nova_compute[183083]:       <nova:flavor name="m1.nano">
Jan 26 08:45:40 compute-1 nova_compute[183083]:         <nova:memory>128</nova:memory>
Jan 26 08:45:40 compute-1 nova_compute[183083]:         <nova:disk>1</nova:disk>
Jan 26 08:45:40 compute-1 nova_compute[183083]:         <nova:swap>0</nova:swap>
Jan 26 08:45:40 compute-1 nova_compute[183083]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 08:45:40 compute-1 nova_compute[183083]:         <nova:vcpus>1</nova:vcpus>
Jan 26 08:45:40 compute-1 nova_compute[183083]:       </nova:flavor>
Jan 26 08:45:40 compute-1 nova_compute[183083]:       <nova:owner>
Jan 26 08:45:40 compute-1 nova_compute[183083]:         <nova:user uuid="4ba65f88f5c349ff8443da8191c3da2b">tempest-NetworkBasicTest-1834146433-project-member</nova:user>
Jan 26 08:45:40 compute-1 nova_compute[183083]:         <nova:project uuid="605ef5b310d9405faa10f9c8f78d897f">tempest-NetworkBasicTest-1834146433</nova:project>
Jan 26 08:45:40 compute-1 nova_compute[183083]:       </nova:owner>
Jan 26 08:45:40 compute-1 nova_compute[183083]:       <nova:root type="image" uuid="13d1a20a-8003-4f19-aba7-ccbd9eff9b82"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:       <nova:ports>
Jan 26 08:45:40 compute-1 nova_compute[183083]:         <nova:port uuid="fe874242-d6f2-4922-8e79-b6545c0e8446">
Jan 26 08:45:40 compute-1 nova_compute[183083]:           <nova:ip type="fixed" address="10.100.0.24" ipVersion="4"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:         </nova:port>
Jan 26 08:45:40 compute-1 nova_compute[183083]:       </nova:ports>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     </nova:instance>
Jan 26 08:45:40 compute-1 nova_compute[183083]:   </metadata>
Jan 26 08:45:40 compute-1 nova_compute[183083]:   <sysinfo type="smbios">
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <system>
Jan 26 08:45:40 compute-1 nova_compute[183083]:       <entry name="manufacturer">RDO</entry>
Jan 26 08:45:40 compute-1 nova_compute[183083]:       <entry name="product">OpenStack Compute</entry>
Jan 26 08:45:40 compute-1 nova_compute[183083]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 08:45:40 compute-1 nova_compute[183083]:       <entry name="serial">52d0b676-cf9c-4840-8b66-74ca8b13e2af</entry>
Jan 26 08:45:40 compute-1 nova_compute[183083]:       <entry name="uuid">52d0b676-cf9c-4840-8b66-74ca8b13e2af</entry>
Jan 26 08:45:40 compute-1 nova_compute[183083]:       <entry name="family">Virtual Machine</entry>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     </system>
Jan 26 08:45:40 compute-1 nova_compute[183083]:   </sysinfo>
Jan 26 08:45:40 compute-1 nova_compute[183083]:   <os>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <boot dev="hd"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <smbios mode="sysinfo"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:   </os>
Jan 26 08:45:40 compute-1 nova_compute[183083]:   <features>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <acpi/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <apic/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <vmcoreinfo/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:   </features>
Jan 26 08:45:40 compute-1 nova_compute[183083]:   <clock offset="utc">
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <timer name="hpet" present="no"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:   </clock>
Jan 26 08:45:40 compute-1 nova_compute[183083]:   <cpu mode="host-model" match="exact">
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:   </cpu>
Jan 26 08:45:40 compute-1 nova_compute[183083]:   <devices>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <disk type="file" device="disk">
Jan 26 08:45:40 compute-1 nova_compute[183083]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/disk"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:       <target dev="vda" bus="virtio"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     </disk>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <disk type="file" device="cdrom">
Jan 26 08:45:40 compute-1 nova_compute[183083]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/disk.config"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:       <target dev="sda" bus="sata"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     </disk>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <interface type="ethernet">
Jan 26 08:45:40 compute-1 nova_compute[183083]:       <mac address="fa:16:3e:48:29:f7"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:       <mtu size="1342"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:       <target dev="tapfe874242-d6"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     </interface>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <serial type="pty">
Jan 26 08:45:40 compute-1 nova_compute[183083]:       <log file="/var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/console.log" append="off"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     </serial>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <video>
Jan 26 08:45:40 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     </video>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <input type="tablet" bus="usb"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <input type="keyboard" bus="usb"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <rng model="virtio">
Jan 26 08:45:40 compute-1 nova_compute[183083]:       <backend model="random">/dev/urandom</backend>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     </rng>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <controller type="usb" index="0"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     <memballoon model="virtio">
Jan 26 08:45:40 compute-1 nova_compute[183083]:       <stats period="10"/>
Jan 26 08:45:40 compute-1 nova_compute[183083]:     </memballoon>
Jan 26 08:45:40 compute-1 nova_compute[183083]:   </devices>
Jan 26 08:45:40 compute-1 nova_compute[183083]: </domain>
Jan 26 08:45:40 compute-1 nova_compute[183083]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.856 183087 DEBUG oslo_concurrency.processutils [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.874 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:40 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:40.881 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[55b14cd6-8647-4c99-ab6d-1b2342facbb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:40 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:40.882 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[c36dba88-fb21-4dcd-93d2-16491db1d56c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:40 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:40.897 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[13a9edf4-6e45-48e4-b63b-fc5184c075f6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 345902, 'reachable_time': 39577, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213339, 'error': None, 'target': 'ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:40 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:40.899 105024 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 26 08:45:40 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:40.899 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[90e2ac9c-0ca2-4adb-b988-5d921cb7abbd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:40 compute-1 systemd[1]: run-netns-ovnmeta\x2dfe7e4448\x2d8407\x2d46f2\x2d95a0\x2d344b2f6ecfd7.mount: Deactivated successfully.
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.919 183087 DEBUG oslo_concurrency.processutils [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.920 183087 DEBUG oslo_concurrency.processutils [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.981 183087 DEBUG oslo_concurrency.processutils [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.983 183087 DEBUG nova.objects.instance [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Lazy-loading 'trusted_certs' on Instance uuid 52d0b676-cf9c-4840-8b66-74ca8b13e2af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:45:40 compute-1 nova_compute[183083]: 2026-01-26 08:45:40.996 183087 DEBUG oslo_concurrency.processutils [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.058 183087 DEBUG oslo_concurrency.processutils [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.060 183087 DEBUG nova.virt.disk.api [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Checking if we can resize image /var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.060 183087 DEBUG oslo_concurrency.processutils [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.134 183087 DEBUG oslo_concurrency.processutils [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.135 183087 DEBUG nova.virt.disk.api [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Cannot resize image /var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.137 183087 DEBUG nova.objects.instance [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Lazy-loading 'migration_context' on Instance uuid 52d0b676-cf9c-4840-8b66-74ca8b13e2af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.143 183087 DEBUG nova.network.neutron [req-73069f2a-1aaa-44de-addb-e9bc228ca290 req-7a0301ff-7070-4d30-8307-559ffddf9ba8 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Updated VIF entry in instance network info cache for port eb474fb9-2733-48b3-b52a-cdbefbfaa0b4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.143 183087 DEBUG nova.network.neutron [req-73069f2a-1aaa-44de-addb-e9bc228ca290 req-7a0301ff-7070-4d30-8307-559ffddf9ba8 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: c588f3ea-d2a2-45dc-a12b-77e812835ec3] Updating instance_info_cache with network_info: [{"id": "eb474fb9-2733-48b3-b52a-cdbefbfaa0b4", "address": "fa:16:3e:03:a4:99", "network": {"id": "1231cec5-dd41-4753-9cd0-6832d14fc7ea", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1086481355-test_extra_dhcp_opts_ipv4_ipv6_stateless", "subnets": [{"cidr": "2001:5::/64", "dns": [], "gateway": {"address": "2001:5::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:5::f816:3eff:fe03:a499", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}, {"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.212", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e482dc8c944c4dc1ba301e69d00ec101", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb474fb9-27", "ovs_interfaceid": "eb474fb9-2733-48b3-b52a-cdbefbfaa0b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.157 183087 DEBUG nova.virt.libvirt.vif [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T08:44:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1487231681',display_name='tempest-server-test-1487231681',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1487231681',id=9,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKwpz9wNHHnbJAOOiUppbeyhm9nehnF1Htd8OXU0NdYnRfrosih4iG9UOQxrdGQsm8346olWW9k6G/UQO8gqKcXblbCmZkMf68uZMMkWh2h2m7GZ6H2smfZpLCUdxJAWPg==',key_name='tempest-keypair-test-84578535',keypairs=<?>,launch_index=0,launched_at=2026-01-26T08:44:52Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='605ef5b310d9405faa10f9c8f78d897f',ramdisk_id='',reservation_id='r-e8sz2804',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='vir
tio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-NetworkBasicTest-1834146433',owner_user_name='tempest-NetworkBasicTest-1834146433-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:45:40Z,user_data=None,user_id='4ba65f88f5c349ff8443da8191c3da2b',uuid=52d0b676-cf9c-4840-8b66-74ca8b13e2af,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fe874242-d6f2-4922-8e79-b6545c0e8446", "address": "fa:16:3e:48:29:f7", "network": {"id": "fe7e4448-8407-46f2-95a0-344b2f6ecfd7", "bridge": "br-int", "label": "tempest-test-network--1557689353", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "605ef5b310d9405faa10f9c8f78d897f", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe874242-d6", "ovs_interfaceid": "fe874242-d6f2-4922-8e79-b6545c0e8446", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.157 183087 DEBUG nova.network.os_vif_util [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Converting VIF {"id": "fe874242-d6f2-4922-8e79-b6545c0e8446", "address": "fa:16:3e:48:29:f7", "network": {"id": "fe7e4448-8407-46f2-95a0-344b2f6ecfd7", "bridge": "br-int", "label": "tempest-test-network--1557689353", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "605ef5b310d9405faa10f9c8f78d897f", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe874242-d6", "ovs_interfaceid": "fe874242-d6f2-4922-8e79-b6545c0e8446", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.159 183087 DEBUG nova.network.os_vif_util [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:48:29:f7,bridge_name='br-int',has_traffic_filtering=True,id=fe874242-d6f2-4922-8e79-b6545c0e8446,network=Network(fe7e4448-8407-46f2-95a0-344b2f6ecfd7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe874242-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.159 183087 DEBUG os_vif [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:48:29:f7,bridge_name='br-int',has_traffic_filtering=True,id=fe874242-d6f2-4922-8e79-b6545c0e8446,network=Network(fe7e4448-8407-46f2-95a0-344b2f6ecfd7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe874242-d6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.160 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.161 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.162 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.164 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.165 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfe874242-d6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.165 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfe874242-d6, col_values=(('external_ids', {'iface-id': 'fe874242-d6f2-4922-8e79-b6545c0e8446', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:48:29:f7', 'vm-uuid': '52d0b676-cf9c-4840-8b66-74ca8b13e2af'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:45:41 compute-1 NetworkManager[55451]: <info>  [1769417141.1694] manager: (tapfe874242-d6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.171 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.174 183087 DEBUG oslo_concurrency.lockutils [req-73069f2a-1aaa-44de-addb-e9bc228ca290 req-7a0301ff-7070-4d30-8307-559ffddf9ba8 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-c588f3ea-d2a2-45dc-a12b-77e812835ec3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.176 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.178 183087 INFO os_vif [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:48:29:f7,bridge_name='br-int',has_traffic_filtering=True,id=fe874242-d6f2-4922-8e79-b6545c0e8446,network=Network(fe7e4448-8407-46f2-95a0-344b2f6ecfd7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe874242-d6')
Jan 26 08:45:41 compute-1 kernel: tapfe874242-d6: entered promiscuous mode
Jan 26 08:45:41 compute-1 NetworkManager[55451]: <info>  [1769417141.2758] manager: (tapfe874242-d6): new Tun device (/org/freedesktop/NetworkManager/Devices/40)
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.276 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:41 compute-1 systemd-udevd[213254]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 08:45:41 compute-1 ovn_controller[95352]: 2026-01-26T08:45:41Z|00071|binding|INFO|Claiming lport fe874242-d6f2-4922-8e79-b6545c0e8446 for this chassis.
Jan 26 08:45:41 compute-1 ovn_controller[95352]: 2026-01-26T08:45:41Z|00072|binding|INFO|fe874242-d6f2-4922-8e79-b6545c0e8446: Claiming fa:16:3e:48:29:f7 10.100.0.24
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:41.288 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:48:29:f7 10.100.0.24'], port_security=['fa:16:3e:48:29:f7 10.100.0.24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.24/28', 'neutron:device_id': '52d0b676-cf9c-4840-8b66-74ca8b13e2af', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fe7e4448-8407-46f2-95a0-344b2f6ecfd7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '605ef5b310d9405faa10f9c8f78d897f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '18ebfc50-0648-455b-8cfa-279844a56dcf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.173'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=70edddae-db7d-4a4d-8983-c3a06c961ec1, chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=fe874242-d6f2-4922-8e79-b6545c0e8446) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:41.291 104632 INFO neutron.agent.ovn.metadata.agent [-] Port fe874242-d6f2-4922-8e79-b6545c0e8446 in datapath fe7e4448-8407-46f2-95a0-344b2f6ecfd7 bound to our chassis
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:41.298 104632 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fe7e4448-8407-46f2-95a0-344b2f6ecfd7
Jan 26 08:45:41 compute-1 NetworkManager[55451]: <info>  [1769417141.3016] device (tapfe874242-d6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 08:45:41 compute-1 NetworkManager[55451]: <info>  [1769417141.3035] device (tapfe874242-d6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 08:45:41 compute-1 ovn_controller[95352]: 2026-01-26T08:45:41Z|00073|binding|INFO|Setting lport fe874242-d6f2-4922-8e79-b6545c0e8446 ovn-installed in OVS
Jan 26 08:45:41 compute-1 ovn_controller[95352]: 2026-01-26T08:45:41Z|00074|binding|INFO|Setting lport fe874242-d6f2-4922-8e79-b6545c0e8446 up in Southbound
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.307 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:41.313 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[6fa68d9d-1138-47d5-be4d-102b6f14402f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:41.315 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfe7e4448-81 in ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:41.317 212483 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfe7e4448-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:41.318 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[ec5169eb-e33a-4a94-aa42-0e009b47d642]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:41.319 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[c1e56930-1a80-48ae-935d-b1d6d8de4aeb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:41 compute-1 systemd-machined[154360]: New machine qemu-4-instance-00000009.
Jan 26 08:45:41 compute-1 systemd[1]: Started Virtual Machine qemu-4-instance-00000009.
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:41.337 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[c01d9559-5b86-4e37-ab51-09bc0a5b2e20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.346 183087 DEBUG nova.network.neutron [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Successfully created port: 1d4d65f6-7230-441c-84cd-576c0a0a4ff3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:41.359 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[f51b83a3-a707-495c-9c4d-e7ba32b7bff2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:41.394 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[cdf12dc4-d3d6-4664-a7fa-b118a8bf78f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:41.402 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[1c667de4-5450-4a7f-b050-b87aef8ea576]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:41 compute-1 NetworkManager[55451]: <info>  [1769417141.4044] manager: (tapfe7e4448-80): new Veth device (/org/freedesktop/NetworkManager/Devices/41)
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:41.437 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[e93730cd-5f1a-4509-bc92-7fe757ab5ab1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:41.443 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[99a9c9e8-a511-44f8-b9b3-e2408d197f4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:41 compute-1 NetworkManager[55451]: <info>  [1769417141.4758] device (tapfe7e4448-80): carrier: link connected
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:41.482 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[54add075-83c6-4dc3-9a57-dd5fba535890]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:41.506 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[16c2211a-7f5e-48a1-a8c7-78e59f28aab1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfe7e4448-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:af:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 348208, 'reachable_time': 39903, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213398, 'error': None, 'target': 'ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:41.527 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[5d028237-15c5-4bdb-a3de-32f45e8dd5be]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed1:aff4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 348208, 'tstamp': 348208}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213399, 'error': None, 'target': 'ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:41.551 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[7e4f65a0-071b-432e-a765-4dab0378d2c6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfe7e4448-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:af:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 348208, 'reachable_time': 39903, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213400, 'error': None, 'target': 'ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:41.595 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[17f5c382-2045-4ceb-af29-b6f4bc95ad4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:41.681 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[9dd85c2c-427c-4d06-845c-7df173f95927]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:41.683 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe7e4448-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:41.684 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:41.685 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfe7e4448-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:45:41 compute-1 kernel: tapfe7e4448-80: entered promiscuous mode
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.687 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:41 compute-1 NetworkManager[55451]: <info>  [1769417141.6900] manager: (tapfe7e4448-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.691 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:41.693 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfe7e4448-80, col_values=(('external_ids', {'iface-id': 'cca059f9-68bd-4ebe-b4a1-3e95ff36d483'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.695 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:41 compute-1 ovn_controller[95352]: 2026-01-26T08:45:41Z|00075|binding|INFO|Releasing lport cca059f9-68bd-4ebe-b4a1-3e95ff36d483 from this chassis (sb_readonly=0)
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:41.698 104632 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fe7e4448-8407-46f2-95a0-344b2f6ecfd7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fe7e4448-8407-46f2-95a0-344b2f6ecfd7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:41.699 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[7da51257-063a-4dc6-8bad-799db73cff3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:41.700 104632 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]: global
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]:     log         /dev/log local0 debug
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]:     log-tag     haproxy-metadata-proxy-fe7e4448-8407-46f2-95a0-344b2f6ecfd7
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]:     user        root
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]:     group       root
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]:     maxconn     1024
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]:     pidfile     /var/lib/neutron/external/pids/fe7e4448-8407-46f2-95a0-344b2f6ecfd7.pid.haproxy
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]:     daemon
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]: defaults
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]:     log global
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]:     mode http
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]:     option httplog
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]:     option dontlognull
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]:     option http-server-close
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]:     option forwardfor
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]:     retries                 3
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]:     timeout http-request    30s
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]:     timeout connect         30s
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]:     timeout client          32s
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]:     timeout server          32s
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]:     timeout http-keep-alive 30s
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]: listen listener
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]:     bind 169.254.169.254:80
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]:     http-request add-header X-OVN-Network-ID fe7e4448-8407-46f2-95a0-344b2f6ecfd7
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 26 08:45:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:41.702 104632 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7', 'env', 'PROCESS_TAG=haproxy-fe7e4448-8407-46f2-95a0-344b2f6ecfd7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fe7e4448-8407-46f2-95a0-344b2f6ecfd7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.712 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.728 183087 DEBUG nova.virt.libvirt.host [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Removed pending event for 52d0b676-cf9c-4840-8b66-74ca8b13e2af due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.729 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417141.728106, 52d0b676-cf9c-4840-8b66-74ca8b13e2af => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.730 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] VM Resumed (Lifecycle Event)
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.734 183087 DEBUG nova.compute.manager [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.738 183087 INFO nova.virt.libvirt.driver [-] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Instance rebooted successfully.
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.739 183087 DEBUG nova.compute.manager [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.754 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.760 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.800 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] During sync_power_state the instance has a pending task (reboot_started_hard). Skip.
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.800 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417141.7305171, 52d0b676-cf9c-4840-8b66-74ca8b13e2af => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.801 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] VM Started (Lifecycle Event)
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.834 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.838 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 08:45:41 compute-1 nova_compute[183083]: 2026-01-26 08:45:41.845 183087 DEBUG oslo_concurrency.lockutils [None req-50f69af4-8993-4e1e-a11f-ed35f17887eb 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 4.917s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:42 compute-1 nova_compute[183083]: 2026-01-26 08:45:42.104 183087 DEBUG nova.network.neutron [req-9a48d4f1-cf92-4e0d-a0e1-be231b159ec7 req-566eed99-c870-45d5-a123-58ffa06ef6d7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Updated VIF entry in instance network info cache for port bde3b3f7-f517-429f-a05a-4c2dd76dc0ae. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:45:42 compute-1 nova_compute[183083]: 2026-01-26 08:45:42.105 183087 DEBUG nova.network.neutron [req-9a48d4f1-cf92-4e0d-a0e1-be231b159ec7 req-566eed99-c870-45d5-a123-58ffa06ef6d7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Updating instance_info_cache with network_info: [{"id": "bde3b3f7-f517-429f-a05a-4c2dd76dc0ae", "address": "fa:16:3e:8f:42:3a", "network": {"id": "f7e94474-b188-4696-a974-b6f64972f94c", "bridge": "br-int", "label": "tempest-OvnExtraDhcpOptionsTest-154566894-test_extra_dhcp_opts_disabled_dhcp6", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.166", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:2::/64", "dns": [], "gateway": {"address": "2001:2::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:2::81", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "71cced1777f24868932d789154ff04a0", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde3b3f7-f5", "ovs_interfaceid": "bde3b3f7-f517-429f-a05a-4c2dd76dc0ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:45:42 compute-1 nova_compute[183083]: 2026-01-26 08:45:42.144 183087 DEBUG oslo_concurrency.lockutils [req-9a48d4f1-cf92-4e0d-a0e1-be231b159ec7 req-566eed99-c870-45d5-a123-58ffa06ef6d7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-31c86c5f-fd35-45f8-af2c-165a04ff46dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:45:42 compute-1 podman[213436]: 2026-01-26 08:45:42.22214513 +0000 UTC m=+0.073515836 container create 9eb12efaa320ec80d462a95e6dbbb3a96ba65c241d7453e492434d0930f7c917 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 26 08:45:42 compute-1 systemd[1]: Started libpod-conmon-9eb12efaa320ec80d462a95e6dbbb3a96ba65c241d7453e492434d0930f7c917.scope.
Jan 26 08:45:42 compute-1 systemd[1]: Started libcrun container.
Jan 26 08:45:42 compute-1 podman[213436]: 2026-01-26 08:45:42.190192816 +0000 UTC m=+0.041563542 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 08:45:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b01e9d8ab6e939c0f90240516d55490f615245ad2f8a2f9ad5a3d4beffcdc510/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 08:45:42 compute-1 podman[213436]: 2026-01-26 08:45:42.309230505 +0000 UTC m=+0.160601211 container init 9eb12efaa320ec80d462a95e6dbbb3a96ba65c241d7453e492434d0930f7c917 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 08:45:42 compute-1 podman[213436]: 2026-01-26 08:45:42.316709638 +0000 UTC m=+0.168080334 container start 9eb12efaa320ec80d462a95e6dbbb3a96ba65c241d7453e492434d0930f7c917 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 08:45:42 compute-1 neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7[213452]: [NOTICE]   (213456) : New worker (213458) forked
Jan 26 08:45:42 compute-1 neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7[213452]: [NOTICE]   (213456) : Loading success.
Jan 26 08:45:42 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:42.925 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:45:43 compute-1 nova_compute[183083]: 2026-01-26 08:45:43.733 183087 DEBUG nova.network.neutron [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:45:43 compute-1 nova_compute[183083]: 2026-01-26 08:45:43.748 183087 INFO nova.compute.manager [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 31c86c5f-fd35-45f8-af2c-165a04ff46dc] Took 3.32 seconds to deallocate network for instance.
Jan 26 08:45:43 compute-1 nova_compute[183083]: 2026-01-26 08:45:43.884 183087 DEBUG nova.network.neutron [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Successfully updated port: 1d4d65f6-7230-441c-84cd-576c0a0a4ff3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:45:43 compute-1 nova_compute[183083]: 2026-01-26 08:45:43.888 183087 INFO nova.scheduler.client.report [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Deleted allocations for instance 31c86c5f-fd35-45f8-af2c-165a04ff46dc
Jan 26 08:45:43 compute-1 nova_compute[183083]: 2026-01-26 08:45:43.888 183087 DEBUG oslo_concurrency.lockutils [None req-265402e3-5d59-4462-b5a8-a00b055820c5 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Lock "31c86c5f-fd35-45f8-af2c-165a04ff46dc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.567s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:43 compute-1 nova_compute[183083]: 2026-01-26 08:45:43.895 183087 DEBUG oslo_concurrency.lockutils [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Acquiring lock "refresh_cache-1d002ced-fc02-4688-aa6b-6d11514e01ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:45:43 compute-1 nova_compute[183083]: 2026-01-26 08:45:43.895 183087 DEBUG oslo_concurrency.lockutils [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Acquired lock "refresh_cache-1d002ced-fc02-4688-aa6b-6d11514e01ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:45:43 compute-1 nova_compute[183083]: 2026-01-26 08:45:43.895 183087 DEBUG nova.network.neutron [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:45:44 compute-1 nova_compute[183083]: 2026-01-26 08:45:44.089 183087 DEBUG nova.network.neutron [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:45:44 compute-1 nova_compute[183083]: 2026-01-26 08:45:44.095 183087 DEBUG nova.compute.manager [req-b7b6cc9a-8949-4932-96a4-563d2b629c61 req-edd233b9-999f-40b4-a65d-1ea449231272 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Received event network-changed-1d4d65f6-7230-441c-84cd-576c0a0a4ff3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:45:44 compute-1 nova_compute[183083]: 2026-01-26 08:45:44.095 183087 DEBUG nova.compute.manager [req-b7b6cc9a-8949-4932-96a4-563d2b629c61 req-edd233b9-999f-40b4-a65d-1ea449231272 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Refreshing instance network info cache due to event network-changed-1d4d65f6-7230-441c-84cd-576c0a0a4ff3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:45:44 compute-1 nova_compute[183083]: 2026-01-26 08:45:44.096 183087 DEBUG oslo_concurrency.lockutils [req-b7b6cc9a-8949-4932-96a4-563d2b629c61 req-edd233b9-999f-40b4-a65d-1ea449231272 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-1d002ced-fc02-4688-aa6b-6d11514e01ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:45:44 compute-1 nova_compute[183083]: 2026-01-26 08:45:44.496 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.145 183087 DEBUG nova.network.neutron [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Updating instance_info_cache with network_info: [{"id": "1d4d65f6-7230-441c-84cd-576c0a0a4ff3", "address": "fa:16:3e:18:7d:83", "network": {"id": "39c9728f-a068-4279-b571-5f02fe11e43b", "bridge": "br-int", "label": "tempest-test-network--1653069714", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": "192.168.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.1.165", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b71ae2b9d2fd454b8b3b9aa1a0e5c7e4", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d4d65f6-72", "ovs_interfaceid": "1d4d65f6-7230-441c-84cd-576c0a0a4ff3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.174 183087 DEBUG oslo_concurrency.lockutils [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Releasing lock "refresh_cache-1d002ced-fc02-4688-aa6b-6d11514e01ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.174 183087 DEBUG nova.compute.manager [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Instance network_info: |[{"id": "1d4d65f6-7230-441c-84cd-576c0a0a4ff3", "address": "fa:16:3e:18:7d:83", "network": {"id": "39c9728f-a068-4279-b571-5f02fe11e43b", "bridge": "br-int", "label": "tempest-test-network--1653069714", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": "192.168.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.1.165", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b71ae2b9d2fd454b8b3b9aa1a0e5c7e4", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d4d65f6-72", "ovs_interfaceid": "1d4d65f6-7230-441c-84cd-576c0a0a4ff3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.174 183087 DEBUG oslo_concurrency.lockutils [req-b7b6cc9a-8949-4932-96a4-563d2b629c61 req-edd233b9-999f-40b4-a65d-1ea449231272 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-1d002ced-fc02-4688-aa6b-6d11514e01ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.175 183087 DEBUG nova.network.neutron [req-b7b6cc9a-8949-4932-96a4-563d2b629c61 req-edd233b9-999f-40b4-a65d-1ea449231272 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Refreshing network info cache for port 1d4d65f6-7230-441c-84cd-576c0a0a4ff3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.176 183087 INFO nova.compute.manager [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Terminating instance
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.178 183087 DEBUG nova.compute.manager [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.184 183087 DEBUG nova.virt.libvirt.driver [-] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.185 183087 INFO nova.virt.libvirt.driver [-] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Instance destroyed successfully.
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.186 183087 DEBUG nova.virt.libvirt.vif [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:45:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_bw_limit_east_west-1064509822',display_name='tempest-test_bw_limit_east_west-1064509822',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-bw-limit-east-west-1064509822',id=14,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHVjp+2pOh+xYUkttf/EHrrYAH3LBOn+IKLzf3fiQpiaJslqkY+OmJn6bfd2cX/NEPdTL45qAcY0Zt6OwZRQXbHCoOcvnydr7uXjZCoGXOxoNL1bEhwXU4AaOmmyDzyYAA==',key_name='tempest-keypair-test-1026532318',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b71ae2b9d2fd454b8b3b9aa1a0e5c7e4',ramdisk_id='',reservation_id='r-stf64v8r',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestCommon-374727467',owner_user_name='tempest-QosTestCommon-374727467-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:45:39Z,user_data=None,user_id='a7abeebb4e4d469c91e6cee77f6be1c3',uuid=1d002ced-fc02-4688-aa6b-6d11514e01ca,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1d4d65f6-7230-441c-84cd-576c0a0a4ff3", "address": "fa:16:3e:18:7d:83", "network": {"id": "39c9728f-a068-4279-b571-5f02fe11e43b", "bridge": "br-int", "label": "tempest-test-network--1653069714", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": "192.168.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.1.165", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b71ae2b9d2fd454b8b3b9aa1a0e5c7e4", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d4d65f6-72", "ovs_interfaceid": "1d4d65f6-7230-441c-84cd-576c0a0a4ff3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.187 183087 DEBUG nova.network.os_vif_util [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Converting VIF {"id": "1d4d65f6-7230-441c-84cd-576c0a0a4ff3", "address": "fa:16:3e:18:7d:83", "network": {"id": "39c9728f-a068-4279-b571-5f02fe11e43b", "bridge": "br-int", "label": "tempest-test-network--1653069714", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": "192.168.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.1.165", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b71ae2b9d2fd454b8b3b9aa1a0e5c7e4", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d4d65f6-72", "ovs_interfaceid": "1d4d65f6-7230-441c-84cd-576c0a0a4ff3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.188 183087 DEBUG nova.network.os_vif_util [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:18:7d:83,bridge_name='br-int',has_traffic_filtering=True,id=1d4d65f6-7230-441c-84cd-576c0a0a4ff3,network=Network(39c9728f-a068-4279-b571-5f02fe11e43b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d4d65f6-72') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.189 183087 DEBUG os_vif [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:7d:83,bridge_name='br-int',has_traffic_filtering=True,id=1d4d65f6-7230-441c-84cd-576c0a0a4ff3,network=Network(39c9728f-a068-4279-b571-5f02fe11e43b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d4d65f6-72') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.191 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.191 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1d4d65f6-72, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.192 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.194 183087 INFO os_vif [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:7d:83,bridge_name='br-int',has_traffic_filtering=True,id=1d4d65f6-7230-441c-84cd-576c0a0a4ff3,network=Network(39c9728f-a068-4279-b571-5f02fe11e43b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d4d65f6-72')
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.195 183087 INFO nova.virt.libvirt.driver [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Deleting instance files /var/lib/nova/instances/1d002ced-fc02-4688-aa6b-6d11514e01ca_del
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.195 183087 INFO nova.virt.libvirt.driver [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Deletion of /var/lib/nova/instances/1d002ced-fc02-4688-aa6b-6d11514e01ca_del complete
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.259 183087 INFO nova.compute.manager [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Took 0.08 seconds to destroy the instance on the hypervisor.
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.261 183087 DEBUG nova.compute.claims [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Aborting claim: <nova.compute.claims.Claim object at 0x7f6cb868baf0> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.262 183087 DEBUG oslo_concurrency.lockutils [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.262 183087 DEBUG oslo_concurrency.lockutils [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.402 183087 DEBUG nova.compute.provider_tree [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.432 183087 DEBUG nova.scheduler.client.report [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.458 183087 DEBUG oslo_concurrency.lockutils [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.459 183087 DEBUG nova.compute.utils [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.460 183087 ERROR nova.compute.manager [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Build of instance 1d002ced-fc02-4688-aa6b-6d11514e01ca aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format: nova.exception.BuildAbortException: Build of instance 1d002ced-fc02-4688-aa6b-6d11514e01ca aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.461 183087 DEBUG nova.compute.manager [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.462 183087 DEBUG nova.virt.libvirt.vif [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-26T08:45:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_bw_limit_east_west-1064509822',display_name='tempest-test_bw_limit_east_west-1064509822',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host=None,hostname='tempest-test-bw-limit-east-west-1064509822',id=14,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHVjp+2pOh+xYUkttf/EHrrYAH3LBOn+IKLzf3fiQpiaJslqkY+OmJn6bfd2cX/NEPdTL45qAcY0Zt6OwZRQXbHCoOcvnydr7uXjZCoGXOxoNL1bEhwXU4AaOmmyDzyYAA==',key_name='tempest-keypair-test-1026532318',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node=None,numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b71ae2b9d2fd454b8b3b9aa1a0e5c7e4',ramdisk_id='',reservation_id='r-stf64v8r',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestCommon-374727467',owner_user_name='tempest-QosTestCommon-374727467-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:45:45Z,user_data=None,user_id='a7abeebb4e4d469c91e6cee77f6be1c3',uuid=1d002ced-fc02-4688-aa6b-6d11514e01ca,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1d4d65f6-7230-441c-84cd-576c0a0a4ff3", "address": "fa:16:3e:18:7d:83", "network": {"id": "39c9728f-a068-4279-b571-5f02fe11e43b", "bridge": "br-int", "label": "tempest-test-network--1653069714", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": "192.168.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.1.165", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b71ae2b9d2fd454b8b3b9aa1a0e5c7e4", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d4d65f6-72", "ovs_interfaceid": "1d4d65f6-7230-441c-84cd-576c0a0a4ff3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.463 183087 DEBUG nova.network.os_vif_util [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Converting VIF {"id": "1d4d65f6-7230-441c-84cd-576c0a0a4ff3", "address": "fa:16:3e:18:7d:83", "network": {"id": "39c9728f-a068-4279-b571-5f02fe11e43b", "bridge": "br-int", "label": "tempest-test-network--1653069714", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": "192.168.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.1.165", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b71ae2b9d2fd454b8b3b9aa1a0e5c7e4", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d4d65f6-72", "ovs_interfaceid": "1d4d65f6-7230-441c-84cd-576c0a0a4ff3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.464 183087 DEBUG nova.network.os_vif_util [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:18:7d:83,bridge_name='br-int',has_traffic_filtering=True,id=1d4d65f6-7230-441c-84cd-576c0a0a4ff3,network=Network(39c9728f-a068-4279-b571-5f02fe11e43b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d4d65f6-72') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.464 183087 DEBUG os_vif [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:7d:83,bridge_name='br-int',has_traffic_filtering=True,id=1d4d65f6-7230-441c-84cd-576c0a0a4ff3,network=Network(39c9728f-a068-4279-b571-5f02fe11e43b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d4d65f6-72') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.466 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.467 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1d4d65f6-72, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.467 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.470 183087 INFO os_vif [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:7d:83,bridge_name='br-int',has_traffic_filtering=True,id=1d4d65f6-7230-441c-84cd-576c0a0a4ff3,network=Network(39c9728f-a068-4279-b571-5f02fe11e43b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d4d65f6-72')
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.470 183087 DEBUG nova.compute.manager [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.471 183087 DEBUG nova.compute.manager [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:45:45 compute-1 nova_compute[183083]: 2026-01-26 08:45:45.471 183087 DEBUG nova.network.neutron [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:45:45 compute-1 podman[213468]: 2026-01-26 08:45:45.849862355 +0000 UTC m=+0.102272517 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, architecture=x86_64, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container)
Jan 26 08:45:45 compute-1 podman[213467]: 2026-01-26 08:45:45.857036849 +0000 UTC m=+0.110266514 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, 
config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 26 08:45:46 compute-1 nova_compute[183083]: 2026-01-26 08:45:46.179 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:46 compute-1 nova_compute[183083]: 2026-01-26 08:45:46.785 183087 DEBUG nova.network.neutron [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:45:46 compute-1 nova_compute[183083]: 2026-01-26 08:45:46.810 183087 INFO nova.compute.manager [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Took 1.34 seconds to deallocate network for instance.
Jan 26 08:45:47 compute-1 nova_compute[183083]: 2026-01-26 08:45:47.014 183087 INFO nova.scheduler.client.report [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Deleted allocations for instance 1d002ced-fc02-4688-aa6b-6d11514e01ca
Jan 26 08:45:47 compute-1 nova_compute[183083]: 2026-01-26 08:45:47.015 183087 DEBUG oslo_concurrency.lockutils [None req-94a5671d-1785-498b-96dc-4997a0c326f8 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "1d002ced-fc02-4688-aa6b-6d11514e01ca" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.468s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:47 compute-1 nova_compute[183083]: 2026-01-26 08:45:47.238 183087 DEBUG nova.network.neutron [req-b7b6cc9a-8949-4932-96a4-563d2b629c61 req-edd233b9-999f-40b4-a65d-1ea449231272 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Updated VIF entry in instance network info cache for port 1d4d65f6-7230-441c-84cd-576c0a0a4ff3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:45:47 compute-1 nova_compute[183083]: 2026-01-26 08:45:47.239 183087 DEBUG nova.network.neutron [req-b7b6cc9a-8949-4932-96a4-563d2b629c61 req-edd233b9-999f-40b4-a65d-1ea449231272 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 1d002ced-fc02-4688-aa6b-6d11514e01ca] Updating instance_info_cache with network_info: [{"id": "1d4d65f6-7230-441c-84cd-576c0a0a4ff3", "address": "fa:16:3e:18:7d:83", "network": {"id": "39c9728f-a068-4279-b571-5f02fe11e43b", "bridge": "br-int", "label": "tempest-test-network--1653069714", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": "192.168.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.1.165", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b71ae2b9d2fd454b8b3b9aa1a0e5c7e4", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d4d65f6-72", "ovs_interfaceid": "1d4d65f6-7230-441c-84cd-576c0a0a4ff3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:45:47 compute-1 nova_compute[183083]: 2026-01-26 08:45:47.328 183087 DEBUG oslo_concurrency.lockutils [req-b7b6cc9a-8949-4932-96a4-563d2b629c61 req-edd233b9-999f-40b4-a65d-1ea449231272 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-1d002ced-fc02-4688-aa6b-6d11514e01ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:45:48 compute-1 nova_compute[183083]: 2026-01-26 08:45:48.567 183087 DEBUG nova.compute.manager [req-bbe7046d-aeef-405f-a305-91df6f970186 req-a8e8116e-6c1e-4cce-a5f5-9a243c960cc6 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Received event network-vif-plugged-30b45c93-b3bb-44e6-8e4a-6903a631c773 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:45:48 compute-1 nova_compute[183083]: 2026-01-26 08:45:48.569 183087 DEBUG oslo_concurrency.lockutils [req-bbe7046d-aeef-405f-a305-91df6f970186 req-a8e8116e-6c1e-4cce-a5f5-9a243c960cc6 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:48 compute-1 nova_compute[183083]: 2026-01-26 08:45:48.570 183087 DEBUG oslo_concurrency.lockutils [req-bbe7046d-aeef-405f-a305-91df6f970186 req-a8e8116e-6c1e-4cce-a5f5-9a243c960cc6 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:48 compute-1 nova_compute[183083]: 2026-01-26 08:45:48.571 183087 DEBUG oslo_concurrency.lockutils [req-bbe7046d-aeef-405f-a305-91df6f970186 req-a8e8116e-6c1e-4cce-a5f5-9a243c960cc6 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:48 compute-1 nova_compute[183083]: 2026-01-26 08:45:48.571 183087 DEBUG nova.compute.manager [req-bbe7046d-aeef-405f-a305-91df6f970186 req-a8e8116e-6c1e-4cce-a5f5-9a243c960cc6 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Processing event network-vif-plugged-30b45c93-b3bb-44e6-8e4a-6903a631c773 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 26 08:45:49 compute-1 nova_compute[183083]: 2026-01-26 08:45:49.129 183087 DEBUG oslo_concurrency.lockutils [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Acquiring lock "2d7929e0-5231-4515-b513-bde34026aca7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:49 compute-1 nova_compute[183083]: 2026-01-26 08:45:49.130 183087 DEBUG oslo_concurrency.lockutils [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Lock "2d7929e0-5231-4515-b513-bde34026aca7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:49 compute-1 nova_compute[183083]: 2026-01-26 08:45:49.150 183087 DEBUG nova.compute.manager [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:45:49 compute-1 nova_compute[183083]: 2026-01-26 08:45:49.278 183087 DEBUG oslo_concurrency.lockutils [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:49 compute-1 nova_compute[183083]: 2026-01-26 08:45:49.279 183087 DEBUG oslo_concurrency.lockutils [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:49 compute-1 nova_compute[183083]: 2026-01-26 08:45:49.289 183087 DEBUG nova.virt.hardware [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:45:49 compute-1 nova_compute[183083]: 2026-01-26 08:45:49.290 183087 INFO nova.compute.claims [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:45:49 compute-1 nova_compute[183083]: 2026-01-26 08:45:49.498 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:49 compute-1 nova_compute[183083]: 2026-01-26 08:45:49.529 183087 DEBUG nova.compute.provider_tree [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:45:49 compute-1 nova_compute[183083]: 2026-01-26 08:45:49.552 183087 DEBUG nova.scheduler.client.report [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:45:49 compute-1 nova_compute[183083]: 2026-01-26 08:45:49.587 183087 DEBUG oslo_concurrency.lockutils [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.308s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:49 compute-1 nova_compute[183083]: 2026-01-26 08:45:49.588 183087 DEBUG nova.compute.manager [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:45:49 compute-1 nova_compute[183083]: 2026-01-26 08:45:49.646 183087 DEBUG nova.compute.manager [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:45:49 compute-1 nova_compute[183083]: 2026-01-26 08:45:49.647 183087 DEBUG nova.network.neutron [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:45:49 compute-1 nova_compute[183083]: 2026-01-26 08:45:49.670 183087 INFO nova.virt.libvirt.driver [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:45:49 compute-1 nova_compute[183083]: 2026-01-26 08:45:49.689 183087 DEBUG nova.compute.manager [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:45:49 compute-1 nova_compute[183083]: 2026-01-26 08:45:49.772 183087 DEBUG nova.compute.manager [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:45:49 compute-1 nova_compute[183083]: 2026-01-26 08:45:49.774 183087 DEBUG nova.virt.libvirt.driver [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:45:49 compute-1 nova_compute[183083]: 2026-01-26 08:45:49.775 183087 INFO nova.virt.libvirt.driver [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Creating image(s)
Jan 26 08:45:49 compute-1 nova_compute[183083]: 2026-01-26 08:45:49.776 183087 DEBUG oslo_concurrency.lockutils [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Acquiring lock "/var/lib/nova/instances/2d7929e0-5231-4515-b513-bde34026aca7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:49 compute-1 nova_compute[183083]: 2026-01-26 08:45:49.777 183087 DEBUG oslo_concurrency.lockutils [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Lock "/var/lib/nova/instances/2d7929e0-5231-4515-b513-bde34026aca7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:49 compute-1 nova_compute[183083]: 2026-01-26 08:45:49.778 183087 DEBUG oslo_concurrency.lockutils [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Lock "/var/lib/nova/instances/2d7929e0-5231-4515-b513-bde34026aca7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:49 compute-1 nova_compute[183083]: 2026-01-26 08:45:49.779 183087 DEBUG oslo_concurrency.lockutils [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:49 compute-1 nova_compute[183083]: 2026-01-26 08:45:49.780 183087 DEBUG oslo_concurrency.lockutils [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.212 183087 DEBUG nova.policy [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'add713470fcc438f95ec0ff89dbb2adc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a3694415e0ac483fa070e7316b146fc1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.739 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.794 183087 DEBUG nova.compute.manager [req-7f2f677a-bad3-4865-b0e1-6e373d456352 req-44af3119-24ed-4f31-80ff-988f5734e113 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Received event network-vif-plugged-30b45c93-b3bb-44e6-8e4a-6903a631c773 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.795 183087 DEBUG oslo_concurrency.lockutils [req-7f2f677a-bad3-4865-b0e1-6e373d456352 req-44af3119-24ed-4f31-80ff-988f5734e113 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.796 183087 DEBUG oslo_concurrency.lockutils [req-7f2f677a-bad3-4865-b0e1-6e373d456352 req-44af3119-24ed-4f31-80ff-988f5734e113 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.796 183087 DEBUG oslo_concurrency.lockutils [req-7f2f677a-bad3-4865-b0e1-6e373d456352 req-44af3119-24ed-4f31-80ff-988f5734e113 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.797 183087 DEBUG nova.compute.manager [req-7f2f677a-bad3-4865-b0e1-6e373d456352 req-44af3119-24ed-4f31-80ff-988f5734e113 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] No event matching network-vif-plugged-30b45c93-b3bb-44e6-8e4a-6903a631c773 in dict_keys([('network-vif-plugged', '71bea873-e0d4-4d4e-b05e-ff7415434c11')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.797 183087 WARNING nova.compute.manager [req-7f2f677a-bad3-4865-b0e1-6e373d456352 req-44af3119-24ed-4f31-80ff-988f5734e113 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Received unexpected event network-vif-plugged-30b45c93-b3bb-44e6-8e4a-6903a631c773 for instance with vm_state building and task_state spawning.
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.798 183087 DEBUG nova.compute.manager [req-7f2f677a-bad3-4865-b0e1-6e373d456352 req-44af3119-24ed-4f31-80ff-988f5734e113 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Received event network-vif-plugged-71bea873-e0d4-4d4e-b05e-ff7415434c11 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.799 183087 DEBUG oslo_concurrency.lockutils [req-7f2f677a-bad3-4865-b0e1-6e373d456352 req-44af3119-24ed-4f31-80ff-988f5734e113 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.799 183087 DEBUG oslo_concurrency.lockutils [req-7f2f677a-bad3-4865-b0e1-6e373d456352 req-44af3119-24ed-4f31-80ff-988f5734e113 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.800 183087 DEBUG oslo_concurrency.lockutils [req-7f2f677a-bad3-4865-b0e1-6e373d456352 req-44af3119-24ed-4f31-80ff-988f5734e113 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.801 183087 DEBUG nova.compute.manager [req-7f2f677a-bad3-4865-b0e1-6e373d456352 req-44af3119-24ed-4f31-80ff-988f5734e113 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Processing event network-vif-plugged-71bea873-e0d4-4d4e-b05e-ff7415434c11 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.801 183087 DEBUG nova.compute.manager [req-7f2f677a-bad3-4865-b0e1-6e373d456352 req-44af3119-24ed-4f31-80ff-988f5734e113 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Received event network-vif-plugged-71bea873-e0d4-4d4e-b05e-ff7415434c11 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.802 183087 DEBUG oslo_concurrency.lockutils [req-7f2f677a-bad3-4865-b0e1-6e373d456352 req-44af3119-24ed-4f31-80ff-988f5734e113 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.802 183087 DEBUG oslo_concurrency.lockutils [req-7f2f677a-bad3-4865-b0e1-6e373d456352 req-44af3119-24ed-4f31-80ff-988f5734e113 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.803 183087 DEBUG oslo_concurrency.lockutils [req-7f2f677a-bad3-4865-b0e1-6e373d456352 req-44af3119-24ed-4f31-80ff-988f5734e113 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.804 183087 DEBUG nova.compute.manager [req-7f2f677a-bad3-4865-b0e1-6e373d456352 req-44af3119-24ed-4f31-80ff-988f5734e113 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] No waiting events found dispatching network-vif-plugged-71bea873-e0d4-4d4e-b05e-ff7415434c11 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.804 183087 WARNING nova.compute.manager [req-7f2f677a-bad3-4865-b0e1-6e373d456352 req-44af3119-24ed-4f31-80ff-988f5734e113 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Received unexpected event network-vif-plugged-71bea873-e0d4-4d4e-b05e-ff7415434c11 for instance with vm_state building and task_state spawning.
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.805 183087 DEBUG nova.compute.manager [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Instance event wait completed in 12 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.811 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417150.8112085, 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.812 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] VM Resumed (Lifecycle Event)
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.824 183087 DEBUG nova.virt.libvirt.driver [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.830 183087 INFO nova.virt.libvirt.driver [-] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Instance spawned successfully.
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.831 183087 DEBUG nova.virt.libvirt.driver [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.838 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.845 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.871 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.880 183087 DEBUG nova.virt.libvirt.driver [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.880 183087 DEBUG nova.virt.libvirt.driver [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.881 183087 DEBUG nova.virt.libvirt.driver [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.882 183087 DEBUG nova.virt.libvirt.driver [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.883 183087 DEBUG nova.virt.libvirt.driver [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.883 183087 DEBUG nova.virt.libvirt.driver [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.939 183087 DEBUG oslo_concurrency.lockutils [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Instance failed to spawn: nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Traceback (most recent call last):
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 170, in do_image_deep_inspection
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7]     raise exception.ImageUnacceptable(
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image content does not match disk_format
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7] 
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7] During handling of the above exception, another exception occurred:
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7] 
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Traceback (most recent call last):
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7]     yield resources
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7]     self.driver.spawn(context, instance, image_meta,
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4384, in spawn
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7]     created_instance_dir, created_disks = self._create_image(
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4786, in _create_image
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7]     created_disks = self._create_and_inject_local_root(
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4917, in _create_and_inject_local_root
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7]     self._try_fetch_image_cache(backend, fetch_func, context,
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 10963, in _try_fetch_image_cache
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7]     image.cache(fetch_func=fetch_func,
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 288, in cache
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7]     self.create_image(fetch_func_sync, base, size,
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 663, in create_image
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7]     prepare_template(target=base, *args, **kwargs)
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7]   File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7]     return f(*args, **kwargs)
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 285, in fetch_func_sync
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7]     fetch_func(target=target, *args, **kwargs)
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/utils.py", line 482, in fetch_image
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7]     images.fetch_to_raw(context, image_id, target, trusted_certs)
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 199, in fetch_to_raw
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7]     force_format = do_image_deep_inspection(img, image_href, path_tmp)
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 180, in do_image_deep_inspection
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7]     raise exception.ImageUnacceptable(
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.940 183087 ERROR nova.compute.manager [instance: 2d7929e0-5231-4515-b513-bde34026aca7] 
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.947 183087 INFO nova.compute.manager [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Took 23.19 seconds to spawn the instance on the hypervisor.
Jan 26 08:45:50 compute-1 nova_compute[183083]: 2026-01-26 08:45:50.947 183087 DEBUG nova.compute.manager [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:45:51 compute-1 nova_compute[183083]: 2026-01-26 08:45:51.009 183087 INFO nova.compute.manager [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Took 23.81 seconds to build instance.
Jan 26 08:45:51 compute-1 nova_compute[183083]: 2026-01-26 08:45:51.030 183087 DEBUG oslo_concurrency.lockutils [None req-50b42886-b56b-471d-a4b6-517fb8366895 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 24.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:51 compute-1 ovn_controller[95352]: 2026-01-26T08:45:51Z|00076|binding|INFO|Releasing lport cca059f9-68bd-4ebe-b4a1-3e95ff36d483 from this chassis (sb_readonly=0)
Jan 26 08:45:51 compute-1 ovn_controller[95352]: 2026-01-26T08:45:51Z|00077|binding|INFO|Releasing lport 809259ab-8ea4-4909-92b4-4ee536a51482 from this chassis (sb_readonly=0)
Jan 26 08:45:51 compute-1 ovn_controller[95352]: 2026-01-26T08:45:51Z|00078|binding|INFO|Releasing lport bfb9744a-58f0-4145-9e28-9c13225b3407 from this chassis (sb_readonly=0)
Jan 26 08:45:51 compute-1 nova_compute[183083]: 2026-01-26 08:45:51.182 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:51 compute-1 nova_compute[183083]: 2026-01-26 08:45:51.192 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:51 compute-1 nova_compute[183083]: 2026-01-26 08:45:51.552 183087 INFO nova.compute.manager [None req-a155f6a0-e44e-47c0-a728-4c49ea87250b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Get console output
Jan 26 08:45:51 compute-1 nova_compute[183083]: 2026-01-26 08:45:51.559 212375 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 26 08:45:52 compute-1 podman[213513]: 2026-01-26 08:45:52.81614917 +0000 UTC m=+0.071142392 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 08:45:52 compute-1 podman[213512]: 2026-01-26 08:45:52.848337185 +0000 UTC m=+0.105625752 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 08:45:53 compute-1 ovn_controller[95352]: 2026-01-26T08:45:53Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:48:29:f7 10.100.0.24
Jan 26 08:45:54 compute-1 nova_compute[183083]: 2026-01-26 08:45:54.502 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:54 compute-1 podman[213560]: 2026-01-26 08:45:54.829695311 +0000 UTC m=+0.090139762 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 26 08:45:55 compute-1 sshd-session[213579]: Connection closed by 220.129.152.100 port 48532
Jan 26 08:45:56 compute-1 nova_compute[183083]: 2026-01-26 08:45:56.185 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:56 compute-1 nova_compute[183083]: 2026-01-26 08:45:56.833 183087 INFO nova.compute.manager [None req-3c028aae-4b83-4ff6-8baa-71af54d47718 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Get console output
Jan 26 08:45:56 compute-1 nova_compute[183083]: 2026-01-26 08:45:56.838 212375 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 26 08:45:56 compute-1 nova_compute[183083]: 2026-01-26 08:45:56.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:45:56 compute-1 nova_compute[183083]: 2026-01-26 08:45:56.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:45:56 compute-1 nova_compute[183083]: 2026-01-26 08:45:56.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:45:57 compute-1 sshd-session[213580]: Invalid user a from 220.129.152.100 port 48534
Jan 26 08:45:57 compute-1 nova_compute[183083]: 2026-01-26 08:45:57.216 183087 DEBUG nova.compute.manager [req-6716cd3c-da8e-4b14-b7b6-3f138ee0950f req-101b3058-ab6e-4d33-ad93-416ca089b093 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Received event network-vif-unplugged-fe874242-d6f2-4922-8e79-b6545c0e8446 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:45:57 compute-1 nova_compute[183083]: 2026-01-26 08:45:57.217 183087 DEBUG oslo_concurrency.lockutils [req-6716cd3c-da8e-4b14-b7b6-3f138ee0950f req-101b3058-ab6e-4d33-ad93-416ca089b093 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:57 compute-1 nova_compute[183083]: 2026-01-26 08:45:57.217 183087 DEBUG oslo_concurrency.lockutils [req-6716cd3c-da8e-4b14-b7b6-3f138ee0950f req-101b3058-ab6e-4d33-ad93-416ca089b093 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:57 compute-1 nova_compute[183083]: 2026-01-26 08:45:57.218 183087 DEBUG oslo_concurrency.lockutils [req-6716cd3c-da8e-4b14-b7b6-3f138ee0950f req-101b3058-ab6e-4d33-ad93-416ca089b093 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:57 compute-1 nova_compute[183083]: 2026-01-26 08:45:57.218 183087 DEBUG nova.compute.manager [req-6716cd3c-da8e-4b14-b7b6-3f138ee0950f req-101b3058-ab6e-4d33-ad93-416ca089b093 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] No waiting events found dispatching network-vif-unplugged-fe874242-d6f2-4922-8e79-b6545c0e8446 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:45:57 compute-1 nova_compute[183083]: 2026-01-26 08:45:57.218 183087 WARNING nova.compute.manager [req-6716cd3c-da8e-4b14-b7b6-3f138ee0950f req-101b3058-ab6e-4d33-ad93-416ca089b093 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Received unexpected event network-vif-unplugged-fe874242-d6f2-4922-8e79-b6545c0e8446 for instance with vm_state active and task_state None.
Jan 26 08:45:57 compute-1 sshd-session[213580]: Connection closed by invalid user a 220.129.152.100 port 48534 [preauth]
Jan 26 08:45:57 compute-1 nova_compute[183083]: 2026-01-26 08:45:57.592 183087 DEBUG nova.network.neutron [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Successfully updated port: fa92eb9f-99e0-4d4c-b3c5-1a7a395d05b3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:45:57 compute-1 nova_compute[183083]: 2026-01-26 08:45:57.608 183087 DEBUG oslo_concurrency.lockutils [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Acquiring lock "refresh_cache-2d7929e0-5231-4515-b513-bde34026aca7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:45:57 compute-1 nova_compute[183083]: 2026-01-26 08:45:57.609 183087 DEBUG oslo_concurrency.lockutils [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Acquired lock "refresh_cache-2d7929e0-5231-4515-b513-bde34026aca7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:45:57 compute-1 nova_compute[183083]: 2026-01-26 08:45:57.609 183087 DEBUG nova.network.neutron [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:45:57 compute-1 nova_compute[183083]: 2026-01-26 08:45:57.639 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:57 compute-1 nova_compute[183083]: 2026-01-26 08:45:57.750 183087 DEBUG nova.compute.manager [req-5d0a3321-eeb0-40e5-839f-5975853a17b1 req-6ffc3402-8216-4c49-9947-e32605849340 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Received event network-changed-fa92eb9f-99e0-4d4c-b3c5-1a7a395d05b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:45:57 compute-1 nova_compute[183083]: 2026-01-26 08:45:57.751 183087 DEBUG nova.compute.manager [req-5d0a3321-eeb0-40e5-839f-5975853a17b1 req-6ffc3402-8216-4c49-9947-e32605849340 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Refreshing instance network info cache due to event network-changed-fa92eb9f-99e0-4d4c-b3c5-1a7a395d05b3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:45:57 compute-1 nova_compute[183083]: 2026-01-26 08:45:57.751 183087 DEBUG oslo_concurrency.lockutils [req-5d0a3321-eeb0-40e5-839f-5975853a17b1 req-6ffc3402-8216-4c49-9947-e32605849340 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-2d7929e0-5231-4515-b513-bde34026aca7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:45:57 compute-1 nova_compute[183083]: 2026-01-26 08:45:57.915 183087 DEBUG nova.network.neutron [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:45:58 compute-1 nova_compute[183083]: 2026-01-26 08:45:58.947 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:45:58 compute-1 nova_compute[183083]: 2026-01-26 08:45:58.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:45:58 compute-1 nova_compute[183083]: 2026-01-26 08:45:58.951 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 08:45:58 compute-1 nova_compute[183083]: 2026-01-26 08:45:58.951 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 08:45:58 compute-1 nova_compute[183083]: 2026-01-26 08:45:58.978 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 26 08:45:59 compute-1 nova_compute[183083]: 2026-01-26 08:45:59.343 183087 DEBUG nova.compute.manager [req-3137b291-7afb-4107-b3d6-5de2ef432865 req-c2cc1755-c2da-42d0-958b-355bfdd5f0c7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Received event network-vif-plugged-fe874242-d6f2-4922-8e79-b6545c0e8446 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:45:59 compute-1 nova_compute[183083]: 2026-01-26 08:45:59.344 183087 DEBUG oslo_concurrency.lockutils [req-3137b291-7afb-4107-b3d6-5de2ef432865 req-c2cc1755-c2da-42d0-958b-355bfdd5f0c7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:59 compute-1 nova_compute[183083]: 2026-01-26 08:45:59.344 183087 DEBUG oslo_concurrency.lockutils [req-3137b291-7afb-4107-b3d6-5de2ef432865 req-c2cc1755-c2da-42d0-958b-355bfdd5f0c7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:59 compute-1 nova_compute[183083]: 2026-01-26 08:45:59.344 183087 DEBUG oslo_concurrency.lockutils [req-3137b291-7afb-4107-b3d6-5de2ef432865 req-c2cc1755-c2da-42d0-958b-355bfdd5f0c7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:59 compute-1 nova_compute[183083]: 2026-01-26 08:45:59.344 183087 DEBUG nova.compute.manager [req-3137b291-7afb-4107-b3d6-5de2ef432865 req-c2cc1755-c2da-42d0-958b-355bfdd5f0c7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] No waiting events found dispatching network-vif-plugged-fe874242-d6f2-4922-8e79-b6545c0e8446 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:45:59 compute-1 nova_compute[183083]: 2026-01-26 08:45:59.345 183087 WARNING nova.compute.manager [req-3137b291-7afb-4107-b3d6-5de2ef432865 req-c2cc1755-c2da-42d0-958b-355bfdd5f0c7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Received unexpected event network-vif-plugged-fe874242-d6f2-4922-8e79-b6545c0e8446 for instance with vm_state active and task_state None.
Jan 26 08:45:59 compute-1 nova_compute[183083]: 2026-01-26 08:45:59.345 183087 DEBUG nova.compute.manager [req-3137b291-7afb-4107-b3d6-5de2ef432865 req-c2cc1755-c2da-42d0-958b-355bfdd5f0c7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Received event network-vif-plugged-fe874242-d6f2-4922-8e79-b6545c0e8446 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:45:59 compute-1 nova_compute[183083]: 2026-01-26 08:45:59.345 183087 DEBUG oslo_concurrency.lockutils [req-3137b291-7afb-4107-b3d6-5de2ef432865 req-c2cc1755-c2da-42d0-958b-355bfdd5f0c7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:59 compute-1 nova_compute[183083]: 2026-01-26 08:45:59.345 183087 DEBUG oslo_concurrency.lockutils [req-3137b291-7afb-4107-b3d6-5de2ef432865 req-c2cc1755-c2da-42d0-958b-355bfdd5f0c7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:59 compute-1 nova_compute[183083]: 2026-01-26 08:45:59.345 183087 DEBUG oslo_concurrency.lockutils [req-3137b291-7afb-4107-b3d6-5de2ef432865 req-c2cc1755-c2da-42d0-958b-355bfdd5f0c7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:59 compute-1 nova_compute[183083]: 2026-01-26 08:45:59.346 183087 DEBUG nova.compute.manager [req-3137b291-7afb-4107-b3d6-5de2ef432865 req-c2cc1755-c2da-42d0-958b-355bfdd5f0c7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] No waiting events found dispatching network-vif-plugged-fe874242-d6f2-4922-8e79-b6545c0e8446 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:45:59 compute-1 nova_compute[183083]: 2026-01-26 08:45:59.346 183087 WARNING nova.compute.manager [req-3137b291-7afb-4107-b3d6-5de2ef432865 req-c2cc1755-c2da-42d0-958b-355bfdd5f0c7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Received unexpected event network-vif-plugged-fe874242-d6f2-4922-8e79-b6545c0e8446 for instance with vm_state active and task_state None.
Jan 26 08:45:59 compute-1 nova_compute[183083]: 2026-01-26 08:45:59.346 183087 DEBUG nova.compute.manager [req-3137b291-7afb-4107-b3d6-5de2ef432865 req-c2cc1755-c2da-42d0-958b-355bfdd5f0c7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Received event network-vif-plugged-fe874242-d6f2-4922-8e79-b6545c0e8446 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:45:59 compute-1 nova_compute[183083]: 2026-01-26 08:45:59.346 183087 DEBUG oslo_concurrency.lockutils [req-3137b291-7afb-4107-b3d6-5de2ef432865 req-c2cc1755-c2da-42d0-958b-355bfdd5f0c7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:59 compute-1 nova_compute[183083]: 2026-01-26 08:45:59.346 183087 DEBUG oslo_concurrency.lockutils [req-3137b291-7afb-4107-b3d6-5de2ef432865 req-c2cc1755-c2da-42d0-958b-355bfdd5f0c7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:59 compute-1 nova_compute[183083]: 2026-01-26 08:45:59.347 183087 DEBUG oslo_concurrency.lockutils [req-3137b291-7afb-4107-b3d6-5de2ef432865 req-c2cc1755-c2da-42d0-958b-355bfdd5f0c7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:59 compute-1 nova_compute[183083]: 2026-01-26 08:45:59.347 183087 DEBUG nova.compute.manager [req-3137b291-7afb-4107-b3d6-5de2ef432865 req-c2cc1755-c2da-42d0-958b-355bfdd5f0c7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] No waiting events found dispatching network-vif-plugged-fe874242-d6f2-4922-8e79-b6545c0e8446 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:45:59 compute-1 nova_compute[183083]: 2026-01-26 08:45:59.347 183087 WARNING nova.compute.manager [req-3137b291-7afb-4107-b3d6-5de2ef432865 req-c2cc1755-c2da-42d0-958b-355bfdd5f0c7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Received unexpected event network-vif-plugged-fe874242-d6f2-4922-8e79-b6545c0e8446 for instance with vm_state active and task_state None.
Jan 26 08:45:59 compute-1 nova_compute[183083]: 2026-01-26 08:45:59.534 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:59 compute-1 nova_compute[183083]: 2026-01-26 08:45:59.768 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "refresh_cache-52d0b676-cf9c-4840-8b66-74ca8b13e2af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:45:59 compute-1 nova_compute[183083]: 2026-01-26 08:45:59.769 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquired lock "refresh_cache-52d0b676-cf9c-4840-8b66-74ca8b13e2af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:45:59 compute-1 nova_compute[183083]: 2026-01-26 08:45:59.769 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 26 08:45:59 compute-1 nova_compute[183083]: 2026-01-26 08:45:59.769 183087 DEBUG nova.objects.instance [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 52d0b676-cf9c-4840-8b66-74ca8b13e2af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:45:59 compute-1 nova_compute[183083]: 2026-01-26 08:45:59.841 183087 DEBUG oslo_concurrency.lockutils [None req-a9d234eb-e33a-420b-8841-5e2406329874 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Acquiring lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:59 compute-1 nova_compute[183083]: 2026-01-26 08:45:59.841 183087 DEBUG oslo_concurrency.lockutils [None req-a9d234eb-e33a-420b-8841-5e2406329874 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:59 compute-1 nova_compute[183083]: 2026-01-26 08:45:59.841 183087 DEBUG oslo_concurrency.lockutils [None req-a9d234eb-e33a-420b-8841-5e2406329874 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Acquiring lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:45:59 compute-1 nova_compute[183083]: 2026-01-26 08:45:59.842 183087 DEBUG oslo_concurrency.lockutils [None req-a9d234eb-e33a-420b-8841-5e2406329874 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:45:59 compute-1 nova_compute[183083]: 2026-01-26 08:45:59.842 183087 DEBUG oslo_concurrency.lockutils [None req-a9d234eb-e33a-420b-8841-5e2406329874 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:45:59 compute-1 nova_compute[183083]: 2026-01-26 08:45:59.844 183087 INFO nova.compute.manager [None req-a9d234eb-e33a-420b-8841-5e2406329874 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Terminating instance
Jan 26 08:45:59 compute-1 nova_compute[183083]: 2026-01-26 08:45:59.845 183087 DEBUG nova.compute.manager [None req-a9d234eb-e33a-420b-8841-5e2406329874 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:45:59 compute-1 kernel: tapfe874242-d6 (unregistering): left promiscuous mode
Jan 26 08:45:59 compute-1 NetworkManager[55451]: <info>  [1769417159.8705] device (tapfe874242-d6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 08:45:59 compute-1 nova_compute[183083]: 2026-01-26 08:45:59.882 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:59 compute-1 ovn_controller[95352]: 2026-01-26T08:45:59Z|00079|binding|INFO|Releasing lport fe874242-d6f2-4922-8e79-b6545c0e8446 from this chassis (sb_readonly=0)
Jan 26 08:45:59 compute-1 ovn_controller[95352]: 2026-01-26T08:45:59Z|00080|binding|INFO|Setting lport fe874242-d6f2-4922-8e79-b6545c0e8446 down in Southbound
Jan 26 08:45:59 compute-1 ovn_controller[95352]: 2026-01-26T08:45:59Z|00081|binding|INFO|Removing iface tapfe874242-d6 ovn-installed in OVS
Jan 26 08:45:59 compute-1 nova_compute[183083]: 2026-01-26 08:45:59.891 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:59 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:59.898 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:48:29:f7 10.100.0.24'], port_security=['fa:16:3e:48:29:f7 10.100.0.24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.24/28', 'neutron:device_id': '52d0b676-cf9c-4840-8b66-74ca8b13e2af', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fe7e4448-8407-46f2-95a0-344b2f6ecfd7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '605ef5b310d9405faa10f9c8f78d897f', 'neutron:revision_number': '8', 'neutron:security_group_ids': '18ebfc50-0648-455b-8cfa-279844a56dcf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.173'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=70edddae-db7d-4a4d-8983-c3a06c961ec1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=fe874242-d6f2-4922-8e79-b6545c0e8446) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:45:59 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:59.900 104632 INFO neutron.agent.ovn.metadata.agent [-] Port fe874242-d6f2-4922-8e79-b6545c0e8446 in datapath fe7e4448-8407-46f2-95a0-344b2f6ecfd7 unbound from our chassis
Jan 26 08:45:59 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:59.903 104632 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fe7e4448-8407-46f2-95a0-344b2f6ecfd7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 26 08:45:59 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:59.904 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[7a3c5e63-620f-449b-85b2-729dfc72ffb2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:45:59 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:45:59.905 104632 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7 namespace which is not needed anymore
Jan 26 08:45:59 compute-1 nova_compute[183083]: 2026-01-26 08:45:59.907 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:45:59 compute-1 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000009.scope: Deactivated successfully.
Jan 26 08:45:59 compute-1 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000009.scope: Consumed 12.352s CPU time.
Jan 26 08:45:59 compute-1 systemd-machined[154360]: Machine qemu-4-instance-00000009 terminated.
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.032 183087 DEBUG nova.network.neutron [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Updating instance_info_cache with network_info: [{"id": "fa92eb9f-99e0-4d4c-b3c5-1a7a395d05b3", "address": "fa:16:3e:50:b0:4a", "network": {"id": "df9ffb3b-ce9c-4169-b3c6-54f35f96ced1", "bridge": "br-int", "label": "tempest-test-network--156007843", "subnets": [{"cidr": "10.10.1.0/24", "dns": [], "gateway": {"address": "10.10.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.1.221", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a3694415e0ac483fa070e7316b146fc1", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa92eb9f-99", "ovs_interfaceid": "fa92eb9f-99e0-4d4c-b3c5-1a7a395d05b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.066 183087 DEBUG oslo_concurrency.lockutils [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Releasing lock "refresh_cache-2d7929e0-5231-4515-b513-bde34026aca7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.067 183087 DEBUG nova.compute.manager [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Instance network_info: |[{"id": "fa92eb9f-99e0-4d4c-b3c5-1a7a395d05b3", "address": "fa:16:3e:50:b0:4a", "network": {"id": "df9ffb3b-ce9c-4169-b3c6-54f35f96ced1", "bridge": "br-int", "label": "tempest-test-network--156007843", "subnets": [{"cidr": "10.10.1.0/24", "dns": [], "gateway": {"address": "10.10.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.1.221", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a3694415e0ac483fa070e7316b146fc1", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa92eb9f-99", "ovs_interfaceid": "fa92eb9f-99e0-4d4c-b3c5-1a7a395d05b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.069 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.070 183087 DEBUG oslo_concurrency.lockutils [req-5d0a3321-eeb0-40e5-839f-5975853a17b1 req-6ffc3402-8216-4c49-9947-e32605849340 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-2d7929e0-5231-4515-b513-bde34026aca7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.071 183087 DEBUG nova.network.neutron [req-5d0a3321-eeb0-40e5-839f-5975853a17b1 req-6ffc3402-8216-4c49-9947-e32605849340 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Refreshing network info cache for port fa92eb9f-99e0-4d4c-b3c5-1a7a395d05b3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.072 183087 INFO nova.compute.manager [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Terminating instance
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.074 183087 DEBUG nova.compute.manager [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.078 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.081 183087 DEBUG nova.virt.libvirt.driver [-] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.082 183087 INFO nova.virt.libvirt.driver [-] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Instance destroyed successfully.
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.083 183087 DEBUG nova.virt.libvirt.vif [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:45:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1756212357',display_name='tempest-server-test-1756212357',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1756212357',id=15,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMOAo9U7CQ6+MUYlZi/t9zlH/+rAp796cJ8WJs24Medi+Z15tpQDat3ArX4YiDk1KZuy6uq3/eoJDuJSWU3b/ZFp6PXjDw49aWrQKlvfg9FnwF0SFbwb28a5Q4WXO/+Jdg==',key_name='tempest-keypair-test-1873335547',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a3694415e0ac483fa070e7316b146fc1',ramdisk_id='',reservation_id='r-tozr9jmj',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestOvn-2033283083',owner_user_name='tempest-QosTestOvn-2033283083-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:45:49Z,user_data=None,user_id='add713470fcc438f95ec0ff89dbb2adc',uuid=2d7929e0-5231-4515-b513-bde34026aca7,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fa92eb9f-99e0-4d4c-b3c5-1a7a395d05b3", "address": "fa:16:3e:50:b0:4a", "network": {"id": "df9ffb3b-ce9c-4169-b3c6-54f35f96ced1", "bridge": "br-int", "label": "tempest-test-network--156007843", "subnets": [{"cidr": "10.10.1.0/24", "dns": [], "gateway": {"address": "10.10.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.1.221", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a3694415e0ac483fa070e7316b146fc1", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa92eb9f-99", "ovs_interfaceid": "fa92eb9f-99e0-4d4c-b3c5-1a7a395d05b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.083 183087 DEBUG nova.network.os_vif_util [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Converting VIF {"id": "fa92eb9f-99e0-4d4c-b3c5-1a7a395d05b3", "address": "fa:16:3e:50:b0:4a", "network": {"id": "df9ffb3b-ce9c-4169-b3c6-54f35f96ced1", "bridge": "br-int", "label": "tempest-test-network--156007843", "subnets": [{"cidr": "10.10.1.0/24", "dns": [], "gateway": {"address": "10.10.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.1.221", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a3694415e0ac483fa070e7316b146fc1", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa92eb9f-99", "ovs_interfaceid": "fa92eb9f-99e0-4d4c-b3c5-1a7a395d05b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.084 183087 DEBUG nova.network.os_vif_util [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:b0:4a,bridge_name='br-int',has_traffic_filtering=True,id=fa92eb9f-99e0-4d4c-b3c5-1a7a395d05b3,network=Network(df9ffb3b-ce9c-4169-b3c6-54f35f96ced1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapfa92eb9f-99') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.085 183087 DEBUG os_vif [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:b0:4a,bridge_name='br-int',has_traffic_filtering=True,id=fa92eb9f-99e0-4d4c-b3c5-1a7a395d05b3,network=Network(df9ffb3b-ce9c-4169-b3c6-54f35f96ced1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapfa92eb9f-99') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.089 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.090 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfa92eb9f-99, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.090 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.103 183087 INFO os_vif [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:b0:4a,bridge_name='br-int',has_traffic_filtering=True,id=fa92eb9f-99e0-4d4c-b3c5-1a7a395d05b3,network=Network(df9ffb3b-ce9c-4169-b3c6-54f35f96ced1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapfa92eb9f-99')
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.104 183087 INFO nova.virt.libvirt.driver [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Deleting instance files /var/lib/nova/instances/2d7929e0-5231-4515-b513-bde34026aca7_del
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.107 183087 INFO nova.virt.libvirt.driver [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Deletion of /var/lib/nova/instances/2d7929e0-5231-4515-b513-bde34026aca7_del complete
Jan 26 08:46:00 compute-1 neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7[213452]: [NOTICE]   (213456) : haproxy version is 2.8.14-c23fe91
Jan 26 08:46:00 compute-1 neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7[213452]: [NOTICE]   (213456) : path to executable is /usr/sbin/haproxy
Jan 26 08:46:00 compute-1 neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7[213452]: [WARNING]  (213456) : Exiting Master process...
Jan 26 08:46:00 compute-1 neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7[213452]: [ALERT]    (213456) : Current worker (213458) exited with code 143 (Terminated)
Jan 26 08:46:00 compute-1 neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7[213452]: [WARNING]  (213456) : All workers exited. Exiting... (0)
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.127 183087 INFO nova.virt.libvirt.driver [-] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Instance destroyed successfully.
Jan 26 08:46:00 compute-1 systemd[1]: libpod-9eb12efaa320ec80d462a95e6dbbb3a96ba65c241d7453e492434d0930f7c917.scope: Deactivated successfully.
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.128 183087 DEBUG nova.objects.instance [None req-a9d234eb-e33a-420b-8841-5e2406329874 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Lazy-loading 'resources' on Instance uuid 52d0b676-cf9c-4840-8b66-74ca8b13e2af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:46:00 compute-1 podman[213607]: 2026-01-26 08:46:00.135150037 +0000 UTC m=+0.084270616 container died 9eb12efaa320ec80d462a95e6dbbb3a96ba65c241d7453e492434d0930f7c917 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.144 183087 DEBUG nova.virt.libvirt.vif [None req-a9d234eb-e33a-420b-8841-5e2406329874 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T08:44:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1487231681',display_name='tempest-server-test-1487231681',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1487231681',id=9,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKwpz9wNHHnbJAOOiUppbeyhm9nehnF1Htd8OXU0NdYnRfrosih4iG9UOQxrdGQsm8346olWW9k6G/UQO8gqKcXblbCmZkMf68uZMMkWh2h2m7GZ6H2smfZpLCUdxJAWPg==',key_name='tempest-keypair-test-84578535',keypairs=<?>,launch_index=0,launched_at=2026-01-26T08:44:52Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='605ef5b310d9405faa10f9c8f78d897f',ramdisk_id='',reservation_id='r-e8sz2804',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-NetworkBasicTest-1834146433',owner_user_name='tempest-NetworkBasicTest-1834146433-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T08:45:41Z,user_data=None,user_id='4ba65f88f5c349ff8443da8191c3da2b',uuid=52d0b676-cf9c-4840-8b66-74ca8b13e2af,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fe874242-d6f2-4922-8e79-b6545c0e8446", "address": "fa:16:3e:48:29:f7", "network": {"id": "fe7e4448-8407-46f2-95a0-344b2f6ecfd7", "bridge": "br-int", "label": "tempest-test-network--1557689353", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "605ef5b310d9405faa10f9c8f78d897f", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe874242-d6", "ovs_interfaceid": "fe874242-d6f2-4922-8e79-b6545c0e8446", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.144 183087 DEBUG nova.network.os_vif_util [None req-a9d234eb-e33a-420b-8841-5e2406329874 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Converting VIF {"id": "fe874242-d6f2-4922-8e79-b6545c0e8446", "address": "fa:16:3e:48:29:f7", "network": {"id": "fe7e4448-8407-46f2-95a0-344b2f6ecfd7", "bridge": "br-int", "label": "tempest-test-network--1557689353", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "605ef5b310d9405faa10f9c8f78d897f", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe874242-d6", "ovs_interfaceid": "fe874242-d6f2-4922-8e79-b6545c0e8446", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.145 183087 DEBUG nova.network.os_vif_util [None req-a9d234eb-e33a-420b-8841-5e2406329874 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:48:29:f7,bridge_name='br-int',has_traffic_filtering=True,id=fe874242-d6f2-4922-8e79-b6545c0e8446,network=Network(fe7e4448-8407-46f2-95a0-344b2f6ecfd7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe874242-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.146 183087 DEBUG os_vif [None req-a9d234eb-e33a-420b-8841-5e2406329874 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:48:29:f7,bridge_name='br-int',has_traffic_filtering=True,id=fe874242-d6f2-4922-8e79-b6545c0e8446,network=Network(fe7e4448-8407-46f2-95a0-344b2f6ecfd7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe874242-d6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.148 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.148 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe874242-d6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.151 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.154 183087 INFO os_vif [None req-a9d234eb-e33a-420b-8841-5e2406329874 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:48:29:f7,bridge_name='br-int',has_traffic_filtering=True,id=fe874242-d6f2-4922-8e79-b6545c0e8446,network=Network(fe7e4448-8407-46f2-95a0-344b2f6ecfd7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe874242-d6')
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.155 183087 INFO nova.virt.libvirt.driver [None req-a9d234eb-e33a-420b-8841-5e2406329874 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Deleting instance files /var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af_del
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.156 183087 INFO nova.virt.libvirt.driver [None req-a9d234eb-e33a-420b-8841-5e2406329874 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Deletion of /var/lib/nova/instances/52d0b676-cf9c-4840-8b66-74ca8b13e2af_del complete
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.169 183087 INFO nova.compute.manager [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Took 0.09 seconds to destroy the instance on the hypervisor.
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.171 183087 DEBUG nova.compute.claims [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Aborting claim: <nova.compute.claims.Claim object at 0x7f6cb87ebdf0> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.171 183087 DEBUG oslo_concurrency.lockutils [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.172 183087 DEBUG oslo_concurrency.lockutils [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:00 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9eb12efaa320ec80d462a95e6dbbb3a96ba65c241d7453e492434d0930f7c917-userdata-shm.mount: Deactivated successfully.
Jan 26 08:46:00 compute-1 systemd[1]: var-lib-containers-storage-overlay-b01e9d8ab6e939c0f90240516d55490f615245ad2f8a2f9ad5a3d4beffcdc510-merged.mount: Deactivated successfully.
Jan 26 08:46:00 compute-1 podman[213607]: 2026-01-26 08:46:00.186588388 +0000 UTC m=+0.135708937 container cleanup 9eb12efaa320ec80d462a95e6dbbb3a96ba65c241d7453e492434d0930f7c917 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 26 08:46:00 compute-1 systemd[1]: libpod-conmon-9eb12efaa320ec80d462a95e6dbbb3a96ba65c241d7453e492434d0930f7c917.scope: Deactivated successfully.
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.243 183087 INFO nova.compute.manager [None req-a9d234eb-e33a-420b-8841-5e2406329874 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Took 0.40 seconds to destroy the instance on the hypervisor.
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.244 183087 DEBUG oslo.service.loopingcall [None req-a9d234eb-e33a-420b-8841-5e2406329874 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.244 183087 DEBUG nova.compute.manager [-] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.245 183087 DEBUG nova.network.neutron [-] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:46:00 compute-1 podman[213647]: 2026-01-26 08:46:00.283575404 +0000 UTC m=+0.063784203 container remove 9eb12efaa320ec80d462a95e6dbbb3a96ba65c241d7453e492434d0930f7c917 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 26 08:46:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:00.296 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[5492d9c1-746a-4755-98e3-24a1b79b877e]: (4, ('Mon Jan 26 08:46:00 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7 (9eb12efaa320ec80d462a95e6dbbb3a96ba65c241d7453e492434d0930f7c917)\n9eb12efaa320ec80d462a95e6dbbb3a96ba65c241d7453e492434d0930f7c917\nMon Jan 26 08:46:00 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7 (9eb12efaa320ec80d462a95e6dbbb3a96ba65c241d7453e492434d0930f7c917)\n9eb12efaa320ec80d462a95e6dbbb3a96ba65c241d7453e492434d0930f7c917\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:46:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:00.298 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[f6c74c2f-ee46-4872-a9f0-4c9ca4083057]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:46:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:00.300 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe7e4448-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.303 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:00 compute-1 kernel: tapfe7e4448-80: left promiscuous mode
Jan 26 08:46:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:00.321 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[0eec6e47-7e39-45f1-b4cb-a71486abf644]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.321 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:00.348 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[43cdea73-7f93-4ba6-b4cd-3c9d82481cf8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:46:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:00.350 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[84809a36-b610-49ae-8e82-248eed2442f4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.350 183087 DEBUG nova.compute.provider_tree [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.371 183087 DEBUG nova.scheduler.client.report [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:46:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:00.372 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[7f5c5cb1-5917-405f-9791-d781379e8fda]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 348199, 'reachable_time': 35168, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213665, 'error': None, 'target': 'ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:46:00 compute-1 systemd[1]: run-netns-ovnmeta\x2dfe7e4448\x2d8407\x2d46f2\x2d95a0\x2d344b2f6ecfd7.mount: Deactivated successfully.
Jan 26 08:46:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:00.376 105024 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fe7e4448-8407-46f2-95a0-344b2f6ecfd7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 26 08:46:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:00.376 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[6c4941d7-4ad6-4500-a8bc-dbafe7cf678b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.402 183087 DEBUG oslo_concurrency.lockutils [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.231s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.403 183087 DEBUG nova.compute.utils [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.405 183087 ERROR nova.compute.manager [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Build of instance 2d7929e0-5231-4515-b513-bde34026aca7 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format: nova.exception.BuildAbortException: Build of instance 2d7929e0-5231-4515-b513-bde34026aca7 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.406 183087 DEBUG nova.compute.manager [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.407 183087 DEBUG nova.virt.libvirt.vif [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-26T08:45:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1756212357',display_name='tempest-server-test-1756212357',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host=None,hostname='tempest-server-test-1756212357',id=15,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMOAo9U7CQ6+MUYlZi/t9zlH/+rAp796cJ8WJs24Medi+Z15tpQDat3ArX4YiDk1KZuy6uq3/eoJDuJSWU3b/ZFp6PXjDw49aWrQKlvfg9FnwF0SFbwb28a5Q4WXO/+Jdg==',key_name='tempest-keypair-test-1873335547',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node=None,numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a3694415e0ac483fa070e7316b146fc1',ramdisk_id='',reservation_id='r-tozr9jmj',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glan
ce_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestOvn-2033283083',owner_user_name='tempest-QosTestOvn-2033283083-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:46:00Z,user_data=None,user_id='add713470fcc438f95ec0ff89dbb2adc',uuid=2d7929e0-5231-4515-b513-bde34026aca7,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fa92eb9f-99e0-4d4c-b3c5-1a7a395d05b3", "address": "fa:16:3e:50:b0:4a", "network": {"id": "df9ffb3b-ce9c-4169-b3c6-54f35f96ced1", "bridge": "br-int", "label": "tempest-test-network--156007843", "subnets": [{"cidr": "10.10.1.0/24", "dns": [], "gateway": {"address": "10.10.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.1.221", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a3694415e0ac483fa070e7316b146fc1", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa92eb9f-99", "ovs_interfaceid": "fa92eb9f-99e0-4d4c-b3c5-1a7a395d05b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.407 183087 DEBUG nova.network.os_vif_util [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Converting VIF {"id": "fa92eb9f-99e0-4d4c-b3c5-1a7a395d05b3", "address": "fa:16:3e:50:b0:4a", "network": {"id": "df9ffb3b-ce9c-4169-b3c6-54f35f96ced1", "bridge": "br-int", "label": "tempest-test-network--156007843", "subnets": [{"cidr": "10.10.1.0/24", "dns": [], "gateway": {"address": "10.10.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.1.221", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a3694415e0ac483fa070e7316b146fc1", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa92eb9f-99", "ovs_interfaceid": "fa92eb9f-99e0-4d4c-b3c5-1a7a395d05b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.408 183087 DEBUG nova.network.os_vif_util [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:b0:4a,bridge_name='br-int',has_traffic_filtering=True,id=fa92eb9f-99e0-4d4c-b3c5-1a7a395d05b3,network=Network(df9ffb3b-ce9c-4169-b3c6-54f35f96ced1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapfa92eb9f-99') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.409 183087 DEBUG os_vif [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:b0:4a,bridge_name='br-int',has_traffic_filtering=True,id=fa92eb9f-99e0-4d4c-b3c5-1a7a395d05b3,network=Network(df9ffb3b-ce9c-4169-b3c6-54f35f96ced1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapfa92eb9f-99') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.411 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.411 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfa92eb9f-99, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.412 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.415 183087 INFO os_vif [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:b0:4a,bridge_name='br-int',has_traffic_filtering=True,id=fa92eb9f-99e0-4d4c-b3c5-1a7a395d05b3,network=Network(df9ffb3b-ce9c-4169-b3c6-54f35f96ced1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapfa92eb9f-99')
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.415 183087 DEBUG nova.compute.manager [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.416 183087 DEBUG nova.compute.manager [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:46:00 compute-1 nova_compute[183083]: 2026-01-26 08:46:00.416 183087 DEBUG nova.network.neutron [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:46:01 compute-1 nova_compute[183083]: 2026-01-26 08:46:01.900 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:02 compute-1 ovn_controller[95352]: 2026-01-26T08:46:02Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:dd:ab:11 192.168.0.175
Jan 26 08:46:02 compute-1 ovn_controller[95352]: 2026-01-26T08:46:02Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:dd:ab:11 192.168.0.175
Jan 26 08:46:02 compute-1 ovn_controller[95352]: 2026-01-26T08:46:02Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:81:c4:17 192.168.1.11
Jan 26 08:46:02 compute-1 ovn_controller[95352]: 2026-01-26T08:46:02Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:81:c4:17 192.168.1.11
Jan 26 08:46:03 compute-1 nova_compute[183083]: 2026-01-26 08:46:03.826 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Updating instance_info_cache with network_info: [{"id": "fe874242-d6f2-4922-8e79-b6545c0e8446", "address": "fa:16:3e:48:29:f7", "network": {"id": "fe7e4448-8407-46f2-95a0-344b2f6ecfd7", "bridge": "br-int", "label": "tempest-test-network--1557689353", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "605ef5b310d9405faa10f9c8f78d897f", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe874242-d6", "ovs_interfaceid": "fe874242-d6f2-4922-8e79-b6545c0e8446", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:46:03 compute-1 nova_compute[183083]: 2026-01-26 08:46:03.849 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Releasing lock "refresh_cache-52d0b676-cf9c-4840-8b66-74ca8b13e2af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:46:03 compute-1 nova_compute[183083]: 2026-01-26 08:46:03.849 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 26 08:46:03 compute-1 nova_compute[183083]: 2026-01-26 08:46:03.850 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:46:03 compute-1 nova_compute[183083]: 2026-01-26 08:46:03.850 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:46:03 compute-1 nova_compute[183083]: 2026-01-26 08:46:03.851 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:46:03 compute-1 nova_compute[183083]: 2026-01-26 08:46:03.851 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 08:46:03 compute-1 nova_compute[183083]: 2026-01-26 08:46:03.851 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:46:03 compute-1 nova_compute[183083]: 2026-01-26 08:46:03.879 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:03 compute-1 nova_compute[183083]: 2026-01-26 08:46:03.879 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:03 compute-1 nova_compute[183083]: 2026-01-26 08:46:03.880 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:03 compute-1 nova_compute[183083]: 2026-01-26 08:46:03.880 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 08:46:03 compute-1 nova_compute[183083]: 2026-01-26 08:46:03.952 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:46:04 compute-1 podman[213687]: 2026-01-26 08:46:04.016026265 +0000 UTC m=+0.077486903 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.043 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.045 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:46:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:04.086 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}9f525b99de9e60fa5dd5c0db6d28b6208422cec3c2386a10ddef5dc1128c5466" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Jan 26 08:46:04 compute-1 ovn_controller[95352]: 2026-01-26T08:46:04Z|00082|pinctrl|WARN|Dropped 12579 log messages in last 60 seconds (most recently, 2 seconds ago) due to excessive rate
Jan 26 08:46:04 compute-1 ovn_controller[95352]: 2026-01-26T08:46:04Z|00083|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.115 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.374 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.376 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13548MB free_disk=113.0717658996582GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.377 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.377 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.494 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Instance 52d0b676-cf9c-4840-8b66-74ca8b13e2af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.494 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Instance 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.540 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Instance 2d7929e0-5231-4515-b513-bde34026aca7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1692
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.541 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.541 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=768MB phys_disk=119GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.546 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.615 183087 DEBUG nova.network.neutron [-] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.642 183087 INFO nova.compute.manager [-] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Took 4.40 seconds to deallocate network for instance.
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.657 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.682 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.706 183087 DEBUG nova.compute.manager [req-0d7b7a44-47aa-461b-8747-b49ba1650c46 req-dd8dfbf7-b138-41b4-a8bb-a3c294340c7f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Received event network-vif-unplugged-fe874242-d6f2-4922-8e79-b6545c0e8446 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.706 183087 DEBUG oslo_concurrency.lockutils [req-0d7b7a44-47aa-461b-8747-b49ba1650c46 req-dd8dfbf7-b138-41b4-a8bb-a3c294340c7f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.709 183087 DEBUG oslo_concurrency.lockutils [req-0d7b7a44-47aa-461b-8747-b49ba1650c46 req-dd8dfbf7-b138-41b4-a8bb-a3c294340c7f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.709 183087 DEBUG oslo_concurrency.lockutils [req-0d7b7a44-47aa-461b-8747-b49ba1650c46 req-dd8dfbf7-b138-41b4-a8bb-a3c294340c7f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.710 183087 DEBUG nova.compute.manager [req-0d7b7a44-47aa-461b-8747-b49ba1650c46 req-dd8dfbf7-b138-41b4-a8bb-a3c294340c7f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] No waiting events found dispatching network-vif-unplugged-fe874242-d6f2-4922-8e79-b6545c0e8446 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.710 183087 DEBUG nova.compute.manager [req-0d7b7a44-47aa-461b-8747-b49ba1650c46 req-dd8dfbf7-b138-41b4-a8bb-a3c294340c7f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Received event network-vif-unplugged-fe874242-d6f2-4922-8e79-b6545c0e8446 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.711 183087 DEBUG nova.compute.manager [req-0d7b7a44-47aa-461b-8747-b49ba1650c46 req-dd8dfbf7-b138-41b4-a8bb-a3c294340c7f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Received event network-vif-plugged-fe874242-d6f2-4922-8e79-b6545c0e8446 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.711 183087 DEBUG oslo_concurrency.lockutils [req-0d7b7a44-47aa-461b-8747-b49ba1650c46 req-dd8dfbf7-b138-41b4-a8bb-a3c294340c7f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.712 183087 DEBUG oslo_concurrency.lockutils [req-0d7b7a44-47aa-461b-8747-b49ba1650c46 req-dd8dfbf7-b138-41b4-a8bb-a3c294340c7f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.712 183087 DEBUG oslo_concurrency.lockutils [req-0d7b7a44-47aa-461b-8747-b49ba1650c46 req-dd8dfbf7-b138-41b4-a8bb-a3c294340c7f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.713 183087 DEBUG nova.compute.manager [req-0d7b7a44-47aa-461b-8747-b49ba1650c46 req-dd8dfbf7-b138-41b4-a8bb-a3c294340c7f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] No waiting events found dispatching network-vif-plugged-fe874242-d6f2-4922-8e79-b6545c0e8446 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.713 183087 WARNING nova.compute.manager [req-0d7b7a44-47aa-461b-8747-b49ba1650c46 req-dd8dfbf7-b138-41b4-a8bb-a3c294340c7f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Received unexpected event network-vif-plugged-fe874242-d6f2-4922-8e79-b6545c0e8446 for instance with vm_state active and task_state deleting.
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.728 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.728 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.351s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.732 183087 DEBUG oslo_concurrency.lockutils [None req-a9d234eb-e33a-420b-8841-5e2406329874 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.732 183087 DEBUG oslo_concurrency.lockutils [None req-a9d234eb-e33a-420b-8841-5e2406329874 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.826 183087 DEBUG nova.compute.provider_tree [None req-a9d234eb-e33a-420b-8841-5e2406329874 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.842 183087 DEBUG nova.scheduler.client.report [None req-a9d234eb-e33a-420b-8841-5e2406329874 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.870 183087 DEBUG oslo_concurrency.lockutils [None req-a9d234eb-e33a-420b-8841-5e2406329874 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.966 183087 DEBUG nova.network.neutron [req-5d0a3321-eeb0-40e5-839f-5975853a17b1 req-6ffc3402-8216-4c49-9947-e32605849340 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Updated VIF entry in instance network info cache for port fa92eb9f-99e0-4d4c-b3c5-1a7a395d05b3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.967 183087 DEBUG nova.network.neutron [req-5d0a3321-eeb0-40e5-839f-5975853a17b1 req-6ffc3402-8216-4c49-9947-e32605849340 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Updating instance_info_cache with network_info: [{"id": "fa92eb9f-99e0-4d4c-b3c5-1a7a395d05b3", "address": "fa:16:3e:50:b0:4a", "network": {"id": "df9ffb3b-ce9c-4169-b3c6-54f35f96ced1", "bridge": "br-int", "label": "tempest-test-network--156007843", "subnets": [{"cidr": "10.10.1.0/24", "dns": [], "gateway": {"address": "10.10.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.1.221", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a3694415e0ac483fa070e7316b146fc1", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa92eb9f-99", "ovs_interfaceid": "fa92eb9f-99e0-4d4c-b3c5-1a7a395d05b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.991 183087 INFO nova.scheduler.client.report [None req-a9d234eb-e33a-420b-8841-5e2406329874 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Deleted allocations for instance 52d0b676-cf9c-4840-8b66-74ca8b13e2af
Jan 26 08:46:04 compute-1 nova_compute[183083]: 2026-01-26 08:46:04.995 183087 DEBUG oslo_concurrency.lockutils [req-5d0a3321-eeb0-40e5-839f-5975853a17b1 req-6ffc3402-8216-4c49-9947-e32605849340 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-2d7929e0-5231-4515-b513-bde34026aca7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:46:05 compute-1 nova_compute[183083]: 2026-01-26 08:46:05.068 183087 DEBUG oslo_concurrency.lockutils [None req-a9d234eb-e33a-420b-8841-5e2406329874 4ba65f88f5c349ff8443da8191c3da2b 605ef5b310d9405faa10f9c8f78d897f - - default default] Lock "52d0b676-cf9c-4840-8b66-74ca8b13e2af" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:05 compute-1 nova_compute[183083]: 2026-01-26 08:46:05.152 183087 INFO nova.compute.manager [None req-54e944d6-603a-47d1-aeb1-c07c4df804ef 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Get console output
Jan 26 08:46:05 compute-1 nova_compute[183083]: 2026-01-26 08:46:05.153 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:05 compute-1 nova_compute[183083]: 2026-01-26 08:46:05.161 212375 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 26 08:46:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:05.297 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.297 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 972 Content-Type: application/json Date: Mon, 26 Jan 2026 08:46:04 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-59635133-1dc5-4184-9579-b4faf465c955 x-openstack-request-id: req-59635133-1dc5-4184-9579-b4faf465c955 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.298 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "22222222-2222-2222-2222-222222222222", "name": "custom_neutron_guest", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/22222222-2222-2222-2222-222222222222"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/22222222-2222-2222-2222-222222222222"}]}, {"id": "5de535f4-b972-4630-b363-674a13f51d7a", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/5de535f4-b972-4630-b363-674a13f51d7a"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/5de535f4-b972-4630-b363-674a13f51d7a"}]}, {"id": "a7017323-cd35-470d-a718-faa6c6e97277", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/a7017323-cd35-470d-a718-faa6c6e97277"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/a7017323-cd35-470d-a718-faa6c6e97277"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.298 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-59635133-1dc5-4184-9579-b4faf465c955 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Jan 26 08:46:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:05.298 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:05.300 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.302 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/a7017323-cd35-470d-a718-faa6c6e97277 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}9f525b99de9e60fa5dd5c0db6d28b6208422cec3c2386a10ddef5dc1128c5466" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Jan 26 08:46:05 compute-1 nova_compute[183083]: 2026-01-26 08:46:05.398 183087 DEBUG nova.compute.manager [req-4112d479-fba3-408c-8ab8-2a1bca62c9b5 req-5aa6d9c1-c29b-4540-9be6-1c1383ca9605 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Received event network-vif-deleted-fe874242-d6f2-4922-8e79-b6545c0e8446 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:46:05 compute-1 nova_compute[183083]: 2026-01-26 08:46:05.469 183087 DEBUG nova.network.neutron [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:46:05 compute-1 nova_compute[183083]: 2026-01-26 08:46:05.499 183087 INFO nova.compute.manager [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: 2d7929e0-5231-4515-b513-bde34026aca7] Took 5.08 seconds to deallocate network for instance.
Jan 26 08:46:05 compute-1 ovn_controller[95352]: 2026-01-26T08:46:05Z|00084|binding|INFO|Releasing lport 809259ab-8ea4-4909-92b4-4ee536a51482 from this chassis (sb_readonly=0)
Jan 26 08:46:05 compute-1 ovn_controller[95352]: 2026-01-26T08:46:05Z|00085|binding|INFO|Releasing lport bfb9744a-58f0-4145-9e28-9c13225b3407 from this chassis (sb_readonly=0)
Jan 26 08:46:05 compute-1 nova_compute[183083]: 2026-01-26 08:46:05.584 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:05 compute-1 nova_compute[183083]: 2026-01-26 08:46:05.645 183087 INFO nova.scheduler.client.report [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Deleted allocations for instance 2d7929e0-5231-4515-b513-bde34026aca7
Jan 26 08:46:05 compute-1 nova_compute[183083]: 2026-01-26 08:46:05.645 183087 DEBUG oslo_concurrency.lockutils [None req-1186f0c4-8d31-4c2d-8572-907695cdaab8 add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Lock "2d7929e0-5231-4515-b513-bde34026aca7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.515s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.754 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Mon, 26 Jan 2026 08:46:05 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-81e373d9-1016-457a-ab56-65f8486c7bd0 x-openstack-request-id: req-81e373d9-1016-457a-ab56-65f8486c7bd0 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.754 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "a7017323-cd35-470d-a718-faa6c6e97277", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/a7017323-cd35-470d-a718-faa6c6e97277"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/a7017323-cd35-470d-a718-faa6c6e97277"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.754 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/a7017323-cd35-470d-a718-faa6c6e97277 used request id req-81e373d9-1016-457a-ab56-65f8486c7bd0 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.756 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'name': 'tempest-server-test-1154658508', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000b', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '4a559c36b13649d98b2995c099340eb9', 'user_id': '52d582094c584036ba3e04c9da69ee02', 'hostId': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.756 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.759 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8 / tap30b45c93-b3 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.760 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8 / tap71bea873-e0 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.760 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/network.outgoing.packets volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.761 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/network.outgoing.packets volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '838d49b3-a081-4457-a41c-aafe0f65f14f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 11, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'instance-0000000b-7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-tap30b45c93-b3', 'timestamp': '2026-01-26T08:46:05.756801', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'tap30b45c93-b3', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:dd:ab:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap30b45c93-b3'}, 'message_id': '73c06e78-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.418862419, 'message_signature': '8e22601e708e79a19b48b480d6adf694ef003c3effbb954cbede55d7f770a5af'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 11, 'user_id': 
'52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'instance-0000000b-7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-tap71bea873-e0', 'timestamp': '2026-01-26T08:46:05.756801', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'tap71bea873-e0', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:c4:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap71bea873-e0'}, 'message_id': '73c07d28-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.418862419, 'message_signature': '4e5763c69498edbc5c33ccda111a0d6b3099f09a959289b4aa39fd133153ac7a'}]}, 'timestamp': '2026-01-26 08:46:05.761838', '_unique_id': '377ab6d28ada4bffb00453c4f3e857e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.766 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.768 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.768 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.768 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-1154658508>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1154658508>]
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.769 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.769 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.769 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1154658508>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1154658508>]
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.769 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.769 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/network.incoming.packets volume: 8 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.769 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/network.incoming.packets volume: 8 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '21de1ead-5274-4a2c-bd60-878210ef9d22', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 8, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'instance-0000000b-7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-tap30b45c93-b3', 'timestamp': '2026-01-26T08:46:05.769566', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'tap30b45c93-b3', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:dd:ab:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap30b45c93-b3'}, 'message_id': '73c1b6e8-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.418862419, 'message_signature': 'c4ea2e518c64f3c672133fec70522d38632d84ae3ddca818bbec923286e1d9b9'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 8, 'user_id': 
'52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'instance-0000000b-7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-tap71bea873-e0', 'timestamp': '2026-01-26T08:46:05.769566', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'tap71bea873-e0', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:c4:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap71bea873-e0'}, 'message_id': '73c1c11a-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.418862419, 'message_signature': 'b52a681298eccbb3ef3c2099245521a0485a28f25b75caa47c3623ddedf4a1af'}]}, 'timestamp': '2026-01-26 08:46:05.770070', '_unique_id': 'cbbb1cf9bfad448b96c4a563679f125b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.770 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.771 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.771 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.771 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5dc742dc-d520-482b-a306-2c4adfda873d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'instance-0000000b-7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-tap30b45c93-b3', 'timestamp': '2026-01-26T08:46:05.771334', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'tap30b45c93-b3', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:dd:ab:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap30b45c93-b3'}, 'message_id': '73c1fb4e-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.418862419, 'message_signature': '6cd182f854bb1592ca8187772592dbf571c6ded275128d778a9eb9844e2bd234'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'instance-0000000b-7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-tap71bea873-e0', 'timestamp': '2026-01-26T08:46:05.771334', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'tap71bea873-e0', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:c4:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap71bea873-e0'}, 'message_id': '73c20382-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.418862419, 'message_signature': 'e0d76904bbe33ef8492f409b3c6bd93eb3e183c5e54e5f31ace1248b9641e7e6'}]}, 'timestamp': '2026-01-26 08:46:05.771764', '_unique_id': '85c28d1fcfd847a585f8836050ee4c86'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.772 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '36131d0f-68e5-4fea-a8fd-5f8f5e6d654a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'instance-0000000b-7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-tap30b45c93-b3', 'timestamp': '2026-01-26T08:46:05.772890', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'tap30b45c93-b3', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:dd:ab:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap30b45c93-b3'}, 'message_id': '73c237f8-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.418862419, 'message_signature': '784e81ce15e502d24047aa0dc010a4b0549986a1caa479c9a390d1218b4cc412'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'instance-0000000b-7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-tap71bea873-e0', 'timestamp': '2026-01-26T08:46:05.772890', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'tap71bea873-e0', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:c4:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap71bea873-e0'}, 'message_id': '73c24108-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.418862419, 'message_signature': '61b9978480f96727ff472f34c3bc15623a309bfecc49d0cb3f936aa9239e7a17'}]}, 'timestamp': '2026-01-26 08:46:05.773340', '_unique_id': '2460b3443584467fad424461c479b830'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.773 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.774 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.789 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/memory.usage volume: 40.48046875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f477bfd8-cd14-44bf-9c11-dae8527b6f89', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.48046875, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'timestamp': '2026-01-26T08:46:05.774468', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'instance-0000000b', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '73c4cafe-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.451333472, 'message_signature': '97c6c8790eb2aed297c464a858da50aaf2dd8522aea4178f41d81813b005b5cd'}]}, 'timestamp': '2026-01-26 08:46:05.790025', '_unique_id': '0dda2f90f96f48bfbcb2370d90aa18e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.790 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.791 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.819 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk.device.read.bytes volume: 29415936 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.820 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '46135045-3073-4787-9c93-e8de16e63cab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29415936, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-vda', 'timestamp': '2026-01-26T08:46:05.791820', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'instance-0000000b', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '73c964ba-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.453870544, 'message_signature': 'fffb9b2e782ca291574aed3dea933547b8a3f389eadb679c94b7dfc61ed24a0a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 
'7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-sda', 'timestamp': '2026-01-26T08:46:05.791820', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'instance-0000000b', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '73c96faa-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.453870544, 'message_signature': '3c75d21e0e7b3cd5b58913b8b2bf1bc948c9807f08c49588019f25a1dc1e5327'}]}, 'timestamp': '2026-01-26 08:46:05.820413', '_unique_id': '367605aeb5db443ba1a74ff1b1235b6f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.821 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk.device.read.latency volume: 218062505 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk.device.read.latency volume: 22252574 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b964d480-7256-47e3-8653-1254952db2e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 218062505, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-vda', 'timestamp': '2026-01-26T08:46:05.821967', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'instance-0000000b', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '73c9b5b4-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.453870544, 'message_signature': '6e305ecf350edb688f6ec19bdf5a246f14d5dadea41e9c0db4968b56c1eb822a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22252574, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 
'7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-sda', 'timestamp': '2026-01-26T08:46:05.821967', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'instance-0000000b', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '73c9bdb6-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.453870544, 'message_signature': '1533186022f3ceeaa1427541130da194896d2091a71e18619b0361ea0c2ec992'}]}, 'timestamp': '2026-01-26 08:46:05.822397', '_unique_id': '08c6f446a2b24d7e99f574390efb6795'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.822 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.823 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.823 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk.device.write.requests volume: 308 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.823 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e9cc8e64-76ea-4669-9f4d-94ae50d1012b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 308, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-vda', 'timestamp': '2026-01-26T08:46:05.823662', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'instance-0000000b', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '73c9f86c-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.453870544, 'message_signature': '1560aa7ad970094cbcd22e32e5dd37e14956002c5bed6d835c24cf2f5e892ece'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 
'resource_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-sda', 'timestamp': '2026-01-26T08:46:05.823662', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'instance-0000000b', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '73ca026c-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.453870544, 'message_signature': '778af684bc33beb94b26e08b9ef02794db3ff15130ad79f748b68e8b3fce5931'}]}, 'timestamp': '2026-01-26 08:46:05.824177', '_unique_id': '02918bffa9db464292da4fcaf0a23b2e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.824 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.825 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.825 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.825 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24bad9a3-cb57-4099-a8b3-fccdce25feaa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'instance-0000000b-7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-tap30b45c93-b3', 'timestamp': '2026-01-26T08:46:05.825269', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'tap30b45c93-b3', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:dd:ab:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap30b45c93-b3'}, 'message_id': '73ca3624-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.418862419, 'message_signature': '8ea1cc82516ff6678c18b20644e7706e0e461dc7222143f55af3f139ec6b84cc'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'instance-0000000b-7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-tap71bea873-e0', 'timestamp': '2026-01-26T08:46:05.825269', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'tap71bea873-e0', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:c4:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap71bea873-e0'}, 'message_id': '73ca3e08-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.418862419, 'message_signature': '09341682dea1c5d756bc5b06e7d4ca05b4259dbb1ef874bb92fe60aff4973d23'}]}, 'timestamp': '2026-01-26 08:46:05.825710', '_unique_id': 'b0f5b0ef9bd3419885027679bdf6561f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.826 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-1154658508>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1154658508>]
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.827 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.827 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.827 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4ea6ef8-a310-4116-ab39-4661936319aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'instance-0000000b-7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-tap30b45c93-b3', 'timestamp': '2026-01-26T08:46:05.827063', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'tap30b45c93-b3', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:dd:ab:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap30b45c93-b3'}, 'message_id': '73ca7cba-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.418862419, 'message_signature': '0c8eb4ffad577892edd3102e0a0f13c6b53e8d8a7d46f9ca9e11a10512e0fef3'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'instance-0000000b-7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-tap71bea873-e0', 'timestamp': '2026-01-26T08:46:05.827063', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'tap71bea873-e0', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:c4:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap71bea873-e0'}, 'message_id': '73ca84f8-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.418862419, 'message_signature': '49331b69bd5d0fbde3c2258bc78ef47d1ec21701c13665de6ccb52350ae9f59e'}]}, 'timestamp': '2026-01-26 08:46:05.827523', '_unique_id': '805d89bb42bd4c73a019a21c51e47f4e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.828 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.843 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.844 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cae973a9-635a-4f5f-a22a-317a8ac63089', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-vda', 'timestamp': '2026-01-26T08:46:05.828795', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'instance-0000000b', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '73cd172c-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.490882605, 'message_signature': 'c4f6e37da8e70bf055836565c098804f7e487e48904f8d036b65e1ac4c7760bd'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 
'7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-sda', 'timestamp': '2026-01-26T08:46:05.828795', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'instance-0000000b', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '73cd2a82-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.490882605, 'message_signature': '422b5873150986e5b6452441030a665bd05cdfe02141b1cfc313a21fdaaee5d9'}]}, 'timestamp': '2026-01-26 08:46:05.844962', '_unique_id': '702ab0de6ecc454f83fcc21a6d648a82'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.846 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.847 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.847 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.847 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a33028d-018d-4d66-b585-4ce49079d95d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-vda', 'timestamp': '2026-01-26T08:46:05.847522', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'instance-0000000b', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '73cda002-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.490882605, 'message_signature': '83b4d28e05ccbef3d574fc3924b68dc4623bc38af046dac011a33ae735847085'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 
'7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-sda', 'timestamp': '2026-01-26T08:46:05.847522', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'instance-0000000b', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '73cdb3b2-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.490882605, 'message_signature': 'f469f2f423d216097a45f73c21aa1c1e99c392d3eaeca746905ba5684647353a'}]}, 'timestamp': '2026-01-26 08:46:05.848503', '_unique_id': '36b902aa3e914bfeab0e460e79083004'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.849 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.850 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.850 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.851 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3aa02e19-0c74-4639-a424-036b3a72a18e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'instance-0000000b-7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-tap30b45c93-b3', 'timestamp': '2026-01-26T08:46:05.850856', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'tap30b45c93-b3', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:dd:ab:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap30b45c93-b3'}, 'message_id': '73ce246e-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.418862419, 'message_signature': 'e13cc5a4768278b4e2e1fd3ea62aeaa663d6f89cf5c060ed7dd9419421ed60b9'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'instance-0000000b-7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-tap71bea873-e0', 'timestamp': '2026-01-26T08:46:05.850856', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'tap71bea873-e0', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:c4:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap71bea873-e0'}, 'message_id': '73ce3896-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.418862419, 'message_signature': 'b46f4f5a134f82a1328dc17b634a614d665b694a47efce12a9b9df7d6439d5b7'}]}, 'timestamp': '2026-01-26 08:46:05.851889', '_unique_id': 'a538224027024f6c90d00f7575677130'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.852 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.854 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.854 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk.device.write.latency volume: 2299812540 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.854 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b98650ed-ce8e-4da8-9f42-e7d4770ea27e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2299812540, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-vda', 'timestamp': '2026-01-26T08:46:05.854353', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'instance-0000000b', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '73ceab28-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.453870544, 'message_signature': '8be1d928e1d7ab0615305e556cf4dc51881d33166caea6ab53c7f4311eba908d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-sda', 'timestamp': '2026-01-26T08:46:05.854353', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'instance-0000000b', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '73cebc62-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.453870544, 'message_signature': '77764d5e77b4725158df652ccc72228a4630358ade331026f07c86cf4e0fb09a'}]}, 'timestamp': '2026-01-26 08:46:05.855288', '_unique_id': '86b5c8fcd61d48d2b02c7471c05982f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.856 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.857 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.857 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/network.outgoing.bytes volume: 1326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.858 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/network.outgoing.bytes volume: 1374 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a633e249-e030-4b2d-b73c-8fb534b64b11', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1326, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'instance-0000000b-7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-tap30b45c93-b3', 'timestamp': '2026-01-26T08:46:05.857757', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'tap30b45c93-b3', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:dd:ab:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap30b45c93-b3'}, 'message_id': '73cf3002-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.418862419, 'message_signature': 'bf4ba457152226f08cc436ee3927a5002a8d4660747479ede9d85b13adbaf672'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1374, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'instance-0000000b-7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-tap71bea873-e0', 'timestamp': '2026-01-26T08:46:05.857757', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'tap71bea873-e0', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:c4:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap71bea873-e0'}, 'message_id': '73cf42ea-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.418862419, 'message_signature': 'd2b33d4383573bba5cde34b90c2bae9811b9aefcce1f856f07ceb07f5d69de6e'}]}, 'timestamp': '2026-01-26 08:46:05.858699', '_unique_id': 'e5277087c44f4bcb8e3d847a313e3fe7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.859 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.860 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.861 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/cpu volume: 11300000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba74f183-d711-453a-aed6-b93dd9e67032', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11300000000, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'timestamp': '2026-01-26T08:46:05.861043', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'instance-0000000b', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '73cfb194-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.451333472, 'message_signature': 'db72855af0955eb7d0cf1c0b6b878a40434559850603c9765a3a7f0633ea5452'}]}, 'timestamp': '2026-01-26 08:46:05.861533', '_unique_id': '159abfb92a654020b3bb5a17119b470c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.862 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.863 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.863 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk.device.read.requests volume: 1063 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.864 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '40e7853a-3dce-407b-a76a-7086e363b6f5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1063, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-vda', 'timestamp': '2026-01-26T08:46:05.863955', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'instance-0000000b', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '73d02322-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.453870544, 'message_signature': '4aa57ce0f5846fbf8e1052f6e6b6d6f2857c50b81cdc2b06bc8230ea0de7f6a3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 
'resource_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-sda', 'timestamp': '2026-01-26T08:46:05.863955', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'instance-0000000b', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '73d034c0-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.453870544, 'message_signature': 'dc1f05058c1f2696b091750a1cb441812c31ee1bca35fd493576e836aa30242c'}]}, 'timestamp': '2026-01-26 08:46:05.864875', '_unique_id': 'abeed81d988c45efbc7f3de9072537ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.866 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.867 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.867 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.868 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1154658508>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1154658508>]
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.868 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.868 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/network.incoming.bytes volume: 1262 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.869 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/network.incoming.bytes volume: 1240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5237033f-4f6b-4f8a-8e6e-7022e0b91eba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1262, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'instance-0000000b-7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-tap30b45c93-b3', 'timestamp': '2026-01-26T08:46:05.868684', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'tap30b45c93-b3', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:dd:ab:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap30b45c93-b3'}, 'message_id': '73d0dbbe-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.418862419, 'message_signature': '4430c69e39ec53049622c031c26c454525b6bc96dc3f5bcc95e9c29908132af1'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1240, 'user_id': 
'52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'instance-0000000b-7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-tap71bea873-e0', 'timestamp': '2026-01-26T08:46:05.868684', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'tap71bea873-e0', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:c4:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap71bea873-e0'}, 'message_id': '73d0eb22-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.418862419, 'message_signature': '88adfbd6f341393ec097856509100cea73f4c1b15815538957409cedee3fa3cb'}]}, 'timestamp': '2026-01-26 08:46:05.869493', '_unique_id': 'c70b4de38c8744d1b56b9d397605fd7e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.870 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.871 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.871 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk.device.write.bytes volume: 72769536 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.871 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ff760c5-76fc-4799-a7f5-197040ce316b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72769536, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-vda', 'timestamp': '2026-01-26T08:46:05.871127', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'instance-0000000b', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '73d1374e-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.453870544, 'message_signature': 'ca739b165b601a84329420a242533078029d5bebf4c4a88379dd6330143c7be2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 
'7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-sda', 'timestamp': '2026-01-26T08:46:05.871127', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'instance-0000000b', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '73d1423e-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.453870544, 'message_signature': '2b972c76f23715e85448e3578054e3130ea264b68ee0598b5cb7a7a550a119e4'}]}, 'timestamp': '2026-01-26 08:46:05.871708', '_unique_id': 'b5b959786db14942a727cb64bd7a85ea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.872 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.873 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.873 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.873 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd036cac9-3d61-4933-9bd4-f6a5905cdc62', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-vda', 'timestamp': '2026-01-26T08:46:05.873358', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'instance-0000000b', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '73d18eb0-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.490882605, 'message_signature': '778133d9dc64763faabe784533871c00be0e11ffef74e1f96e48222f862e773c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 
'7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-sda', 'timestamp': '2026-01-26T08:46:05.873358', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'instance-0000000b', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '73d199aa-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.490882605, 'message_signature': '990157fca27173f67065af4e95d7c913f5060228d2d335e510d67fe8f9b76ada'}]}, 'timestamp': '2026-01-26 08:46:05.873944', '_unique_id': '872bce9c5c704f8d986c6caf52a684a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.874 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.875 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.875 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.876 12 DEBUG ceilometer.compute.pollsters [-] 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a8190df7-4980-4847-9ef7-edb200180a84', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'instance-0000000b-7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-tap30b45c93-b3', 'timestamp': '2026-01-26T08:46:05.875694', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'tap30b45c93-b3', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:dd:ab:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap30b45c93-b3'}, 'message_id': '73d1e9c8-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.418862419, 'message_signature': 'e8596f1c47937ebde693dc1897c89b8c30b79da4e20446893d23a059a8739f9f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'instance-0000000b-7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-tap71bea873-e0', 'timestamp': '2026-01-26T08:46:05.875694', 'resource_metadata': {'display_name': 'tempest-server-test-1154658508', 'name': 'tap71bea873-e0', 'instance_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:c4:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap71bea873-e0'}, 'message_id': '73d1f6fc-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3506.418862419, 'message_signature': '8bfd9f1d7f67043a39251ccfcdbc41d59adaf7838f74ffe28b53003e5f612ac9'}]}, 'timestamp': '2026-01-26 08:46:05.876360', '_unique_id': '33de8814add64ed8aa389e73d875a31d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:46:05 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:46:05.877 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:46:06 compute-1 ovn_controller[95352]: 2026-01-26T08:46:06Z|00086|binding|INFO|Releasing lport 809259ab-8ea4-4909-92b4-4ee536a51482 from this chassis (sb_readonly=0)
Jan 26 08:46:06 compute-1 ovn_controller[95352]: 2026-01-26T08:46:06Z|00087|binding|INFO|Releasing lport bfb9744a-58f0-4145-9e28-9c13225b3407 from this chassis (sb_readonly=0)
Jan 26 08:46:06 compute-1 nova_compute[183083]: 2026-01-26 08:46:06.147 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:07 compute-1 nova_compute[183083]: 2026-01-26 08:46:07.587 183087 DEBUG oslo_concurrency.lockutils [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Acquiring lock "9001392f-3418-47ad-916e-500f88865c32" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:07 compute-1 nova_compute[183083]: 2026-01-26 08:46:07.588 183087 DEBUG oslo_concurrency.lockutils [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Lock "9001392f-3418-47ad-916e-500f88865c32" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:07 compute-1 nova_compute[183083]: 2026-01-26 08:46:07.617 183087 DEBUG nova.compute.manager [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 9001392f-3418-47ad-916e-500f88865c32] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:46:07 compute-1 nova_compute[183083]: 2026-01-26 08:46:07.688 183087 DEBUG oslo_concurrency.lockutils [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:07 compute-1 nova_compute[183083]: 2026-01-26 08:46:07.689 183087 DEBUG oslo_concurrency.lockutils [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:07 compute-1 nova_compute[183083]: 2026-01-26 08:46:07.696 183087 DEBUG nova.virt.hardware [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:46:07 compute-1 nova_compute[183083]: 2026-01-26 08:46:07.697 183087 INFO nova.compute.claims [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 9001392f-3418-47ad-916e-500f88865c32] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:46:07 compute-1 nova_compute[183083]: 2026-01-26 08:46:07.857 183087 DEBUG nova.compute.provider_tree [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:46:07 compute-1 nova_compute[183083]: 2026-01-26 08:46:07.882 183087 DEBUG nova.scheduler.client.report [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:46:07 compute-1 nova_compute[183083]: 2026-01-26 08:46:07.934 183087 DEBUG oslo_concurrency.lockutils [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.246s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:07 compute-1 nova_compute[183083]: 2026-01-26 08:46:07.935 183087 DEBUG nova.compute.manager [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 9001392f-3418-47ad-916e-500f88865c32] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:46:07 compute-1 nova_compute[183083]: 2026-01-26 08:46:07.984 183087 DEBUG nova.compute.manager [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 9001392f-3418-47ad-916e-500f88865c32] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:46:07 compute-1 nova_compute[183083]: 2026-01-26 08:46:07.984 183087 DEBUG nova.network.neutron [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 9001392f-3418-47ad-916e-500f88865c32] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:46:08 compute-1 nova_compute[183083]: 2026-01-26 08:46:08.003 183087 INFO nova.virt.libvirt.driver [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 9001392f-3418-47ad-916e-500f88865c32] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:46:08 compute-1 nova_compute[183083]: 2026-01-26 08:46:08.024 183087 DEBUG nova.compute.manager [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 9001392f-3418-47ad-916e-500f88865c32] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:46:08 compute-1 nova_compute[183083]: 2026-01-26 08:46:08.117 183087 DEBUG nova.compute.manager [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 9001392f-3418-47ad-916e-500f88865c32] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:46:08 compute-1 nova_compute[183083]: 2026-01-26 08:46:08.118 183087 DEBUG nova.virt.libvirt.driver [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 9001392f-3418-47ad-916e-500f88865c32] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:46:08 compute-1 nova_compute[183083]: 2026-01-26 08:46:08.119 183087 INFO nova.virt.libvirt.driver [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 9001392f-3418-47ad-916e-500f88865c32] Creating image(s)
Jan 26 08:46:08 compute-1 nova_compute[183083]: 2026-01-26 08:46:08.119 183087 DEBUG oslo_concurrency.lockutils [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Acquiring lock "/var/lib/nova/instances/9001392f-3418-47ad-916e-500f88865c32/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:08 compute-1 nova_compute[183083]: 2026-01-26 08:46:08.119 183087 DEBUG oslo_concurrency.lockutils [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Lock "/var/lib/nova/instances/9001392f-3418-47ad-916e-500f88865c32/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:08 compute-1 nova_compute[183083]: 2026-01-26 08:46:08.120 183087 DEBUG oslo_concurrency.lockutils [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Lock "/var/lib/nova/instances/9001392f-3418-47ad-916e-500f88865c32/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:08 compute-1 nova_compute[183083]: 2026-01-26 08:46:08.121 183087 DEBUG oslo_concurrency.lockutils [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:08 compute-1 nova_compute[183083]: 2026-01-26 08:46:08.121 183087 DEBUG oslo_concurrency.lockutils [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.062 183087 DEBUG oslo_concurrency.lockutils [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.941s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 9001392f-3418-47ad-916e-500f88865c32] Instance failed to spawn: nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32] Traceback (most recent call last):
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 170, in do_image_deep_inspection
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32]     raise exception.ImageUnacceptable(
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image content does not match disk_format
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32] 
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32] During handling of the above exception, another exception occurred:
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32] 
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32] Traceback (most recent call last):
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32]     yield resources
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32]     self.driver.spawn(context, instance, image_meta,
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4384, in spawn
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32]     created_instance_dir, created_disks = self._create_image(
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4786, in _create_image
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32]     created_disks = self._create_and_inject_local_root(
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4917, in _create_and_inject_local_root
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32]     self._try_fetch_image_cache(backend, fetch_func, context,
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 10963, in _try_fetch_image_cache
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32]     image.cache(fetch_func=fetch_func,
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 288, in cache
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32]     self.create_image(fetch_func_sync, base, size,
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 663, in create_image
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32]     prepare_template(target=base, *args, **kwargs)
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32]   File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32]     return f(*args, **kwargs)
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 285, in fetch_func_sync
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32]     fetch_func(target=target, *args, **kwargs)
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/utils.py", line 482, in fetch_image
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32]     images.fetch_to_raw(context, image_id, target, trusted_certs)
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 199, in fetch_to_raw
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32]     force_format = do_image_deep_inspection(img, image_href, path_tmp)
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 180, in do_image_deep_inspection
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32]     raise exception.ImageUnacceptable(
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.063 183087 ERROR nova.compute.manager [instance: 9001392f-3418-47ad-916e-500f88865c32] 
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.085 183087 DEBUG nova.policy [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '41a09f4d7f034b1c85f20c9512d33411', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '71cced1777f24868932d789154ff04a0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.223 183087 DEBUG oslo_concurrency.lockutils [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Acquiring lock "1971a833-1317-47c9-8128-d849f8397edf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.224 183087 DEBUG oslo_concurrency.lockutils [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "1971a833-1317-47c9-8128-d849f8397edf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.246 183087 DEBUG nova.compute.manager [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.324 183087 DEBUG oslo_concurrency.lockutils [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.325 183087 DEBUG oslo_concurrency.lockutils [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.332 183087 DEBUG nova.virt.hardware [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.333 183087 INFO nova.compute.claims [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.481 183087 DEBUG nova.compute.provider_tree [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.499 183087 DEBUG nova.scheduler.client.report [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.540 183087 DEBUG oslo_concurrency.lockutils [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.215s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.540 183087 DEBUG nova.compute.manager [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.544 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.563 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:09 compute-1 NetworkManager[55451]: <info>  [1769417169.5641] manager: (patch-provnet-149e76db-406a-40c9-b6a7-879b1da420de-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Jan 26 08:46:09 compute-1 NetworkManager[55451]: <info>  [1769417169.5650] manager: (patch-br-int-to-provnet-149e76db-406a-40c9-b6a7-879b1da420de): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.603 183087 DEBUG nova.compute.manager [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.604 183087 DEBUG nova.network.neutron [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.637 183087 INFO nova.virt.libvirt.driver [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.656 183087 DEBUG nova.compute.manager [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.751 183087 DEBUG nova.compute.manager [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.753 183087 DEBUG nova.virt.libvirt.driver [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.754 183087 INFO nova.virt.libvirt.driver [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] Creating image(s)
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.755 183087 DEBUG oslo_concurrency.lockutils [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Acquiring lock "/var/lib/nova/instances/1971a833-1317-47c9-8128-d849f8397edf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.756 183087 DEBUG oslo_concurrency.lockutils [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "/var/lib/nova/instances/1971a833-1317-47c9-8128-d849f8397edf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.757 183087 DEBUG oslo_concurrency.lockutils [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "/var/lib/nova/instances/1971a833-1317-47c9-8128-d849f8397edf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.757 183087 DEBUG oslo_concurrency.lockutils [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.758 183087 DEBUG oslo_concurrency.lockutils [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:09 compute-1 nova_compute[183083]: 2026-01-26 08:46:09.989 183087 DEBUG nova.policy [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a7abeebb4e4d469c91e6cee77f6be1c3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b71ae2b9d2fd454b8b3b9aa1a0e5c7e4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.047 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:10 compute-1 ovn_controller[95352]: 2026-01-26T08:46:10Z|00088|binding|INFO|Releasing lport 809259ab-8ea4-4909-92b4-4ee536a51482 from this chassis (sb_readonly=0)
Jan 26 08:46:10 compute-1 ovn_controller[95352]: 2026-01-26T08:46:10Z|00089|binding|INFO|Releasing lport bfb9744a-58f0-4145-9e28-9c13225b3407 from this chassis (sb_readonly=0)
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.140 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.151 183087 DEBUG nova.compute.manager [req-b0d9df2a-9235-4525-86b9-029378f907ed req-8027f4f4-9f74-4715-9595-9e204394d830 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Received event network-changed-30b45c93-b3bb-44e6-8e4a-6903a631c773 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.151 183087 DEBUG nova.compute.manager [req-b0d9df2a-9235-4525-86b9-029378f907ed req-8027f4f4-9f74-4715-9595-9e204394d830 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Refreshing instance network info cache due to event network-changed-30b45c93-b3bb-44e6-8e4a-6903a631c773. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.152 183087 DEBUG oslo_concurrency.lockutils [req-b0d9df2a-9235-4525-86b9-029378f907ed req-8027f4f4-9f74-4715-9595-9e204394d830 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-7fb5f66c-db87-49bd-8c08-1c21b7ea58e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.152 183087 DEBUG oslo_concurrency.lockutils [req-b0d9df2a-9235-4525-86b9-029378f907ed req-8027f4f4-9f74-4715-9595-9e204394d830 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-7fb5f66c-db87-49bd-8c08-1c21b7ea58e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.152 183087 DEBUG nova.network.neutron [req-b0d9df2a-9235-4525-86b9-029378f907ed req-8027f4f4-9f74-4715-9595-9e204394d830 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Refreshing network info cache for port 30b45c93-b3bb-44e6-8e4a-6903a631c773 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.155 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.763 183087 DEBUG oslo_concurrency.lockutils [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] Instance failed to spawn: nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf] Traceback (most recent call last):
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 170, in do_image_deep_inspection
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf]     raise exception.ImageUnacceptable(
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image content does not match disk_format
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf] 
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf] During handling of the above exception, another exception occurred:
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf] 
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf] Traceback (most recent call last):
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf]     yield resources
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf]     self.driver.spawn(context, instance, image_meta,
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4384, in spawn
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf]     created_instance_dir, created_disks = self._create_image(
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4786, in _create_image
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf]     created_disks = self._create_and_inject_local_root(
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4917, in _create_and_inject_local_root
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf]     self._try_fetch_image_cache(backend, fetch_func, context,
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 10963, in _try_fetch_image_cache
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf]     image.cache(fetch_func=fetch_func,
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 288, in cache
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf]     self.create_image(fetch_func_sync, base, size,
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 663, in create_image
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf]     prepare_template(target=base, *args, **kwargs)
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf]   File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf]     return f(*args, **kwargs)
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 285, in fetch_func_sync
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf]     fetch_func(target=target, *args, **kwargs)
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/utils.py", line 482, in fetch_image
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf]     images.fetch_to_raw(context, image_id, target, trusted_certs)
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 199, in fetch_to_raw
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf]     force_format = do_image_deep_inspection(img, image_href, path_tmp)
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 180, in do_image_deep_inspection
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf]     raise exception.ImageUnacceptable(
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:46:10 compute-1 nova_compute[183083]: 2026-01-26 08:46:10.764 183087 ERROR nova.compute.manager [instance: 1971a833-1317-47c9-8128-d849f8397edf] 
Jan 26 08:46:11 compute-1 nova_compute[183083]: 2026-01-26 08:46:11.171 183087 DEBUG nova.network.neutron [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 9001392f-3418-47ad-916e-500f88865c32] Successfully updated port: dd68ce69-6136-4685-b76e-3d9ca72d5f1e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:46:11 compute-1 nova_compute[183083]: 2026-01-26 08:46:11.194 183087 DEBUG oslo_concurrency.lockutils [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Acquiring lock "refresh_cache-9001392f-3418-47ad-916e-500f88865c32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:46:11 compute-1 nova_compute[183083]: 2026-01-26 08:46:11.194 183087 DEBUG oslo_concurrency.lockutils [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Acquired lock "refresh_cache-9001392f-3418-47ad-916e-500f88865c32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:46:11 compute-1 nova_compute[183083]: 2026-01-26 08:46:11.195 183087 DEBUG nova.network.neutron [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 9001392f-3418-47ad-916e-500f88865c32] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:46:11 compute-1 nova_compute[183083]: 2026-01-26 08:46:11.355 183087 DEBUG nova.compute.manager [req-d126b17f-fa00-4003-8104-ab7029379b63 req-7d351a5d-6745-4362-b1e8-bfd9e6234c8a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 9001392f-3418-47ad-916e-500f88865c32] Received event network-changed-dd68ce69-6136-4685-b76e-3d9ca72d5f1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:46:11 compute-1 nova_compute[183083]: 2026-01-26 08:46:11.356 183087 DEBUG nova.compute.manager [req-d126b17f-fa00-4003-8104-ab7029379b63 req-7d351a5d-6745-4362-b1e8-bfd9e6234c8a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 9001392f-3418-47ad-916e-500f88865c32] Refreshing instance network info cache due to event network-changed-dd68ce69-6136-4685-b76e-3d9ca72d5f1e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:46:11 compute-1 nova_compute[183083]: 2026-01-26 08:46:11.356 183087 DEBUG oslo_concurrency.lockutils [req-d126b17f-fa00-4003-8104-ab7029379b63 req-7d351a5d-6745-4362-b1e8-bfd9e6234c8a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-9001392f-3418-47ad-916e-500f88865c32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:46:11 compute-1 nova_compute[183083]: 2026-01-26 08:46:11.464 183087 DEBUG nova.network.neutron [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 9001392f-3418-47ad-916e-500f88865c32] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:46:11 compute-1 nova_compute[183083]: 2026-01-26 08:46:11.767 183087 DEBUG nova.network.neutron [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] Successfully created port: b537c620-03b7-460a-a88d-1ea88e276912 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 08:46:12 compute-1 nova_compute[183083]: 2026-01-26 08:46:12.629 183087 DEBUG nova.network.neutron [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 9001392f-3418-47ad-916e-500f88865c32] Updating instance_info_cache with network_info: [{"id": "dd68ce69-6136-4685-b76e-3d9ca72d5f1e", "address": "fa:16:3e:33:89:a6", "network": {"id": "025bb731-9b63-4a5d-b85e-b1832de66a48", "bridge": "br-int", "label": "tempest-OvnExtraDhcpOptionsTest-154566894-test_extra_dhcp_opts_disabled_enabled_dhcp4", "subnets": [{"cidr": "192.168.4.0/24", "dns": [], "gateway": {"address": "192.168.4.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.4.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "71cced1777f24868932d789154ff04a0", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd68ce69-61", "ovs_interfaceid": "dd68ce69-6136-4685-b76e-3d9ca72d5f1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:46:12 compute-1 nova_compute[183083]: 2026-01-26 08:46:12.666 183087 DEBUG oslo_concurrency.lockutils [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Releasing lock "refresh_cache-9001392f-3418-47ad-916e-500f88865c32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:46:12 compute-1 nova_compute[183083]: 2026-01-26 08:46:12.667 183087 DEBUG nova.compute.manager [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 9001392f-3418-47ad-916e-500f88865c32] Instance network_info: |[{"id": "dd68ce69-6136-4685-b76e-3d9ca72d5f1e", "address": "fa:16:3e:33:89:a6", "network": {"id": "025bb731-9b63-4a5d-b85e-b1832de66a48", "bridge": "br-int", "label": "tempest-OvnExtraDhcpOptionsTest-154566894-test_extra_dhcp_opts_disabled_enabled_dhcp4", "subnets": [{"cidr": "192.168.4.0/24", "dns": [], "gateway": {"address": "192.168.4.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.4.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "71cced1777f24868932d789154ff04a0", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd68ce69-61", "ovs_interfaceid": "dd68ce69-6136-4685-b76e-3d9ca72d5f1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:46:12 compute-1 nova_compute[183083]: 2026-01-26 08:46:12.668 183087 DEBUG oslo_concurrency.lockutils [req-d126b17f-fa00-4003-8104-ab7029379b63 req-7d351a5d-6745-4362-b1e8-bfd9e6234c8a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-9001392f-3418-47ad-916e-500f88865c32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:46:12 compute-1 nova_compute[183083]: 2026-01-26 08:46:12.669 183087 DEBUG nova.network.neutron [req-d126b17f-fa00-4003-8104-ab7029379b63 req-7d351a5d-6745-4362-b1e8-bfd9e6234c8a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 9001392f-3418-47ad-916e-500f88865c32] Refreshing network info cache for port dd68ce69-6136-4685-b76e-3d9ca72d5f1e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:46:12 compute-1 nova_compute[183083]: 2026-01-26 08:46:12.671 183087 INFO nova.compute.manager [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 9001392f-3418-47ad-916e-500f88865c32] Terminating instance
Jan 26 08:46:12 compute-1 nova_compute[183083]: 2026-01-26 08:46:12.673 183087 DEBUG nova.compute.manager [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 9001392f-3418-47ad-916e-500f88865c32] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:46:12 compute-1 nova_compute[183083]: 2026-01-26 08:46:12.680 183087 DEBUG nova.virt.libvirt.driver [-] [instance: 9001392f-3418-47ad-916e-500f88865c32] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527
Jan 26 08:46:12 compute-1 nova_compute[183083]: 2026-01-26 08:46:12.680 183087 INFO nova.virt.libvirt.driver [-] [instance: 9001392f-3418-47ad-916e-500f88865c32] Instance destroyed successfully.
Jan 26 08:46:12 compute-1 nova_compute[183083]: 2026-01-26 08:46:12.681 183087 DEBUG nova.virt.libvirt.vif [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T08:46:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-OvnExtraDhcpOptionsTest-154566894-test_extra_dhcp_opts_disabled_enabled_dhcp4',display_name='tempest-OvnExtraDhcpOptionsTest-154566894-test_extra_dhcp_opts_disabled_enabled_dhcp4',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-ovnextradhcpoptionstest-154566894-test-extra-dhcp-opts',id=16,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPlGW+ytSlZhYgiyJ2VQXbYJFNaJudegvaO8arsRLbrIovURYus5bHVwpbhRN5uNXuMSQSnvVfbQff2hzPqkCc7D0963u0+MNy7sXLIRMgXOJArypDGQ7Qj3yNlotCHO0A==',key_name='tempest-OvnExtraDhcpOptionsTest-154566894',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71cced1777f24868932d789154ff04a0',ramdisk_id='',reservation_id='r-44ilk45b',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-OvnExtraDhcpOptionsTest-1195747792',owner_user_name='tempest-OvnExtraDhcpOptionsTest-1195747792-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:46:08Z,user_data=None,user_id='41a09f4d7f034b1c85f20c9512d33411',uuid=9001392f-3418-47ad-916e-500f88865c32,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dd68ce69-6136-4685-b76e-3d9ca72d5f1e", "address": "fa:16:3e:33:89:a6", "network": {"id": "025bb731-9b63-4a5d-b85e-b1832de66a48", "bridge": "br-int", "label": "tempest-OvnExtraDhcpOptionsTest-154566894-test_extra_dhcp_opts_disabled_enabled_dhcp4", "subnets": [{"cidr": "192.168.4.0/24", "dns": [], "gateway": {"address": "192.168.4.1", 
"type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.4.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "71cced1777f24868932d789154ff04a0", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd68ce69-61", "ovs_interfaceid": "dd68ce69-6136-4685-b76e-3d9ca72d5f1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:46:12 compute-1 nova_compute[183083]: 2026-01-26 08:46:12.682 183087 DEBUG nova.network.os_vif_util [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Converting VIF {"id": "dd68ce69-6136-4685-b76e-3d9ca72d5f1e", "address": "fa:16:3e:33:89:a6", "network": {"id": "025bb731-9b63-4a5d-b85e-b1832de66a48", "bridge": "br-int", "label": "tempest-OvnExtraDhcpOptionsTest-154566894-test_extra_dhcp_opts_disabled_enabled_dhcp4", "subnets": [{"cidr": "192.168.4.0/24", "dns": [], "gateway": {"address": "192.168.4.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.4.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "71cced1777f24868932d789154ff04a0", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd68ce69-61", "ovs_interfaceid": "dd68ce69-6136-4685-b76e-3d9ca72d5f1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:46:12 compute-1 nova_compute[183083]: 2026-01-26 08:46:12.683 183087 DEBUG nova.network.os_vif_util [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:33:89:a6,bridge_name='br-int',has_traffic_filtering=True,id=dd68ce69-6136-4685-b76e-3d9ca72d5f1e,network=Network(025bb731-9b63-4a5d-b85e-b1832de66a48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdd68ce69-61') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:46:12 compute-1 nova_compute[183083]: 2026-01-26 08:46:12.684 183087 DEBUG os_vif [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:89:a6,bridge_name='br-int',has_traffic_filtering=True,id=dd68ce69-6136-4685-b76e-3d9ca72d5f1e,network=Network(025bb731-9b63-4a5d-b85e-b1832de66a48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdd68ce69-61') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:46:12 compute-1 nova_compute[183083]: 2026-01-26 08:46:12.688 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:12 compute-1 nova_compute[183083]: 2026-01-26 08:46:12.689 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd68ce69-61, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:46:12 compute-1 nova_compute[183083]: 2026-01-26 08:46:12.689 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:46:12 compute-1 nova_compute[183083]: 2026-01-26 08:46:12.697 183087 INFO os_vif [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:89:a6,bridge_name='br-int',has_traffic_filtering=True,id=dd68ce69-6136-4685-b76e-3d9ca72d5f1e,network=Network(025bb731-9b63-4a5d-b85e-b1832de66a48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdd68ce69-61')
Jan 26 08:46:12 compute-1 nova_compute[183083]: 2026-01-26 08:46:12.698 183087 INFO nova.virt.libvirt.driver [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 9001392f-3418-47ad-916e-500f88865c32] Deleting instance files /var/lib/nova/instances/9001392f-3418-47ad-916e-500f88865c32_del
Jan 26 08:46:12 compute-1 nova_compute[183083]: 2026-01-26 08:46:12.699 183087 INFO nova.virt.libvirt.driver [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 9001392f-3418-47ad-916e-500f88865c32] Deletion of /var/lib/nova/instances/9001392f-3418-47ad-916e-500f88865c32_del complete
Jan 26 08:46:12 compute-1 nova_compute[183083]: 2026-01-26 08:46:12.740 183087 DEBUG nova.network.neutron [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] Successfully updated port: b537c620-03b7-460a-a88d-1ea88e276912 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:46:12 compute-1 nova_compute[183083]: 2026-01-26 08:46:12.761 183087 DEBUG oslo_concurrency.lockutils [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Acquiring lock "refresh_cache-1971a833-1317-47c9-8128-d849f8397edf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:46:12 compute-1 nova_compute[183083]: 2026-01-26 08:46:12.762 183087 DEBUG oslo_concurrency.lockutils [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Acquired lock "refresh_cache-1971a833-1317-47c9-8128-d849f8397edf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:46:12 compute-1 nova_compute[183083]: 2026-01-26 08:46:12.762 183087 DEBUG nova.network.neutron [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:46:12 compute-1 nova_compute[183083]: 2026-01-26 08:46:12.771 183087 INFO nova.compute.manager [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 9001392f-3418-47ad-916e-500f88865c32] Took 0.10 seconds to destroy the instance on the hypervisor.
Jan 26 08:46:12 compute-1 nova_compute[183083]: 2026-01-26 08:46:12.773 183087 DEBUG nova.compute.claims [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 9001392f-3418-47ad-916e-500f88865c32] Aborting claim: <nova.compute.claims.Claim object at 0x7f6cb87639a0> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85
Jan 26 08:46:12 compute-1 nova_compute[183083]: 2026-01-26 08:46:12.773 183087 DEBUG oslo_concurrency.lockutils [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:12 compute-1 nova_compute[183083]: 2026-01-26 08:46:12.774 183087 DEBUG oslo_concurrency.lockutils [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:12 compute-1 nova_compute[183083]: 2026-01-26 08:46:12.973 183087 DEBUG nova.compute.provider_tree [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:46:12 compute-1 nova_compute[183083]: 2026-01-26 08:46:12.994 183087 DEBUG nova.scheduler.client.report [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:46:13 compute-1 nova_compute[183083]: 2026-01-26 08:46:13.023 183087 DEBUG oslo_concurrency.lockutils [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.249s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:13 compute-1 nova_compute[183083]: 2026-01-26 08:46:13.023 183087 DEBUG nova.compute.utils [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 9001392f-3418-47ad-916e-500f88865c32] Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430
Jan 26 08:46:13 compute-1 nova_compute[183083]: 2026-01-26 08:46:13.024 183087 ERROR nova.compute.manager [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 9001392f-3418-47ad-916e-500f88865c32] Build of instance 9001392f-3418-47ad-916e-500f88865c32 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format: nova.exception.BuildAbortException: Build of instance 9001392f-3418-47ad-916e-500f88865c32 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:46:13 compute-1 nova_compute[183083]: 2026-01-26 08:46:13.024 183087 DEBUG nova.compute.manager [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 9001392f-3418-47ad-916e-500f88865c32] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976
Jan 26 08:46:13 compute-1 nova_compute[183083]: 2026-01-26 08:46:13.025 183087 DEBUG nova.virt.libvirt.vif [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-26T08:46:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-OvnExtraDhcpOptionsTest-154566894-test_extra_dhcp_opts_disabled_enabled_dhcp4',display_name='tempest-OvnExtraDhcpOptionsTest-154566894-test_extra_dhcp_opts_disabled_enabled_dhcp4',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host=None,hostname='tempest-ovnextradhcpoptionstest-154566894-test-extra-dhcp-opts',id=16,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPlGW+ytSlZhYgiyJ2VQXbYJFNaJudegvaO8arsRLbrIovURYus5bHVwpbhRN5uNXuMSQSnvVfbQff2hzPqkCc7D0963u0+MNy7sXLIRMgXOJArypDGQ7Qj3yNlotCHO0A==',key_name='tempest-OvnExtraDhcpOptionsTest-154566894',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node=None,numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71cced1777f24868932d789154ff04a0',ramdisk_id='',reservation_id='r-44ilk45b',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-OvnExtraDhcpOptionsTest-1195747792',owner_user_name='tempest-OvnExtraDhcpOptionsTest-1195747792-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:46:12Z,user_data=None,user_id='41a09f4d7f034b1c85f20c9512d33411',uuid=9001392f-3418-47ad-916e-500f88865c32,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dd68ce69-6136-4685-b76e-3d9ca72d5f1e", "address": "fa:16:3e:33:89:a6", "network": {"id": "025bb731-9b63-4a5d-b85e-b1832de66a48", "bridge": "br-int", "label": "tempest-OvnExtraDhcpOptionsTest-154566894-test_extra_dhcp_opts_disabled_enabled_dhcp4", "subnets": [{"cidr": "192.168.4.0/24", "dns": [], "gateway": {"address": "192.168.4.1", "type": 
"gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.4.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "71cced1777f24868932d789154ff04a0", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd68ce69-61", "ovs_interfaceid": "dd68ce69-6136-4685-b76e-3d9ca72d5f1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:46:13 compute-1 nova_compute[183083]: 2026-01-26 08:46:13.025 183087 DEBUG nova.network.os_vif_util [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Converting VIF {"id": "dd68ce69-6136-4685-b76e-3d9ca72d5f1e", "address": "fa:16:3e:33:89:a6", "network": {"id": "025bb731-9b63-4a5d-b85e-b1832de66a48", "bridge": "br-int", "label": "tempest-OvnExtraDhcpOptionsTest-154566894-test_extra_dhcp_opts_disabled_enabled_dhcp4", "subnets": [{"cidr": "192.168.4.0/24", "dns": [], "gateway": {"address": "192.168.4.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.4.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "71cced1777f24868932d789154ff04a0", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd68ce69-61", "ovs_interfaceid": "dd68ce69-6136-4685-b76e-3d9ca72d5f1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:46:13 compute-1 nova_compute[183083]: 2026-01-26 08:46:13.026 183087 DEBUG nova.network.os_vif_util [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:33:89:a6,bridge_name='br-int',has_traffic_filtering=True,id=dd68ce69-6136-4685-b76e-3d9ca72d5f1e,network=Network(025bb731-9b63-4a5d-b85e-b1832de66a48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdd68ce69-61') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:46:13 compute-1 nova_compute[183083]: 2026-01-26 08:46:13.026 183087 DEBUG os_vif [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:89:a6,bridge_name='br-int',has_traffic_filtering=True,id=dd68ce69-6136-4685-b76e-3d9ca72d5f1e,network=Network(025bb731-9b63-4a5d-b85e-b1832de66a48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdd68ce69-61') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:46:13 compute-1 nova_compute[183083]: 2026-01-26 08:46:13.027 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:13 compute-1 nova_compute[183083]: 2026-01-26 08:46:13.028 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd68ce69-61, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:46:13 compute-1 nova_compute[183083]: 2026-01-26 08:46:13.028 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:46:13 compute-1 nova_compute[183083]: 2026-01-26 08:46:13.030 183087 INFO os_vif [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:89:a6,bridge_name='br-int',has_traffic_filtering=True,id=dd68ce69-6136-4685-b76e-3d9ca72d5f1e,network=Network(025bb731-9b63-4a5d-b85e-b1832de66a48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdd68ce69-61')
Jan 26 08:46:13 compute-1 nova_compute[183083]: 2026-01-26 08:46:13.030 183087 DEBUG nova.compute.manager [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 9001392f-3418-47ad-916e-500f88865c32] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012
Jan 26 08:46:13 compute-1 nova_compute[183083]: 2026-01-26 08:46:13.030 183087 DEBUG nova.compute.manager [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 9001392f-3418-47ad-916e-500f88865c32] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:46:13 compute-1 nova_compute[183083]: 2026-01-26 08:46:13.031 183087 DEBUG nova.network.neutron [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 9001392f-3418-47ad-916e-500f88865c32] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:46:13 compute-1 nova_compute[183083]: 2026-01-26 08:46:13.485 183087 DEBUG nova.network.neutron [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:46:13 compute-1 nova_compute[183083]: 2026-01-26 08:46:13.604 183087 DEBUG nova.compute.manager [req-9c4cb2b0-26d8-4341-8015-b5cc43b25568 req-628e9ad5-999e-484e-a223-ebfe87c15593 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] Received event network-changed-b537c620-03b7-460a-a88d-1ea88e276912 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:46:13 compute-1 nova_compute[183083]: 2026-01-26 08:46:13.605 183087 DEBUG nova.compute.manager [req-9c4cb2b0-26d8-4341-8015-b5cc43b25568 req-628e9ad5-999e-484e-a223-ebfe87c15593 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] Refreshing instance network info cache due to event network-changed-b537c620-03b7-460a-a88d-1ea88e276912. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:46:13 compute-1 nova_compute[183083]: 2026-01-26 08:46:13.606 183087 DEBUG oslo_concurrency.lockutils [req-9c4cb2b0-26d8-4341-8015-b5cc43b25568 req-628e9ad5-999e-484e-a223-ebfe87c15593 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-1971a833-1317-47c9-8128-d849f8397edf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:46:13 compute-1 nova_compute[183083]: 2026-01-26 08:46:13.705 183087 DEBUG nova.network.neutron [req-b0d9df2a-9235-4525-86b9-029378f907ed req-8027f4f4-9f74-4715-9595-9e204394d830 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Updated VIF entry in instance network info cache for port 30b45c93-b3bb-44e6-8e4a-6903a631c773. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:46:13 compute-1 nova_compute[183083]: 2026-01-26 08:46:13.706 183087 DEBUG nova.network.neutron [req-b0d9df2a-9235-4525-86b9-029378f907ed req-8027f4f4-9f74-4715-9595-9e204394d830 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Updating instance_info_cache with network_info: [{"id": "30b45c93-b3bb-44e6-8e4a-6903a631c773", "address": "fa:16:3e:dd:ab:11", "network": {"id": "bad39ade-29c7-41d5-89dd-fc1845e5f3f2", "bridge": "br-int", "label": "tempest-test-network--1588377042", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.175", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30b45c93-b3", "ovs_interfaceid": "30b45c93-b3bb-44e6-8e4a-6903a631c773", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "71bea873-e0d4-4d4e-b05e-ff7415434c11", "address": "fa:16:3e:81:c4:17", "network": {"id": "410ad2c8-60c1-40d5-855c-7deeb749f0fe", "bridge": "br-int", "label": "tempest-test-network--1714243150", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "192.168.1.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71bea873-e0", "ovs_interfaceid": "71bea873-e0d4-4d4e-b05e-ff7415434c11", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:46:13 compute-1 nova_compute[183083]: 2026-01-26 08:46:13.747 183087 DEBUG oslo_concurrency.lockutils [req-b0d9df2a-9235-4525-86b9-029378f907ed req-8027f4f4-9f74-4715-9595-9e204394d830 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-7fb5f66c-db87-49bd-8c08-1c21b7ea58e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:46:14 compute-1 nova_compute[183083]: 2026-01-26 08:46:14.545 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:14 compute-1 nova_compute[183083]: 2026-01-26 08:46:14.994 183087 DEBUG nova.network.neutron [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] Updating instance_info_cache with network_info: [{"id": "b537c620-03b7-460a-a88d-1ea88e276912", "address": "fa:16:3e:b4:af:53", "network": {"id": "f2a994f0-d0da-401c-9c62-de655fe455dc", "bridge": "br-int", "label": "tempest-test-network--403502402", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.79", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b71ae2b9d2fd454b8b3b9aa1a0e5c7e4", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb537c620-03", "ovs_interfaceid": "b537c620-03b7-460a-a88d-1ea88e276912", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:46:14 compute-1 nova_compute[183083]: 2026-01-26 08:46:14.999 183087 DEBUG nova.network.neutron [req-d126b17f-fa00-4003-8104-ab7029379b63 req-7d351a5d-6745-4362-b1e8-bfd9e6234c8a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 9001392f-3418-47ad-916e-500f88865c32] Updated VIF entry in instance network info cache for port dd68ce69-6136-4685-b76e-3d9ca72d5f1e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.000 183087 DEBUG nova.network.neutron [req-d126b17f-fa00-4003-8104-ab7029379b63 req-7d351a5d-6745-4362-b1e8-bfd9e6234c8a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 9001392f-3418-47ad-916e-500f88865c32] Updating instance_info_cache with network_info: [{"id": "dd68ce69-6136-4685-b76e-3d9ca72d5f1e", "address": "fa:16:3e:33:89:a6", "network": {"id": "025bb731-9b63-4a5d-b85e-b1832de66a48", "bridge": "br-int", "label": "tempest-OvnExtraDhcpOptionsTest-154566894-test_extra_dhcp_opts_disabled_enabled_dhcp4", "subnets": [{"cidr": "192.168.4.0/24", "dns": [], "gateway": {"address": "192.168.4.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.4.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "71cced1777f24868932d789154ff04a0", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd68ce69-61", "ovs_interfaceid": "dd68ce69-6136-4685-b76e-3d9ca72d5f1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.021 183087 DEBUG oslo_concurrency.lockutils [req-d126b17f-fa00-4003-8104-ab7029379b63 req-7d351a5d-6745-4362-b1e8-bfd9e6234c8a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-9001392f-3418-47ad-916e-500f88865c32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.026 183087 DEBUG oslo_concurrency.lockutils [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Releasing lock "refresh_cache-1971a833-1317-47c9-8128-d849f8397edf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.026 183087 DEBUG nova.compute.manager [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] Instance network_info: |[{"id": "b537c620-03b7-460a-a88d-1ea88e276912", "address": "fa:16:3e:b4:af:53", "network": {"id": "f2a994f0-d0da-401c-9c62-de655fe455dc", "bridge": "br-int", "label": "tempest-test-network--403502402", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.79", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b71ae2b9d2fd454b8b3b9aa1a0e5c7e4", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb537c620-03", "ovs_interfaceid": "b537c620-03b7-460a-a88d-1ea88e276912", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.027 183087 DEBUG oslo_concurrency.lockutils [req-9c4cb2b0-26d8-4341-8015-b5cc43b25568 req-628e9ad5-999e-484e-a223-ebfe87c15593 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-1971a833-1317-47c9-8128-d849f8397edf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.028 183087 DEBUG nova.network.neutron [req-9c4cb2b0-26d8-4341-8015-b5cc43b25568 req-628e9ad5-999e-484e-a223-ebfe87c15593 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] Refreshing network info cache for port b537c620-03b7-460a-a88d-1ea88e276912 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.029 183087 INFO nova.compute.manager [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] Terminating instance
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.031 183087 DEBUG nova.compute.manager [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.037 183087 DEBUG nova.virt.libvirt.driver [-] [instance: 1971a833-1317-47c9-8128-d849f8397edf] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.037 183087 INFO nova.virt.libvirt.driver [-] [instance: 1971a833-1317-47c9-8128-d849f8397edf] Instance destroyed successfully.
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.040 183087 DEBUG nova.virt.libvirt.vif [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:46:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_bw_limit_tenant_network-608595861',display_name='tempest-test_bw_limit_tenant_network-608595861',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-bw-limit-tenant-network-608595861',id=17,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHVjp+2pOh+xYUkttf/EHrrYAH3LBOn+IKLzf3fiQpiaJslqkY+OmJn6bfd2cX/NEPdTL45qAcY0Zt6OwZRQXbHCoOcvnydr7uXjZCoGXOxoNL1bEhwXU4AaOmmyDzyYAA==',key_name='tempest-keypair-test-1026532318',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b71ae2b9d2fd454b8b3b9aa1a0e5c7e4',ramdisk_id='',reservation_id='r-lh2ys2g4',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestCommon-374727467',owner_user_name='tempest-QosTestCommon-374727467-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:46:09Z,user_data=None,user_id='a7abeebb4e4d469c91e6cee77f6be1c3',uuid=1971a833-1317-47c9-8128-d849f8397edf,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b537c620-03b7-460a-a88d-1ea88e276912", "address": "fa:16:3e:b4:af:53", "network": {"id": "f2a994f0-d0da-401c-9c62-de655fe455dc", "bridge": "br-int", "label": "tempest-test-network--403502402", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.79", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b71ae2b9d2fd454b8b3b9aa1a0e5c7e4", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb537c620-03", "ovs_interfaceid": "b537c620-03b7-460a-a88d-1ea88e276912", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.040 183087 DEBUG nova.network.os_vif_util [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Converting VIF {"id": "b537c620-03b7-460a-a88d-1ea88e276912", "address": "fa:16:3e:b4:af:53", "network": {"id": "f2a994f0-d0da-401c-9c62-de655fe455dc", "bridge": "br-int", "label": "tempest-test-network--403502402", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.79", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b71ae2b9d2fd454b8b3b9aa1a0e5c7e4", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb537c620-03", "ovs_interfaceid": "b537c620-03b7-460a-a88d-1ea88e276912", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.042 183087 DEBUG nova.network.os_vif_util [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b4:af:53,bridge_name='br-int',has_traffic_filtering=True,id=b537c620-03b7-460a-a88d-1ea88e276912,network=Network(f2a994f0-d0da-401c-9c62-de655fe455dc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb537c620-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.042 183087 DEBUG os_vif [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:af:53,bridge_name='br-int',has_traffic_filtering=True,id=b537c620-03b7-460a-a88d-1ea88e276912,network=Network(f2a994f0-d0da-401c-9c62-de655fe455dc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb537c620-03') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.045 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.046 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb537c620-03, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.047 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.054 183087 INFO os_vif [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:af:53,bridge_name='br-int',has_traffic_filtering=True,id=b537c620-03b7-460a-a88d-1ea88e276912,network=Network(f2a994f0-d0da-401c-9c62-de655fe455dc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb537c620-03')
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.055 183087 INFO nova.virt.libvirt.driver [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] Deleting instance files /var/lib/nova/instances/1971a833-1317-47c9-8128-d849f8397edf_del
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.056 183087 INFO nova.virt.libvirt.driver [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] Deletion of /var/lib/nova/instances/1971a833-1317-47c9-8128-d849f8397edf_del complete
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.113 183087 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769417160.1099753, 52d0b676-cf9c-4840-8b66-74ca8b13e2af => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.113 183087 INFO nova.compute.manager [-] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] VM Stopped (Lifecycle Event)
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.151 183087 DEBUG nova.compute.manager [None req-6c401724-69ce-4ed3-916a-4fd6f26f01de - - - - - -] [instance: 52d0b676-cf9c-4840-8b66-74ca8b13e2af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.193 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.195 183087 INFO nova.compute.manager [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] Took 0.16 seconds to destroy the instance on the hypervisor.
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.196 183087 DEBUG nova.compute.claims [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] Aborting claim: <nova.compute.claims.Claim object at 0x7f6c9836aaf0> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.197 183087 DEBUG oslo_concurrency.lockutils [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.197 183087 DEBUG oslo_concurrency.lockutils [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.426 183087 DEBUG nova.compute.provider_tree [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.445 183087 DEBUG nova.scheduler.client.report [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.474 183087 DEBUG oslo_concurrency.lockutils [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.277s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.475 183087 DEBUG nova.compute.utils [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.476 183087 ERROR nova.compute.manager [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] Build of instance 1971a833-1317-47c9-8128-d849f8397edf aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format: nova.exception.BuildAbortException: Build of instance 1971a833-1317-47c9-8128-d849f8397edf aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.477 183087 DEBUG nova.compute.manager [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.478 183087 DEBUG nova.virt.libvirt.vif [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-26T08:46:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_bw_limit_tenant_network-608595861',display_name='tempest-test_bw_limit_tenant_network-608595861',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host=None,hostname='tempest-test-bw-limit-tenant-network-608595861',id=17,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHVjp+2pOh+xYUkttf/EHrrYAH3LBOn+IKLzf3fiQpiaJslqkY+OmJn6bfd2cX/NEPdTL45qAcY0Zt6OwZRQXbHCoOcvnydr7uXjZCoGXOxoNL1bEhwXU4AaOmmyDzyYAA==',key_name='tempest-keypair-test-1026532318',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node=None,numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b71ae2b9d2fd454b8b3b9aa1a0e5c7e4',ramdisk_id='',reservation_id='r-lh2ys2g4',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestCommon-374727467',owner_user_name='tempest-QosTestCommon-374727467-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:46:15Z,user_data=None,user_id='a7abeebb4e4d469c91e6cee77f6be1c3',uuid=1971a833-1317-47c9-8128-d849f8397edf,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b537c620-03b7-460a-a88d-1ea88e276912", "address": "fa:16:3e:b4:af:53", "network": {"id": "f2a994f0-d0da-401c-9c62-de655fe455dc", "bridge": "br-int", "label": "tempest-test-network--403502402", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.79", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b71ae2b9d2fd454b8b3b9aa1a0e5c7e4", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb537c620-03", "ovs_interfaceid": "b537c620-03b7-460a-a88d-1ea88e276912", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.479 183087 DEBUG nova.network.os_vif_util [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Converting VIF {"id": "b537c620-03b7-460a-a88d-1ea88e276912", "address": "fa:16:3e:b4:af:53", "network": {"id": "f2a994f0-d0da-401c-9c62-de655fe455dc", "bridge": "br-int", "label": "tempest-test-network--403502402", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.79", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b71ae2b9d2fd454b8b3b9aa1a0e5c7e4", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb537c620-03", "ovs_interfaceid": "b537c620-03b7-460a-a88d-1ea88e276912", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.480 183087 DEBUG nova.network.os_vif_util [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b4:af:53,bridge_name='br-int',has_traffic_filtering=True,id=b537c620-03b7-460a-a88d-1ea88e276912,network=Network(f2a994f0-d0da-401c-9c62-de655fe455dc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb537c620-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.480 183087 DEBUG os_vif [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:af:53,bridge_name='br-int',has_traffic_filtering=True,id=b537c620-03b7-460a-a88d-1ea88e276912,network=Network(f2a994f0-d0da-401c-9c62-de655fe455dc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb537c620-03') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.484 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.485 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb537c620-03, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.485 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.489 183087 INFO os_vif [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:af:53,bridge_name='br-int',has_traffic_filtering=True,id=b537c620-03b7-460a-a88d-1ea88e276912,network=Network(f2a994f0-d0da-401c-9c62-de655fe455dc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb537c620-03')
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.489 183087 DEBUG nova.compute.manager [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.490 183087 DEBUG nova.compute.manager [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.490 183087 DEBUG nova.network.neutron [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.735 183087 DEBUG nova.network.neutron [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 9001392f-3418-47ad-916e-500f88865c32] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.755 183087 INFO nova.compute.manager [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 9001392f-3418-47ad-916e-500f88865c32] Took 2.72 seconds to deallocate network for instance.
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.984 183087 INFO nova.scheduler.client.report [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Deleted allocations for instance 9001392f-3418-47ad-916e-500f88865c32
Jan 26 08:46:15 compute-1 nova_compute[183083]: 2026-01-26 08:46:15.985 183087 DEBUG oslo_concurrency.lockutils [None req-cd2516e4-e31c-4997-b4a7-1a58b48e40fc 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Lock "9001392f-3418-47ad-916e-500f88865c32" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.397s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:16 compute-1 podman[213725]: 2026-01-26 08:46:16.834748706 +0000 UTC m=+0.087831867 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, managed_by=edpm_ansible, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vcs-type=git, vendor=Red Hat, Inc.)
Jan 26 08:46:16 compute-1 podman[213724]: 2026-01-26 08:46:16.845620184 +0000 UTC m=+0.106504207 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 26 08:46:16 compute-1 nova_compute[183083]: 2026-01-26 08:46:16.957 183087 DEBUG nova.network.neutron [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:46:16 compute-1 nova_compute[183083]: 2026-01-26 08:46:16.995 183087 INFO nova.compute.manager [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] Took 1.50 seconds to deallocate network for instance.
Jan 26 08:46:17 compute-1 nova_compute[183083]: 2026-01-26 08:46:17.211 183087 INFO nova.scheduler.client.report [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Deleted allocations for instance 1971a833-1317-47c9-8128-d849f8397edf
Jan 26 08:46:17 compute-1 nova_compute[183083]: 2026-01-26 08:46:17.212 183087 DEBUG oslo_concurrency.lockutils [None req-6ec25319-b7bb-4a89-9b73-3c6d0cbf856e a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "1971a833-1317-47c9-8128-d849f8397edf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.989s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:17 compute-1 nova_compute[183083]: 2026-01-26 08:46:17.281 183087 DEBUG nova.network.neutron [req-9c4cb2b0-26d8-4341-8015-b5cc43b25568 req-628e9ad5-999e-484e-a223-ebfe87c15593 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] Updated VIF entry in instance network info cache for port b537c620-03b7-460a-a88d-1ea88e276912. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:46:17 compute-1 nova_compute[183083]: 2026-01-26 08:46:17.282 183087 DEBUG nova.network.neutron [req-9c4cb2b0-26d8-4341-8015-b5cc43b25568 req-628e9ad5-999e-484e-a223-ebfe87c15593 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 1971a833-1317-47c9-8128-d849f8397edf] Updating instance_info_cache with network_info: [{"id": "b537c620-03b7-460a-a88d-1ea88e276912", "address": "fa:16:3e:b4:af:53", "network": {"id": "f2a994f0-d0da-401c-9c62-de655fe455dc", "bridge": "br-int", "label": "tempest-test-network--403502402", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.79", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b71ae2b9d2fd454b8b3b9aa1a0e5c7e4", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb537c620-03", "ovs_interfaceid": "b537c620-03b7-460a-a88d-1ea88e276912", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:46:17 compute-1 nova_compute[183083]: 2026-01-26 08:46:17.303 183087 DEBUG oslo_concurrency.lockutils [req-9c4cb2b0-26d8-4341-8015-b5cc43b25568 req-628e9ad5-999e-484e-a223-ebfe87c15593 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-1971a833-1317-47c9-8128-d849f8397edf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:46:17 compute-1 nova_compute[183083]: 2026-01-26 08:46:17.800 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:18 compute-1 nova_compute[183083]: 2026-01-26 08:46:18.977 183087 DEBUG oslo_concurrency.lockutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:18 compute-1 nova_compute[183083]: 2026-01-26 08:46:18.978 183087 DEBUG oslo_concurrency.lockutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:18 compute-1 nova_compute[183083]: 2026-01-26 08:46:18.999 183087 DEBUG nova.compute.manager [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.065 183087 DEBUG oslo_concurrency.lockutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.066 183087 DEBUG oslo_concurrency.lockutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:19 compute-1 ovn_controller[95352]: 2026-01-26T08:46:19Z|00090|binding|INFO|Releasing lport 809259ab-8ea4-4909-92b4-4ee536a51482 from this chassis (sb_readonly=0)
Jan 26 08:46:19 compute-1 ovn_controller[95352]: 2026-01-26T08:46:19Z|00091|binding|INFO|Releasing lport bfb9744a-58f0-4145-9e28-9c13225b3407 from this chassis (sb_readonly=0)
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.076 183087 DEBUG nova.virt.hardware [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.076 183087 INFO nova.compute.claims [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.236 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.296 183087 DEBUG nova.compute.provider_tree [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.307 183087 DEBUG nova.scheduler.client.report [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.324 183087 DEBUG oslo_concurrency.lockutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.259s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.325 183087 DEBUG nova.compute.manager [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.382 183087 DEBUG nova.compute.manager [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.383 183087 DEBUG nova.network.neutron [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.403 183087 INFO nova.virt.libvirt.driver [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.438 183087 DEBUG nova.compute.manager [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.547 183087 DEBUG nova.compute.manager [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.550 183087 DEBUG nova.virt.libvirt.driver [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.551 183087 INFO nova.virt.libvirt.driver [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Creating image(s)
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.552 183087 DEBUG oslo_concurrency.lockutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "/var/lib/nova/instances/f2de61ba-4b20-445c-bdbb-44bb79fb58c3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.553 183087 DEBUG oslo_concurrency.lockutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "/var/lib/nova/instances/f2de61ba-4b20-445c-bdbb-44bb79fb58c3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.554 183087 DEBUG oslo_concurrency.lockutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "/var/lib/nova/instances/f2de61ba-4b20-445c-bdbb-44bb79fb58c3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.577 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.582 183087 DEBUG oslo_concurrency.processutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.658 183087 DEBUG oslo_concurrency.processutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.660 183087 DEBUG oslo_concurrency.lockutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.661 183087 DEBUG oslo_concurrency.lockutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.684 183087 DEBUG oslo_concurrency.processutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.760 183087 DEBUG oslo_concurrency.processutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.762 183087 DEBUG oslo_concurrency.processutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/f2de61ba-4b20-445c-bdbb-44bb79fb58c3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.821 183087 DEBUG oslo_concurrency.processutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/f2de61ba-4b20-445c-bdbb-44bb79fb58c3/disk 1073741824" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.823 183087 DEBUG oslo_concurrency.lockutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.825 183087 DEBUG oslo_concurrency.processutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.894 183087 DEBUG oslo_concurrency.processutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.895 183087 DEBUG nova.virt.disk.api [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Checking if we can resize image /var/lib/nova/instances/f2de61ba-4b20-445c-bdbb-44bb79fb58c3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.896 183087 DEBUG oslo_concurrency.processutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2de61ba-4b20-445c-bdbb-44bb79fb58c3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.992 183087 DEBUG oslo_concurrency.processutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2de61ba-4b20-445c-bdbb-44bb79fb58c3/disk --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.994 183087 DEBUG nova.virt.disk.api [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Cannot resize image /var/lib/nova/instances/f2de61ba-4b20-445c-bdbb-44bb79fb58c3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 26 08:46:19 compute-1 nova_compute[183083]: 2026-01-26 08:46:19.995 183087 DEBUG nova.objects.instance [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lazy-loading 'migration_context' on Instance uuid f2de61ba-4b20-445c-bdbb-44bb79fb58c3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:46:20 compute-1 nova_compute[183083]: 2026-01-26 08:46:20.012 183087 DEBUG nova.virt.libvirt.driver [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 26 08:46:20 compute-1 nova_compute[183083]: 2026-01-26 08:46:20.013 183087 DEBUG nova.virt.libvirt.driver [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Ensure instance console log exists: /var/lib/nova/instances/f2de61ba-4b20-445c-bdbb-44bb79fb58c3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 26 08:46:20 compute-1 nova_compute[183083]: 2026-01-26 08:46:20.014 183087 DEBUG oslo_concurrency.lockutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:20 compute-1 nova_compute[183083]: 2026-01-26 08:46:20.015 183087 DEBUG oslo_concurrency.lockutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:20 compute-1 nova_compute[183083]: 2026-01-26 08:46:20.015 183087 DEBUG oslo_concurrency.lockutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:20 compute-1 nova_compute[183083]: 2026-01-26 08:46:20.196 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:20 compute-1 nova_compute[183083]: 2026-01-26 08:46:20.591 183087 DEBUG nova.policy [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:46:22 compute-1 nova_compute[183083]: 2026-01-26 08:46:22.213 183087 DEBUG oslo_concurrency.lockutils [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Acquiring lock "d6141ab1-a933-4709-8f5b-f95abb9902ff" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:22 compute-1 nova_compute[183083]: 2026-01-26 08:46:22.214 183087 DEBUG oslo_concurrency.lockutils [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Lock "d6141ab1-a933-4709-8f5b-f95abb9902ff" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:22 compute-1 nova_compute[183083]: 2026-01-26 08:46:22.233 183087 DEBUG nova.compute.manager [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:46:22 compute-1 nova_compute[183083]: 2026-01-26 08:46:22.336 183087 DEBUG oslo_concurrency.lockutils [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:22 compute-1 nova_compute[183083]: 2026-01-26 08:46:22.337 183087 DEBUG oslo_concurrency.lockutils [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:22 compute-1 nova_compute[183083]: 2026-01-26 08:46:22.347 183087 DEBUG nova.virt.hardware [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:46:22 compute-1 nova_compute[183083]: 2026-01-26 08:46:22.348 183087 INFO nova.compute.claims [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:46:22 compute-1 nova_compute[183083]: 2026-01-26 08:46:22.529 183087 DEBUG nova.compute.provider_tree [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:46:22 compute-1 nova_compute[183083]: 2026-01-26 08:46:22.564 183087 DEBUG nova.scheduler.client.report [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:46:22 compute-1 nova_compute[183083]: 2026-01-26 08:46:22.588 183087 DEBUG oslo_concurrency.lockutils [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.251s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:22 compute-1 nova_compute[183083]: 2026-01-26 08:46:22.589 183087 DEBUG nova.compute.manager [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:46:22 compute-1 nova_compute[183083]: 2026-01-26 08:46:22.632 183087 DEBUG nova.compute.manager [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:46:22 compute-1 nova_compute[183083]: 2026-01-26 08:46:22.633 183087 DEBUG nova.network.neutron [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:46:22 compute-1 nova_compute[183083]: 2026-01-26 08:46:22.657 183087 INFO nova.virt.libvirt.driver [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:46:22 compute-1 nova_compute[183083]: 2026-01-26 08:46:22.689 183087 DEBUG nova.compute.manager [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:46:22 compute-1 nova_compute[183083]: 2026-01-26 08:46:22.798 183087 DEBUG nova.compute.manager [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:46:22 compute-1 nova_compute[183083]: 2026-01-26 08:46:22.800 183087 DEBUG nova.virt.libvirt.driver [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:46:22 compute-1 nova_compute[183083]: 2026-01-26 08:46:22.800 183087 INFO nova.virt.libvirt.driver [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Creating image(s)
Jan 26 08:46:22 compute-1 nova_compute[183083]: 2026-01-26 08:46:22.801 183087 DEBUG oslo_concurrency.lockutils [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Acquiring lock "/var/lib/nova/instances/d6141ab1-a933-4709-8f5b-f95abb9902ff/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:22 compute-1 nova_compute[183083]: 2026-01-26 08:46:22.802 183087 DEBUG oslo_concurrency.lockutils [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Lock "/var/lib/nova/instances/d6141ab1-a933-4709-8f5b-f95abb9902ff/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:22 compute-1 nova_compute[183083]: 2026-01-26 08:46:22.803 183087 DEBUG oslo_concurrency.lockutils [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Lock "/var/lib/nova/instances/d6141ab1-a933-4709-8f5b-f95abb9902ff/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:22 compute-1 nova_compute[183083]: 2026-01-26 08:46:22.804 183087 DEBUG oslo_concurrency.lockutils [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:22 compute-1 nova_compute[183083]: 2026-01-26 08:46:22.804 183087 DEBUG oslo_concurrency.lockutils [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:22 compute-1 nova_compute[183083]: 2026-01-26 08:46:22.902 183087 DEBUG nova.policy [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'add713470fcc438f95ec0ff89dbb2adc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a3694415e0ac483fa070e7316b146fc1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.199 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.826 183087 DEBUG nova.network.neutron [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Successfully updated port: dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:46:23 compute-1 podman[213779]: 2026-01-26 08:46:23.827487261 +0000 UTC m=+0.085144360 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 08:46:23 compute-1 podman[213778]: 2026-01-26 08:46:23.866823999 +0000 UTC m=+0.128227104 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 DEBUG oslo_concurrency.lockutils [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Instance failed to spawn: nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Traceback (most recent call last):
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 170, in do_image_deep_inspection
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff]     raise exception.ImageUnacceptable(
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image content does not match disk_format
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] 
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] During handling of the above exception, another exception occurred:
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] 
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Traceback (most recent call last):
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff]     yield resources
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff]     self.driver.spawn(context, instance, image_meta,
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4384, in spawn
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff]     created_instance_dir, created_disks = self._create_image(
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4786, in _create_image
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff]     created_disks = self._create_and_inject_local_root(
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4917, in _create_and_inject_local_root
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff]     self._try_fetch_image_cache(backend, fetch_func, context,
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 10963, in _try_fetch_image_cache
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff]     image.cache(fetch_func=fetch_func,
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 288, in cache
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff]     self.create_image(fetch_func_sync, base, size,
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 663, in create_image
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff]     prepare_template(target=base, *args, **kwargs)
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff]   File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff]     return f(*args, **kwargs)
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 285, in fetch_func_sync
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff]     fetch_func(target=target, *args, **kwargs)
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/utils.py", line 482, in fetch_image
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff]     images.fetch_to_raw(context, image_id, target, trusted_certs)
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 199, in fetch_to_raw
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff]     force_format = do_image_deep_inspection(img, image_href, path_tmp)
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 180, in do_image_deep_inspection
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff]     raise exception.ImageUnacceptable(
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:46:23 compute-1 nova_compute[183083]: 2026-01-26 08:46:23.978 183087 ERROR nova.compute.manager [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] 
Jan 26 08:46:24 compute-1 nova_compute[183083]: 2026-01-26 08:46:24.045 183087 DEBUG nova.compute.manager [req-c2562c95-3368-46ef-ba19-90fd75210b04 req-521e1680-52ea-4b2f-8cdb-a7bc79183e9e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Received event network-changed-dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:46:24 compute-1 nova_compute[183083]: 2026-01-26 08:46:24.045 183087 DEBUG nova.compute.manager [req-c2562c95-3368-46ef-ba19-90fd75210b04 req-521e1680-52ea-4b2f-8cdb-a7bc79183e9e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Refreshing instance network info cache due to event network-changed-dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:46:24 compute-1 nova_compute[183083]: 2026-01-26 08:46:24.046 183087 DEBUG oslo_concurrency.lockutils [req-c2562c95-3368-46ef-ba19-90fd75210b04 req-521e1680-52ea-4b2f-8cdb-a7bc79183e9e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-f2de61ba-4b20-445c-bdbb-44bb79fb58c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:46:24 compute-1 nova_compute[183083]: 2026-01-26 08:46:24.046 183087 DEBUG oslo_concurrency.lockutils [req-c2562c95-3368-46ef-ba19-90fd75210b04 req-521e1680-52ea-4b2f-8cdb-a7bc79183e9e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-f2de61ba-4b20-445c-bdbb-44bb79fb58c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:46:24 compute-1 nova_compute[183083]: 2026-01-26 08:46:24.047 183087 DEBUG nova.network.neutron [req-c2562c95-3368-46ef-ba19-90fd75210b04 req-521e1680-52ea-4b2f-8cdb-a7bc79183e9e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Refreshing network info cache for port dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:46:24 compute-1 nova_compute[183083]: 2026-01-26 08:46:24.512 183087 DEBUG nova.network.neutron [req-c2562c95-3368-46ef-ba19-90fd75210b04 req-521e1680-52ea-4b2f-8cdb-a7bc79183e9e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:46:24 compute-1 nova_compute[183083]: 2026-01-26 08:46:24.552 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:25 compute-1 nova_compute[183083]: 2026-01-26 08:46:25.200 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:25 compute-1 nova_compute[183083]: 2026-01-26 08:46:25.744 183087 DEBUG nova.network.neutron [req-c2562c95-3368-46ef-ba19-90fd75210b04 req-521e1680-52ea-4b2f-8cdb-a7bc79183e9e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:46:25 compute-1 nova_compute[183083]: 2026-01-26 08:46:25.759 183087 DEBUG oslo_concurrency.lockutils [req-c2562c95-3368-46ef-ba19-90fd75210b04 req-521e1680-52ea-4b2f-8cdb-a7bc79183e9e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-f2de61ba-4b20-445c-bdbb-44bb79fb58c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:46:25 compute-1 podman[213825]: 2026-01-26 08:46:25.834064744 +0000 UTC m=+0.079534421 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Jan 26 08:46:26 compute-1 nova_compute[183083]: 2026-01-26 08:46:26.865 183087 DEBUG nova.network.neutron [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Successfully created port: 9a499fec-d7c8-40bf-a913-c25d3ee8a2fc _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 08:46:26 compute-1 ovn_controller[95352]: 2026-01-26T08:46:26Z|00092|binding|INFO|Releasing lport 809259ab-8ea4-4909-92b4-4ee536a51482 from this chassis (sb_readonly=0)
Jan 26 08:46:26 compute-1 ovn_controller[95352]: 2026-01-26T08:46:26Z|00093|binding|INFO|Releasing lport bfb9744a-58f0-4145-9e28-9c13225b3407 from this chassis (sb_readonly=0)
Jan 26 08:46:27 compute-1 nova_compute[183083]: 2026-01-26 08:46:27.023 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:29 compute-1 nova_compute[183083]: 2026-01-26 08:46:29.129 183087 DEBUG nova.network.neutron [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Successfully updated port: a08840be-e8e9-48ba-9780-050a896d2732 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:46:29 compute-1 nova_compute[183083]: 2026-01-26 08:46:29.160 183087 DEBUG oslo_concurrency.lockutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "refresh_cache-f2de61ba-4b20-445c-bdbb-44bb79fb58c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:46:29 compute-1 nova_compute[183083]: 2026-01-26 08:46:29.161 183087 DEBUG oslo_concurrency.lockutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquired lock "refresh_cache-f2de61ba-4b20-445c-bdbb-44bb79fb58c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:46:29 compute-1 nova_compute[183083]: 2026-01-26 08:46:29.162 183087 DEBUG nova.network.neutron [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:46:29 compute-1 nova_compute[183083]: 2026-01-26 08:46:29.267 183087 DEBUG nova.compute.manager [req-68778ab1-2d36-4512-812c-a62fbf62c8a2 req-111b8619-5742-4f13-a54e-9a99f44759b9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Received event network-changed-a08840be-e8e9-48ba-9780-050a896d2732 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:46:29 compute-1 nova_compute[183083]: 2026-01-26 08:46:29.267 183087 DEBUG nova.compute.manager [req-68778ab1-2d36-4512-812c-a62fbf62c8a2 req-111b8619-5742-4f13-a54e-9a99f44759b9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Refreshing instance network info cache due to event network-changed-a08840be-e8e9-48ba-9780-050a896d2732. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:46:29 compute-1 nova_compute[183083]: 2026-01-26 08:46:29.268 183087 DEBUG oslo_concurrency.lockutils [req-68778ab1-2d36-4512-812c-a62fbf62c8a2 req-111b8619-5742-4f13-a54e-9a99f44759b9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-f2de61ba-4b20-445c-bdbb-44bb79fb58c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:46:29 compute-1 ovn_controller[95352]: 2026-01-26T08:46:29Z|00094|binding|INFO|Releasing lport 809259ab-8ea4-4909-92b4-4ee536a51482 from this chassis (sb_readonly=0)
Jan 26 08:46:29 compute-1 ovn_controller[95352]: 2026-01-26T08:46:29Z|00095|binding|INFO|Releasing lport bfb9744a-58f0-4145-9e28-9c13225b3407 from this chassis (sb_readonly=0)
Jan 26 08:46:29 compute-1 nova_compute[183083]: 2026-01-26 08:46:29.428 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:29 compute-1 nova_compute[183083]: 2026-01-26 08:46:29.554 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:29 compute-1 nova_compute[183083]: 2026-01-26 08:46:29.806 183087 DEBUG nova.network.neutron [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:46:30 compute-1 nova_compute[183083]: 2026-01-26 08:46:30.203 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:32 compute-1 nova_compute[183083]: 2026-01-26 08:46:32.103 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:32 compute-1 nova_compute[183083]: 2026-01-26 08:46:32.708 183087 DEBUG nova.network.neutron [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Successfully updated port: 9a499fec-d7c8-40bf-a913-c25d3ee8a2fc _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:46:32 compute-1 nova_compute[183083]: 2026-01-26 08:46:32.739 183087 DEBUG oslo_concurrency.lockutils [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Acquiring lock "refresh_cache-d6141ab1-a933-4709-8f5b-f95abb9902ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:46:32 compute-1 nova_compute[183083]: 2026-01-26 08:46:32.739 183087 DEBUG oslo_concurrency.lockutils [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Acquired lock "refresh_cache-d6141ab1-a933-4709-8f5b-f95abb9902ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:46:32 compute-1 nova_compute[183083]: 2026-01-26 08:46:32.739 183087 DEBUG nova.network.neutron [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:46:32 compute-1 nova_compute[183083]: 2026-01-26 08:46:32.834 183087 DEBUG nova.compute.manager [req-546b703d-4f48-46be-8aad-708bc069c82f req-d16a9b23-63c5-444e-a177-46b8094c531e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Received event network-changed-9a499fec-d7c8-40bf-a913-c25d3ee8a2fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:46:32 compute-1 nova_compute[183083]: 2026-01-26 08:46:32.834 183087 DEBUG nova.compute.manager [req-546b703d-4f48-46be-8aad-708bc069c82f req-d16a9b23-63c5-444e-a177-46b8094c531e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Refreshing instance network info cache due to event network-changed-9a499fec-d7c8-40bf-a913-c25d3ee8a2fc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:46:32 compute-1 nova_compute[183083]: 2026-01-26 08:46:32.835 183087 DEBUG oslo_concurrency.lockutils [req-546b703d-4f48-46be-8aad-708bc069c82f req-d16a9b23-63c5-444e-a177-46b8094c531e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-d6141ab1-a933-4709-8f5b-f95abb9902ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.427 183087 DEBUG nova.network.neutron [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.734 183087 DEBUG nova.network.neutron [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Updating instance_info_cache with network_info: [{"id": "dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd", "address": "fa:16:3e:09:0a:fa", "network": {"id": "bad39ade-29c7-41d5-89dd-fc1845e5f3f2", "bridge": "br-int", "label": "tempest-test-network--1588377042", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.164", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd638f7c-8f", "ovs_interfaceid": "dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "a08840be-e8e9-48ba-9780-050a896d2732", "address": "fa:16:3e:a2:ff:78", "network": {"id": "410ad2c8-60c1-40d5-855c-7deeb749f0fe", "bridge": "br-int", "label": "tempest-test-network--1714243150", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "192.168.1.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": 
true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa08840be-e8", "ovs_interfaceid": "a08840be-e8e9-48ba-9780-050a896d2732", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.753 183087 DEBUG oslo_concurrency.lockutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Releasing lock "refresh_cache-f2de61ba-4b20-445c-bdbb-44bb79fb58c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.753 183087 DEBUG nova.compute.manager [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Instance network_info: |[{"id": "dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd", "address": "fa:16:3e:09:0a:fa", "network": {"id": "bad39ade-29c7-41d5-89dd-fc1845e5f3f2", "bridge": "br-int", "label": "tempest-test-network--1588377042", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.164", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd638f7c-8f", "ovs_interfaceid": "dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "a08840be-e8e9-48ba-9780-050a896d2732", "address": "fa:16:3e:a2:ff:78", "network": {"id": "410ad2c8-60c1-40d5-855c-7deeb749f0fe", "bridge": "br-int", "label": "tempest-test-network--1714243150", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "192.168.1.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", 
"details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa08840be-e8", "ovs_interfaceid": "a08840be-e8e9-48ba-9780-050a896d2732", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.754 183087 DEBUG oslo_concurrency.lockutils [req-68778ab1-2d36-4512-812c-a62fbf62c8a2 req-111b8619-5742-4f13-a54e-9a99f44759b9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-f2de61ba-4b20-445c-bdbb-44bb79fb58c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.754 183087 DEBUG nova.network.neutron [req-68778ab1-2d36-4512-812c-a62fbf62c8a2 req-111b8619-5742-4f13-a54e-9a99f44759b9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Refreshing network info cache for port a08840be-e8e9-48ba-9780-050a896d2732 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.758 183087 DEBUG nova.virt.libvirt.driver [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Start _get_guest_xml network_info=[{"id": "dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd", "address": "fa:16:3e:09:0a:fa", "network": {"id": "bad39ade-29c7-41d5-89dd-fc1845e5f3f2", "bridge": "br-int", "label": "tempest-test-network--1588377042", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.164", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd638f7c-8f", "ovs_interfaceid": "dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "a08840be-e8e9-48ba-9780-050a896d2732", "address": "fa:16:3e:a2:ff:78", "network": {"id": "410ad2c8-60c1-40d5-855c-7deeb749f0fe", "bridge": "br-int", "label": "tempest-test-network--1714243150", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "192.168.1.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, 
"type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa08840be-e8", "ovs_interfaceid": "a08840be-e8e9-48ba-9780-050a896d2732", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T08:43:05Z,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aa88264c487a43b2855a58eb0dd042c9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T08:43:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encryption_options': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.762 183087 WARNING nova.virt.libvirt.driver [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.768 183087 DEBUG nova.virt.libvirt.host [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.769 183087 DEBUG nova.virt.libvirt.host [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.772 183087 DEBUG nova.virt.libvirt.host [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.772 183087 DEBUG nova.virt.libvirt.host [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.773 183087 DEBUG nova.virt.libvirt.driver [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.773 183087 DEBUG nova.virt.hardware [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T08:43:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a7017323-cd35-470d-a718-faa6c6e97277',id=2,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T08:43:05Z,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aa88264c487a43b2855a58eb0dd042c9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T08:43:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.774 183087 DEBUG nova.virt.hardware [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.774 183087 DEBUG nova.virt.hardware [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.774 183087 DEBUG nova.virt.hardware [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.774 183087 DEBUG nova.virt.hardware [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.775 183087 DEBUG nova.virt.hardware [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.775 183087 DEBUG nova.virt.hardware [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.775 183087 DEBUG nova.virt.hardware [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.776 183087 DEBUG nova.virt.hardware [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.776 183087 DEBUG nova.virt.hardware [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.776 183087 DEBUG nova.virt.hardware [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.780 183087 DEBUG nova.virt.libvirt.vif [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:46:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1494741620',display_name='tempest-server-test-1494741620',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1494741620',id=18,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBONV6jGPWIKrQk26JKZ8H9h2iNqZUCOpmao7Jq+9fEiq3iYmdJdZreCUr9V3PbbF1TPAOou07OOnfHkvbrEzcfNM5ieiMGZPqHbawuPIe3wilad9S814UZ1oxvh/DW+nZg==',key_name='tempest-keypair-test-867467517',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4a559c36b13649d98b2995c099340eb9',ramdisk_id='',reservation_id='r-bo90ukz1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='
virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortSecurityTest-508365101',owner_user_name='tempest-PortSecurityTest-508365101-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:46:19Z,user_data=None,user_id='52d582094c584036ba3e04c9da69ee02',uuid=f2de61ba-4b20-445c-bdbb-44bb79fb58c3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd", "address": "fa:16:3e:09:0a:fa", "network": {"id": "bad39ade-29c7-41d5-89dd-fc1845e5f3f2", "bridge": "br-int", "label": "tempest-test-network--1588377042", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.164", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd638f7c-8f", "ovs_interfaceid": "dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.780 183087 DEBUG nova.network.os_vif_util [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converting VIF {"id": "dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd", "address": "fa:16:3e:09:0a:fa", "network": {"id": "bad39ade-29c7-41d5-89dd-fc1845e5f3f2", "bridge": "br-int", "label": "tempest-test-network--1588377042", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.164", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd638f7c-8f", "ovs_interfaceid": "dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.781 183087 DEBUG nova.network.os_vif_util [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:09:0a:fa,bridge_name='br-int',has_traffic_filtering=True,id=dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd,network=Network(bad39ade-29c7-41d5-89dd-fc1845e5f3f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdd638f7c-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.782 183087 DEBUG nova.virt.libvirt.vif [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:46:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1494741620',display_name='tempest-server-test-1494741620',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1494741620',id=18,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBONV6jGPWIKrQk26JKZ8H9h2iNqZUCOpmao7Jq+9fEiq3iYmdJdZreCUr9V3PbbF1TPAOou07OOnfHkvbrEzcfNM5ieiMGZPqHbawuPIe3wilad9S814UZ1oxvh/DW+nZg==',key_name='tempest-keypair-test-867467517',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4a559c36b13649d98b2995c099340eb9',ramdisk_id='',reservation_id='r-bo90ukz1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='
virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortSecurityTest-508365101',owner_user_name='tempest-PortSecurityTest-508365101-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:46:19Z,user_data=None,user_id='52d582094c584036ba3e04c9da69ee02',uuid=f2de61ba-4b20-445c-bdbb-44bb79fb58c3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a08840be-e8e9-48ba-9780-050a896d2732", "address": "fa:16:3e:a2:ff:78", "network": {"id": "410ad2c8-60c1-40d5-855c-7deeb749f0fe", "bridge": "br-int", "label": "tempest-test-network--1714243150", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "192.168.1.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa08840be-e8", "ovs_interfaceid": "a08840be-e8e9-48ba-9780-050a896d2732", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.782 183087 DEBUG nova.network.os_vif_util [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converting VIF {"id": "a08840be-e8e9-48ba-9780-050a896d2732", "address": "fa:16:3e:a2:ff:78", "network": {"id": "410ad2c8-60c1-40d5-855c-7deeb749f0fe", "bridge": "br-int", "label": "tempest-test-network--1714243150", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "192.168.1.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa08840be-e8", "ovs_interfaceid": "a08840be-e8e9-48ba-9780-050a896d2732", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.783 183087 DEBUG nova.network.os_vif_util [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a2:ff:78,bridge_name='br-int',has_traffic_filtering=True,id=a08840be-e8e9-48ba-9780-050a896d2732,network=Network(410ad2c8-60c1-40d5-855c-7deeb749f0fe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa08840be-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.784 183087 DEBUG nova.objects.instance [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lazy-loading 'pci_devices' on Instance uuid f2de61ba-4b20-445c-bdbb-44bb79fb58c3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.796 183087 DEBUG nova.virt.libvirt.driver [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] End _get_guest_xml xml=<domain type="kvm">
Jan 26 08:46:33 compute-1 nova_compute[183083]:   <uuid>f2de61ba-4b20-445c-bdbb-44bb79fb58c3</uuid>
Jan 26 08:46:33 compute-1 nova_compute[183083]:   <name>instance-00000012</name>
Jan 26 08:46:33 compute-1 nova_compute[183083]:   <memory>131072</memory>
Jan 26 08:46:33 compute-1 nova_compute[183083]:   <vcpu>1</vcpu>
Jan 26 08:46:33 compute-1 nova_compute[183083]:   <metadata>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 08:46:33 compute-1 nova_compute[183083]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:       <nova:name>tempest-server-test-1494741620</nova:name>
Jan 26 08:46:33 compute-1 nova_compute[183083]:       <nova:creationTime>2026-01-26 08:46:33</nova:creationTime>
Jan 26 08:46:33 compute-1 nova_compute[183083]:       <nova:flavor name="m1.nano">
Jan 26 08:46:33 compute-1 nova_compute[183083]:         <nova:memory>128</nova:memory>
Jan 26 08:46:33 compute-1 nova_compute[183083]:         <nova:disk>1</nova:disk>
Jan 26 08:46:33 compute-1 nova_compute[183083]:         <nova:swap>0</nova:swap>
Jan 26 08:46:33 compute-1 nova_compute[183083]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 08:46:33 compute-1 nova_compute[183083]:         <nova:vcpus>1</nova:vcpus>
Jan 26 08:46:33 compute-1 nova_compute[183083]:       </nova:flavor>
Jan 26 08:46:33 compute-1 nova_compute[183083]:       <nova:owner>
Jan 26 08:46:33 compute-1 nova_compute[183083]:         <nova:user uuid="52d582094c584036ba3e04c9da69ee02">tempest-PortSecurityTest-508365101-project-member</nova:user>
Jan 26 08:46:33 compute-1 nova_compute[183083]:         <nova:project uuid="4a559c36b13649d98b2995c099340eb9">tempest-PortSecurityTest-508365101</nova:project>
Jan 26 08:46:33 compute-1 nova_compute[183083]:       </nova:owner>
Jan 26 08:46:33 compute-1 nova_compute[183083]:       <nova:root type="image" uuid="13d1a20a-8003-4f19-aba7-ccbd9eff9b82"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:       <nova:ports>
Jan 26 08:46:33 compute-1 nova_compute[183083]:         <nova:port uuid="dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd">
Jan 26 08:46:33 compute-1 nova_compute[183083]:           <nova:ip type="fixed" address="192.168.0.164" ipVersion="4"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:         </nova:port>
Jan 26 08:46:33 compute-1 nova_compute[183083]:         <nova:port uuid="a08840be-e8e9-48ba-9780-050a896d2732">
Jan 26 08:46:33 compute-1 nova_compute[183083]:           <nova:ip type="fixed" address="192.168.1.12" ipVersion="4"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:         </nova:port>
Jan 26 08:46:33 compute-1 nova_compute[183083]:       </nova:ports>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     </nova:instance>
Jan 26 08:46:33 compute-1 nova_compute[183083]:   </metadata>
Jan 26 08:46:33 compute-1 nova_compute[183083]:   <sysinfo type="smbios">
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <system>
Jan 26 08:46:33 compute-1 nova_compute[183083]:       <entry name="manufacturer">RDO</entry>
Jan 26 08:46:33 compute-1 nova_compute[183083]:       <entry name="product">OpenStack Compute</entry>
Jan 26 08:46:33 compute-1 nova_compute[183083]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 08:46:33 compute-1 nova_compute[183083]:       <entry name="serial">f2de61ba-4b20-445c-bdbb-44bb79fb58c3</entry>
Jan 26 08:46:33 compute-1 nova_compute[183083]:       <entry name="uuid">f2de61ba-4b20-445c-bdbb-44bb79fb58c3</entry>
Jan 26 08:46:33 compute-1 nova_compute[183083]:       <entry name="family">Virtual Machine</entry>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     </system>
Jan 26 08:46:33 compute-1 nova_compute[183083]:   </sysinfo>
Jan 26 08:46:33 compute-1 nova_compute[183083]:   <os>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <boot dev="hd"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <smbios mode="sysinfo"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:   </os>
Jan 26 08:46:33 compute-1 nova_compute[183083]:   <features>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <acpi/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <apic/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <vmcoreinfo/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:   </features>
Jan 26 08:46:33 compute-1 nova_compute[183083]:   <clock offset="utc">
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <timer name="hpet" present="no"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:   </clock>
Jan 26 08:46:33 compute-1 nova_compute[183083]:   <cpu mode="host-model" match="exact">
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:   </cpu>
Jan 26 08:46:33 compute-1 nova_compute[183083]:   <devices>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <disk type="file" device="disk">
Jan 26 08:46:33 compute-1 nova_compute[183083]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/f2de61ba-4b20-445c-bdbb-44bb79fb58c3/disk"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:       <target dev="vda" bus="virtio"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     </disk>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <disk type="file" device="cdrom">
Jan 26 08:46:33 compute-1 nova_compute[183083]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/f2de61ba-4b20-445c-bdbb-44bb79fb58c3/disk.config"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:       <target dev="sda" bus="sata"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     </disk>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <interface type="ethernet">
Jan 26 08:46:33 compute-1 nova_compute[183083]:       <mac address="fa:16:3e:09:0a:fa"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:       <mtu size="1342"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:       <target dev="tapdd638f7c-8f"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     </interface>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <interface type="ethernet">
Jan 26 08:46:33 compute-1 nova_compute[183083]:       <mac address="fa:16:3e:a2:ff:78"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:       <mtu size="1342"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:       <target dev="tapa08840be-e8"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     </interface>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <serial type="pty">
Jan 26 08:46:33 compute-1 nova_compute[183083]:       <log file="/var/lib/nova/instances/f2de61ba-4b20-445c-bdbb-44bb79fb58c3/console.log" append="off"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     </serial>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <video>
Jan 26 08:46:33 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     </video>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <input type="tablet" bus="usb"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <rng model="virtio">
Jan 26 08:46:33 compute-1 nova_compute[183083]:       <backend model="random">/dev/urandom</backend>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     </rng>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <controller type="usb" index="0"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     <memballoon model="virtio">
Jan 26 08:46:33 compute-1 nova_compute[183083]:       <stats period="10"/>
Jan 26 08:46:33 compute-1 nova_compute[183083]:     </memballoon>
Jan 26 08:46:33 compute-1 nova_compute[183083]:   </devices>
Jan 26 08:46:33 compute-1 nova_compute[183083]: </domain>
Jan 26 08:46:33 compute-1 nova_compute[183083]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.798 183087 DEBUG nova.compute.manager [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Preparing to wait for external event network-vif-plugged-dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.798 183087 DEBUG oslo_concurrency.lockutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.799 183087 DEBUG oslo_concurrency.lockutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.799 183087 DEBUG oslo_concurrency.lockutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.799 183087 DEBUG nova.compute.manager [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Preparing to wait for external event network-vif-plugged-a08840be-e8e9-48ba-9780-050a896d2732 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.800 183087 DEBUG oslo_concurrency.lockutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.800 183087 DEBUG oslo_concurrency.lockutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.800 183087 DEBUG oslo_concurrency.lockutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.801 183087 DEBUG nova.virt.libvirt.vif [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:46:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1494741620',display_name='tempest-server-test-1494741620',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1494741620',id=18,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBONV6jGPWIKrQk26JKZ8H9h2iNqZUCOpmao7Jq+9fEiq3iYmdJdZreCUr9V3PbbF1TPAOou07OOnfHkvbrEzcfNM5ieiMGZPqHbawuPIe3wilad9S814UZ1oxvh/DW+nZg==',key_name='tempest-keypair-test-867467517',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4a559c36b13649d98b2995c099340eb9',ramdisk_id='',reservation_id='r-bo90ukz1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortSecurityTest-508365101',owner_user_name='tempest-PortSecurityTest-508365101-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:46:19Z,user_data=None,user_id='52d582094c584036ba3e04c9da69ee02',uuid=f2de61ba-4b20-445c-bdbb-44bb79fb58c3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd", "address": "fa:16:3e:09:0a:fa", "network": {"id": "bad39ade-29c7-41d5-89dd-fc1845e5f3f2", "bridge": "br-int", "label": "tempest-test-network--1588377042", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.164", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd638f7c-8f", "ovs_interfaceid": "dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.801 183087 DEBUG nova.network.os_vif_util [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converting VIF {"id": "dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd", "address": "fa:16:3e:09:0a:fa", "network": {"id": "bad39ade-29c7-41d5-89dd-fc1845e5f3f2", "bridge": "br-int", "label": "tempest-test-network--1588377042", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.164", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd638f7c-8f", "ovs_interfaceid": "dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.802 183087 DEBUG nova.network.os_vif_util [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:09:0a:fa,bridge_name='br-int',has_traffic_filtering=True,id=dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd,network=Network(bad39ade-29c7-41d5-89dd-fc1845e5f3f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdd638f7c-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.802 183087 DEBUG os_vif [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:09:0a:fa,bridge_name='br-int',has_traffic_filtering=True,id=dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd,network=Network(bad39ade-29c7-41d5-89dd-fc1845e5f3f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdd638f7c-8f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.803 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.804 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.804 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.806 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.807 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdd638f7c-8f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.807 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdd638f7c-8f, col_values=(('external_ids', {'iface-id': 'dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:09:0a:fa', 'vm-uuid': 'f2de61ba-4b20-445c-bdbb-44bb79fb58c3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:46:33 compute-1 NetworkManager[55451]: <info>  [1769417193.8101] manager: (tapdd638f7c-8f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.809 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.813 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.815 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.816 183087 INFO os_vif [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:09:0a:fa,bridge_name='br-int',has_traffic_filtering=True,id=dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd,network=Network(bad39ade-29c7-41d5-89dd-fc1845e5f3f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdd638f7c-8f')
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.817 183087 DEBUG nova.virt.libvirt.vif [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:46:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1494741620',display_name='tempest-server-test-1494741620',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1494741620',id=18,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBONV6jGPWIKrQk26JKZ8H9h2iNqZUCOpmao7Jq+9fEiq3iYmdJdZreCUr9V3PbbF1TPAOou07OOnfHkvbrEzcfNM5ieiMGZPqHbawuPIe3wilad9S814UZ1oxvh/DW+nZg==',key_name='tempest-keypair-test-867467517',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4a559c36b13649d98b2995c099340eb9',ramdisk_id='',reservation_id='r-bo90ukz1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortSecurityTest-508365101',owner_user_name='tempest-PortSecurityTest-508365101-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:46:19Z,user_data=None,user_id='52d582094c584036ba3e04c9da69ee02',uuid=f2de61ba-4b20-445c-bdbb-44bb79fb58c3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a08840be-e8e9-48ba-9780-050a896d2732", "address": "fa:16:3e:a2:ff:78", "network": {"id": "410ad2c8-60c1-40d5-855c-7deeb749f0fe", "bridge": "br-int", "label": "tempest-test-network--1714243150", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "192.168.1.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa08840be-e8", "ovs_interfaceid": "a08840be-e8e9-48ba-9780-050a896d2732", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.818 183087 DEBUG nova.network.os_vif_util [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converting VIF {"id": "a08840be-e8e9-48ba-9780-050a896d2732", "address": "fa:16:3e:a2:ff:78", "network": {"id": "410ad2c8-60c1-40d5-855c-7deeb749f0fe", "bridge": "br-int", "label": "tempest-test-network--1714243150", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "192.168.1.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa08840be-e8", "ovs_interfaceid": "a08840be-e8e9-48ba-9780-050a896d2732", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.819 183087 DEBUG nova.network.os_vif_util [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a2:ff:78,bridge_name='br-int',has_traffic_filtering=True,id=a08840be-e8e9-48ba-9780-050a896d2732,network=Network(410ad2c8-60c1-40d5-855c-7deeb749f0fe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa08840be-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.819 183087 DEBUG os_vif [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:ff:78,bridge_name='br-int',has_traffic_filtering=True,id=a08840be-e8e9-48ba-9780-050a896d2732,network=Network(410ad2c8-60c1-40d5-855c-7deeb749f0fe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa08840be-e8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.820 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.820 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.821 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.823 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.823 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa08840be-e8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.824 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa08840be-e8, col_values=(('external_ids', {'iface-id': 'a08840be-e8e9-48ba-9780-050a896d2732', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a2:ff:78', 'vm-uuid': 'f2de61ba-4b20-445c-bdbb-44bb79fb58c3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.826 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:33 compute-1 NetworkManager[55451]: <info>  [1769417193.8273] manager: (tapa08840be-e8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.828 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.834 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.835 183087 INFO os_vif [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:ff:78,bridge_name='br-int',has_traffic_filtering=True,id=a08840be-e8e9-48ba-9780-050a896d2732,network=Network(410ad2c8-60c1-40d5-855c-7deeb749f0fe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa08840be-e8')
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.894 183087 DEBUG nova.virt.libvirt.driver [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.895 183087 DEBUG nova.virt.libvirt.driver [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.895 183087 DEBUG nova.virt.libvirt.driver [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] No VIF found with MAC fa:16:3e:09:0a:fa, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.895 183087 DEBUG nova.virt.libvirt.driver [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] No VIF found with MAC fa:16:3e:a2:ff:78, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 26 08:46:33 compute-1 nova_compute[183083]: 2026-01-26 08:46:33.896 183087 INFO nova.virt.libvirt.driver [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Using config drive
Jan 26 08:46:34 compute-1 nova_compute[183083]: 2026-01-26 08:46:34.443 183087 INFO nova.virt.libvirt.driver [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Creating config drive at /var/lib/nova/instances/f2de61ba-4b20-445c-bdbb-44bb79fb58c3/disk.config
Jan 26 08:46:34 compute-1 nova_compute[183083]: 2026-01-26 08:46:34.450 183087 DEBUG oslo_concurrency.processutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f2de61ba-4b20-445c-bdbb-44bb79fb58c3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpihuxabwy execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:46:34 compute-1 nova_compute[183083]: 2026-01-26 08:46:34.557 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:34 compute-1 nova_compute[183083]: 2026-01-26 08:46:34.590 183087 DEBUG oslo_concurrency.processutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f2de61ba-4b20-445c-bdbb-44bb79fb58c3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpihuxabwy" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:46:34 compute-1 kernel: tapdd638f7c-8f: entered promiscuous mode
Jan 26 08:46:34 compute-1 NetworkManager[55451]: <info>  [1769417194.6797] manager: (tapdd638f7c-8f): new Tun device (/org/freedesktop/NetworkManager/Devices/47)
Jan 26 08:46:34 compute-1 ovn_controller[95352]: 2026-01-26T08:46:34Z|00096|binding|INFO|Claiming lport dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd for this chassis.
Jan 26 08:46:34 compute-1 nova_compute[183083]: 2026-01-26 08:46:34.686 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:34 compute-1 ovn_controller[95352]: 2026-01-26T08:46:34Z|00097|binding|INFO|dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd: Claiming fa:16:3e:09:0a:fa 192.168.0.164
Jan 26 08:46:34 compute-1 ovn_controller[95352]: 2026-01-26T08:46:34Z|00098|binding|INFO|dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd: Claiming unknown
Jan 26 08:46:34 compute-1 ovn_controller[95352]: 2026-01-26T08:46:34Z|00099|binding|INFO|Setting lport dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd ovn-installed in OVS
Jan 26 08:46:34 compute-1 ovn_controller[95352]: 2026-01-26T08:46:34Z|00100|binding|INFO|Setting lport dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd up in Southbound
Jan 26 08:46:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:34.704 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:09:0a:fa 192.168.0.164', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.164/24', 'neutron:device_id': 'f2de61ba-4b20-445c-bdbb-44bb79fb58c3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bad39ade-29c7-41d5-89dd-fc1845e5f3f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4a559c36b13649d98b2995c099340eb9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7264cb73-6ef7-4995-bc02-8c0dee738bd8, chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:46:34 compute-1 NetworkManager[55451]: <info>  [1769417194.7056] manager: (tapa08840be-e8): new Tun device (/org/freedesktop/NetworkManager/Devices/48)
Jan 26 08:46:34 compute-1 kernel: tapa08840be-e8: entered promiscuous mode
Jan 26 08:46:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:34.708 104632 INFO neutron.agent.ovn.metadata.agent [-] Port dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd in datapath bad39ade-29c7-41d5-89dd-fc1845e5f3f2 bound to our chassis
Jan 26 08:46:34 compute-1 nova_compute[183083]: 2026-01-26 08:46:34.706 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:34 compute-1 ovn_controller[95352]: 2026-01-26T08:46:34Z|00101|binding|INFO|Claiming lport a08840be-e8e9-48ba-9780-050a896d2732 for this chassis.
Jan 26 08:46:34 compute-1 ovn_controller[95352]: 2026-01-26T08:46:34Z|00102|binding|INFO|a08840be-e8e9-48ba-9780-050a896d2732: Claiming fa:16:3e:a2:ff:78 192.168.1.12
Jan 26 08:46:34 compute-1 ovn_controller[95352]: 2026-01-26T08:46:34Z|00103|binding|INFO|a08840be-e8e9-48ba-9780-050a896d2732: Claiming unknown
Jan 26 08:46:34 compute-1 nova_compute[183083]: 2026-01-26 08:46:34.713 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:34.716 104632 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bad39ade-29c7-41d5-89dd-fc1845e5f3f2
Jan 26 08:46:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:34.727 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a2:ff:78 192.168.1.12', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.1.12/24', 'neutron:device_id': 'f2de61ba-4b20-445c-bdbb-44bb79fb58c3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-410ad2c8-60c1-40d5-855c-7deeb749f0fe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4a559c36b13649d98b2995c099340eb9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db99b112-04a6-4be6-8e9e-7db1f7ce0209, chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=a08840be-e8e9-48ba-9780-050a896d2732) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:46:34 compute-1 ovn_controller[95352]: 2026-01-26T08:46:34Z|00104|binding|INFO|Setting lport a08840be-e8e9-48ba-9780-050a896d2732 ovn-installed in OVS
Jan 26 08:46:34 compute-1 ovn_controller[95352]: 2026-01-26T08:46:34Z|00105|binding|INFO|Setting lport a08840be-e8e9-48ba-9780-050a896d2732 up in Southbound
Jan 26 08:46:34 compute-1 nova_compute[183083]: 2026-01-26 08:46:34.732 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:34 compute-1 systemd-udevd[213882]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 08:46:34 compute-1 systemd-udevd[213883]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 08:46:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:34.745 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[d353a55e-e40f-461f-997f-696ba3ccf73c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:46:34 compute-1 NetworkManager[55451]: <info>  [1769417194.7596] device (tapa08840be-e8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 08:46:34 compute-1 NetworkManager[55451]: <info>  [1769417194.7607] device (tapa08840be-e8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 08:46:34 compute-1 NetworkManager[55451]: <info>  [1769417194.7612] device (tapdd638f7c-8f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 08:46:34 compute-1 NetworkManager[55451]: <info>  [1769417194.7632] device (tapdd638f7c-8f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 08:46:34 compute-1 systemd-machined[154360]: New machine qemu-5-instance-00000012.
Jan 26 08:46:34 compute-1 systemd[1]: Started Virtual Machine qemu-5-instance-00000012.
Jan 26 08:46:34 compute-1 podman[213860]: 2026-01-26 08:46:34.788865571 +0000 UTC m=+0.110383707 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 08:46:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:34.791 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[e6841895-4a15-4bd2-a118-1d8c98928ee5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:46:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:34.794 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[9098865d-c494-4be8-a957-0fa8b6e2f9ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:46:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:34.823 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[fe2a7ed4-6a2d-45d8-b16e-fa94aecd04d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:46:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:34.841 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[9a9c9d71-af5e-41dc-a82b-03bedcb751f4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbad39ade-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:84:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 15, 'tx_packets': 5, 'rx_bytes': 1022, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 15, 'tx_packets': 5, 'rx_bytes': 1022, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347684, 'reachable_time': 21006, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 10, 'inoctets': 672, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 10, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 672, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 10, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213908, 'error': None, 'target': 'ovnmeta-bad39ade-29c7-41d5-89dd-fc1845e5f3f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:46:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:34.859 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[bd885e68-be62-4117-893a-4b3bee804839]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapbad39ade-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 347698, 'tstamp': 347698}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213909, 'error': None, 'target': 'ovnmeta-bad39ade-29c7-41d5-89dd-fc1845e5f3f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '192.168.0.2'], ['IFA_LOCAL', '192.168.0.2'], ['IFA_BROADCAST', '192.168.0.255'], ['IFA_LABEL', 'tapbad39ade-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 347701, 'tstamp': 347701}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213909, 'error': None, 'target': 'ovnmeta-bad39ade-29c7-41d5-89dd-fc1845e5f3f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:46:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:34.861 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbad39ade-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:46:34 compute-1 nova_compute[183083]: 2026-01-26 08:46:34.863 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:34 compute-1 nova_compute[183083]: 2026-01-26 08:46:34.864 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:34.865 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbad39ade-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:46:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:34.866 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:46:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:34.867 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbad39ade-20, col_values=(('external_ids', {'iface-id': '809259ab-8ea4-4909-92b4-4ee536a51482'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:46:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:34.868 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:46:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:34.869 104632 INFO neutron.agent.ovn.metadata.agent [-] Port a08840be-e8e9-48ba-9780-050a896d2732 in datapath 410ad2c8-60c1-40d5-855c-7deeb749f0fe unbound from our chassis
Jan 26 08:46:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:34.875 104632 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 410ad2c8-60c1-40d5-855c-7deeb749f0fe
Jan 26 08:46:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:34.893 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[952e242b-0143-4c82-99fb-f3434a0ea960]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:46:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:34.938 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[1d6d8f3b-157f-491c-aafd-9771e2f8f93d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:46:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:34.941 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[a1d198b6-3d98-4904-bfdd-8df8fe4def6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:46:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:34.981 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[77cf2187-9944-4425-8fcd-836e6700d64b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:46:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:35.007 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[f8967391-e2a6-4f7c-9c11-4bd88b86c910]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap410ad2c8-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b0:bf:54'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 16, 'tx_packets': 6, 'rx_bytes': 1108, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 16, 'tx_packets': 6, 'rx_bytes': 1108, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347787, 'reachable_time': 26714, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 11, 'inoctets': 744, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 11, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 744, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 11, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213916, 'error': None, 'target': 'ovnmeta-410ad2c8-60c1-40d5-855c-7deeb749f0fe', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:46:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:35.040 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[0dd20adf-c66f-43c6-8aec-1cc962161d18]: (4, ({'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '192.168.1.1'], ['IFA_LOCAL', '192.168.1.1'], ['IFA_BROADCAST', '192.168.1.255'], ['IFA_LABEL', 'tap410ad2c8-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 347803, 'tstamp': 347803}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213917, 'error': None, 'target': 'ovnmeta-410ad2c8-60c1-40d5-855c-7deeb749f0fe', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap410ad2c8-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 347808, 'tstamp': 347808}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213917, 'error': None, 'target': 'ovnmeta-410ad2c8-60c1-40d5-855c-7deeb749f0fe', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:46:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:35.043 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap410ad2c8-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.044 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.046 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:35.046 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap410ad2c8-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:46:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:35.046 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:46:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:35.047 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap410ad2c8-60, col_values=(('external_ids', {'iface-id': 'bfb9744a-58f0-4145-9e28-9c13225b3407'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:46:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:35.047 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.164 183087 DEBUG nova.network.neutron [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Updating instance_info_cache with network_info: [{"id": "9a499fec-d7c8-40bf-a913-c25d3ee8a2fc", "address": "fa:16:3e:0e:ab:41", "network": {"id": "c1d7226c-03c9-4435-8d74-80a043aa071e", "bridge": "br-int", "label": "tempest-test-network--1841496745", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a3694415e0ac483fa070e7316b146fc1", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a499fec-d7", "ovs_interfaceid": "9a499fec-d7c8-40bf-a913-c25d3ee8a2fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.191 183087 DEBUG oslo_concurrency.lockutils [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Releasing lock "refresh_cache-d6141ab1-a933-4709-8f5b-f95abb9902ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.191 183087 DEBUG nova.compute.manager [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Instance network_info: |[{"id": "9a499fec-d7c8-40bf-a913-c25d3ee8a2fc", "address": "fa:16:3e:0e:ab:41", "network": {"id": "c1d7226c-03c9-4435-8d74-80a043aa071e", "bridge": "br-int", "label": "tempest-test-network--1841496745", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a3694415e0ac483fa070e7316b146fc1", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a499fec-d7", "ovs_interfaceid": "9a499fec-d7c8-40bf-a913-c25d3ee8a2fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.192 183087 DEBUG oslo_concurrency.lockutils [req-546b703d-4f48-46be-8aad-708bc069c82f req-d16a9b23-63c5-444e-a177-46b8094c531e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-d6141ab1-a933-4709-8f5b-f95abb9902ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.192 183087 DEBUG nova.network.neutron [req-546b703d-4f48-46be-8aad-708bc069c82f req-d16a9b23-63c5-444e-a177-46b8094c531e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Refreshing network info cache for port 9a499fec-d7c8-40bf-a913-c25d3ee8a2fc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.193 183087 INFO nova.compute.manager [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Terminating instance
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.195 183087 DEBUG nova.compute.manager [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.199 183087 DEBUG nova.virt.libvirt.driver [-] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.199 183087 INFO nova.virt.libvirt.driver [-] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Instance destroyed successfully.
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.200 183087 DEBUG nova.virt.libvirt.vif [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:46:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-160822683',display_name='tempest-server-test-160822683',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-160822683',id=19,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMOAo9U7CQ6+MUYlZi/t9zlH/+rAp796cJ8WJs24Medi+Z15tpQDat3ArX4YiDk1KZuy6uq3/eoJDuJSWU3b/ZFp6PXjDw49aWrQKlvfg9FnwF0SFbwb28a5Q4WXO/+Jdg==',key_name='tempest-keypair-test-1873335547',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a3694415e0ac483fa070e7316b146fc1',ramdisk_id='',reservation_id='r-lgtyhrux',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestOvn-2033283083',owner_user_name='tempest-QosTestOvn-2033283083-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:46:22Z,user_data=None,user_id='add713470fcc438f95ec0ff89dbb2adc',uuid=d6141ab1-a933-4709-8f5b-f95abb9902ff,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9a499fec-d7c8-40bf-a913-c25d3ee8a2fc", "address": "fa:16:3e:0e:ab:41", "network": {"id": "c1d7226c-03c9-4435-8d74-80a043aa071e", "bridge": "br-int", "label": "tempest-test-network--1841496745", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a3694415e0ac483fa070e7316b146fc1", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a499fec-d7", "ovs_interfaceid": "9a499fec-d7c8-40bf-a913-c25d3ee8a2fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.200 183087 DEBUG nova.network.os_vif_util [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Converting VIF {"id": "9a499fec-d7c8-40bf-a913-c25d3ee8a2fc", "address": "fa:16:3e:0e:ab:41", "network": {"id": "c1d7226c-03c9-4435-8d74-80a043aa071e", "bridge": "br-int", "label": "tempest-test-network--1841496745", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a3694415e0ac483fa070e7316b146fc1", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a499fec-d7", "ovs_interfaceid": "9a499fec-d7c8-40bf-a913-c25d3ee8a2fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.201 183087 DEBUG nova.network.os_vif_util [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0e:ab:41,bridge_name='br-int',has_traffic_filtering=True,id=9a499fec-d7c8-40bf-a913-c25d3ee8a2fc,network=Network(c1d7226c-03c9-4435-8d74-80a043aa071e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a499fec-d7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.202 183087 DEBUG os_vif [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:ab:41,bridge_name='br-int',has_traffic_filtering=True,id=9a499fec-d7c8-40bf-a913-c25d3ee8a2fc,network=Network(c1d7226c-03c9-4435-8d74-80a043aa071e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a499fec-d7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.203 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.204 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9a499fec-d7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.204 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.206 183087 INFO os_vif [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:ab:41,bridge_name='br-int',has_traffic_filtering=True,id=9a499fec-d7c8-40bf-a913-c25d3ee8a2fc,network=Network(c1d7226c-03c9-4435-8d74-80a043aa071e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a499fec-d7')
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.207 183087 INFO nova.virt.libvirt.driver [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Deleting instance files /var/lib/nova/instances/d6141ab1-a933-4709-8f5b-f95abb9902ff_del
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.207 183087 INFO nova.virt.libvirt.driver [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Deletion of /var/lib/nova/instances/d6141ab1-a933-4709-8f5b-f95abb9902ff_del complete
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.256 183087 INFO nova.compute.manager [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Took 0.06 seconds to destroy the instance on the hypervisor.
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.258 183087 DEBUG nova.compute.claims [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Aborting claim: <nova.compute.claims.Claim object at 0x7f6c982a5e50> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.259 183087 DEBUG oslo_concurrency.lockutils [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.259 183087 DEBUG oslo_concurrency.lockutils [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.315 183087 DEBUG nova.compute.manager [req-4cfcdcc3-e464-4653-af1f-0214a8419ed3 req-8d7617f3-4881-498f-b800-69b9efd23073 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Received event network-vif-plugged-dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.316 183087 DEBUG oslo_concurrency.lockutils [req-4cfcdcc3-e464-4653-af1f-0214a8419ed3 req-8d7617f3-4881-498f-b800-69b9efd23073 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.316 183087 DEBUG oslo_concurrency.lockutils [req-4cfcdcc3-e464-4653-af1f-0214a8419ed3 req-8d7617f3-4881-498f-b800-69b9efd23073 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.317 183087 DEBUG oslo_concurrency.lockutils [req-4cfcdcc3-e464-4653-af1f-0214a8419ed3 req-8d7617f3-4881-498f-b800-69b9efd23073 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.317 183087 DEBUG nova.compute.manager [req-4cfcdcc3-e464-4653-af1f-0214a8419ed3 req-8d7617f3-4881-498f-b800-69b9efd23073 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Processing event network-vif-plugged-dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.439 183087 DEBUG nova.compute.provider_tree [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.458 183087 DEBUG nova.scheduler.client.report [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.472 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417195.4724357, f2de61ba-4b20-445c-bdbb-44bb79fb58c3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.473 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] VM Started (Lifecycle Event)
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.490 183087 DEBUG oslo_concurrency.lockutils [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.231s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.491 183087 DEBUG nova.compute.utils [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.491 183087 ERROR nova.compute.manager [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Build of instance d6141ab1-a933-4709-8f5b-f95abb9902ff aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format: nova.exception.BuildAbortException: Build of instance d6141ab1-a933-4709-8f5b-f95abb9902ff aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.492 183087 DEBUG nova.compute.manager [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.492 183087 DEBUG nova.virt.libvirt.vif [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-26T08:46:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-160822683',display_name='tempest-server-test-160822683',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host=None,hostname='tempest-server-test-160822683',id=19,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMOAo9U7CQ6+MUYlZi/t9zlH/+rAp796cJ8WJs24Medi+Z15tpQDat3ArX4YiDk1KZuy6uq3/eoJDuJSWU3b/ZFp6PXjDw49aWrQKlvfg9FnwF0SFbwb28a5Q4WXO/+Jdg==',key_name='tempest-keypair-test-1873335547',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node=None,numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a3694415e0ac483fa070e7316b146fc1',ramdisk_id='',reservation_id='r-lgtyhrux',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestOvn-2033283083',owner_user_name='tempest-QosTestOvn-2033283083-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:46:35Z,user_data=None,user_id='add713470fcc438f95ec0ff89dbb2adc',uuid=d6141ab1-a933-4709-8f5b-f95abb9902ff,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9a499fec-d7c8-40bf-a913-c25d3ee8a2fc", "address": "fa:16:3e:0e:ab:41", "network": {"id": "c1d7226c-03c9-4435-8d74-80a043aa071e", "bridge": "br-int", "label": "tempest-test-network--1841496745", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a3694415e0ac483fa070e7316b146fc1", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a499fec-d7", "ovs_interfaceid": "9a499fec-d7c8-40bf-a913-c25d3ee8a2fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.493 183087 DEBUG nova.network.os_vif_util [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Converting VIF {"id": "9a499fec-d7c8-40bf-a913-c25d3ee8a2fc", "address": "fa:16:3e:0e:ab:41", "network": {"id": "c1d7226c-03c9-4435-8d74-80a043aa071e", "bridge": "br-int", "label": "tempest-test-network--1841496745", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a3694415e0ac483fa070e7316b146fc1", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a499fec-d7", "ovs_interfaceid": "9a499fec-d7c8-40bf-a913-c25d3ee8a2fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.493 183087 DEBUG nova.network.os_vif_util [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0e:ab:41,bridge_name='br-int',has_traffic_filtering=True,id=9a499fec-d7c8-40bf-a913-c25d3ee8a2fc,network=Network(c1d7226c-03c9-4435-8d74-80a043aa071e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a499fec-d7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.493 183087 DEBUG os_vif [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:ab:41,bridge_name='br-int',has_traffic_filtering=True,id=9a499fec-d7c8-40bf-a913-c25d3ee8a2fc,network=Network(c1d7226c-03c9-4435-8d74-80a043aa071e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a499fec-d7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.495 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.495 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9a499fec-d7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.495 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.496 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.497 183087 INFO os_vif [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:ab:41,bridge_name='br-int',has_traffic_filtering=True,id=9a499fec-d7c8-40bf-a913-c25d3ee8a2fc,network=Network(c1d7226c-03c9-4435-8d74-80a043aa071e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a499fec-d7')
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.497 183087 DEBUG nova.compute.manager [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.497 183087 DEBUG nova.compute.manager [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.498 183087 DEBUG nova.network.neutron [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.501 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417195.472549, f2de61ba-4b20-445c-bdbb-44bb79fb58c3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.501 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] VM Paused (Lifecycle Event)
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.525 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.528 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 08:46:35 compute-1 nova_compute[183083]: 2026-01-26 08:46:35.555 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 08:46:36 compute-1 nova_compute[183083]: 2026-01-26 08:46:36.170 183087 DEBUG nova.network.neutron [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:46:36 compute-1 nova_compute[183083]: 2026-01-26 08:46:36.194 183087 INFO nova.compute.manager [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Took 0.70 seconds to deallocate network for instance.
Jan 26 08:46:36 compute-1 nova_compute[183083]: 2026-01-26 08:46:36.483 183087 INFO nova.scheduler.client.report [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Deleted allocations for instance d6141ab1-a933-4709-8f5b-f95abb9902ff
Jan 26 08:46:36 compute-1 nova_compute[183083]: 2026-01-26 08:46:36.485 183087 DEBUG oslo_concurrency.lockutils [None req-2c50b739-a3de-4d39-b7b0-7aff564b7e3f add713470fcc438f95ec0ff89dbb2adc a3694415e0ac483fa070e7316b146fc1 - - default default] Lock "d6141ab1-a933-4709-8f5b-f95abb9902ff" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.271s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:37 compute-1 nova_compute[183083]: 2026-01-26 08:46:37.221 183087 DEBUG nova.network.neutron [req-68778ab1-2d36-4512-812c-a62fbf62c8a2 req-111b8619-5742-4f13-a54e-9a99f44759b9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Updated VIF entry in instance network info cache for port a08840be-e8e9-48ba-9780-050a896d2732. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:46:37 compute-1 nova_compute[183083]: 2026-01-26 08:46:37.222 183087 DEBUG nova.network.neutron [req-68778ab1-2d36-4512-812c-a62fbf62c8a2 req-111b8619-5742-4f13-a54e-9a99f44759b9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Updating instance_info_cache with network_info: [{"id": "dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd", "address": "fa:16:3e:09:0a:fa", "network": {"id": "bad39ade-29c7-41d5-89dd-fc1845e5f3f2", "bridge": "br-int", "label": "tempest-test-network--1588377042", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.164", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd638f7c-8f", "ovs_interfaceid": "dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "a08840be-e8e9-48ba-9780-050a896d2732", "address": "fa:16:3e:a2:ff:78", "network": {"id": "410ad2c8-60c1-40d5-855c-7deeb749f0fe", "bridge": "br-int", "label": "tempest-test-network--1714243150", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "192.168.1.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa08840be-e8", "ovs_interfaceid": "a08840be-e8e9-48ba-9780-050a896d2732", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:46:37 compute-1 nova_compute[183083]: 2026-01-26 08:46:37.246 183087 DEBUG oslo_concurrency.lockutils [req-68778ab1-2d36-4512-812c-a62fbf62c8a2 req-111b8619-5742-4f13-a54e-9a99f44759b9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-f2de61ba-4b20-445c-bdbb-44bb79fb58c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:46:37 compute-1 nova_compute[183083]: 2026-01-26 08:46:37.678 183087 DEBUG nova.compute.manager [req-f5fe2cac-5079-412d-bb14-d9fee12d8fa0 req-4c08b5bf-6da2-4a4d-993d-3b31916205a8 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Received event network-vif-plugged-dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:46:37 compute-1 nova_compute[183083]: 2026-01-26 08:46:37.679 183087 DEBUG oslo_concurrency.lockutils [req-f5fe2cac-5079-412d-bb14-d9fee12d8fa0 req-4c08b5bf-6da2-4a4d-993d-3b31916205a8 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:37 compute-1 nova_compute[183083]: 2026-01-26 08:46:37.679 183087 DEBUG oslo_concurrency.lockutils [req-f5fe2cac-5079-412d-bb14-d9fee12d8fa0 req-4c08b5bf-6da2-4a4d-993d-3b31916205a8 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:37 compute-1 nova_compute[183083]: 2026-01-26 08:46:37.680 183087 DEBUG oslo_concurrency.lockutils [req-f5fe2cac-5079-412d-bb14-d9fee12d8fa0 req-4c08b5bf-6da2-4a4d-993d-3b31916205a8 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:37 compute-1 nova_compute[183083]: 2026-01-26 08:46:37.680 183087 DEBUG nova.compute.manager [req-f5fe2cac-5079-412d-bb14-d9fee12d8fa0 req-4c08b5bf-6da2-4a4d-993d-3b31916205a8 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] No event matching network-vif-plugged-dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd in dict_keys([('network-vif-plugged', 'a08840be-e8e9-48ba-9780-050a896d2732')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325
Jan 26 08:46:37 compute-1 nova_compute[183083]: 2026-01-26 08:46:37.681 183087 WARNING nova.compute.manager [req-f5fe2cac-5079-412d-bb14-d9fee12d8fa0 req-4c08b5bf-6da2-4a4d-993d-3b31916205a8 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Received unexpected event network-vif-plugged-dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd for instance with vm_state building and task_state spawning.
Jan 26 08:46:37 compute-1 nova_compute[183083]: 2026-01-26 08:46:37.681 183087 DEBUG nova.compute.manager [req-f5fe2cac-5079-412d-bb14-d9fee12d8fa0 req-4c08b5bf-6da2-4a4d-993d-3b31916205a8 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Received event network-vif-plugged-a08840be-e8e9-48ba-9780-050a896d2732 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:46:37 compute-1 nova_compute[183083]: 2026-01-26 08:46:37.682 183087 DEBUG oslo_concurrency.lockutils [req-f5fe2cac-5079-412d-bb14-d9fee12d8fa0 req-4c08b5bf-6da2-4a4d-993d-3b31916205a8 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:37 compute-1 nova_compute[183083]: 2026-01-26 08:46:37.683 183087 DEBUG oslo_concurrency.lockutils [req-f5fe2cac-5079-412d-bb14-d9fee12d8fa0 req-4c08b5bf-6da2-4a4d-993d-3b31916205a8 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:37 compute-1 nova_compute[183083]: 2026-01-26 08:46:37.683 183087 DEBUG oslo_concurrency.lockutils [req-f5fe2cac-5079-412d-bb14-d9fee12d8fa0 req-4c08b5bf-6da2-4a4d-993d-3b31916205a8 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:37 compute-1 nova_compute[183083]: 2026-01-26 08:46:37.684 183087 DEBUG nova.compute.manager [req-f5fe2cac-5079-412d-bb14-d9fee12d8fa0 req-4c08b5bf-6da2-4a4d-993d-3b31916205a8 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Processing event network-vif-plugged-a08840be-e8e9-48ba-9780-050a896d2732 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 26 08:46:37 compute-1 nova_compute[183083]: 2026-01-26 08:46:37.685 183087 DEBUG nova.compute.manager [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Instance event wait completed in 2 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 26 08:46:37 compute-1 nova_compute[183083]: 2026-01-26 08:46:37.692 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417197.6924088, f2de61ba-4b20-445c-bdbb-44bb79fb58c3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:46:37 compute-1 nova_compute[183083]: 2026-01-26 08:46:37.693 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] VM Resumed (Lifecycle Event)
Jan 26 08:46:37 compute-1 nova_compute[183083]: 2026-01-26 08:46:37.694 183087 DEBUG nova.virt.libvirt.driver [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 26 08:46:37 compute-1 nova_compute[183083]: 2026-01-26 08:46:37.699 183087 INFO nova.virt.libvirt.driver [-] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Instance spawned successfully.
Jan 26 08:46:37 compute-1 nova_compute[183083]: 2026-01-26 08:46:37.699 183087 DEBUG nova.virt.libvirt.driver [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 26 08:46:37 compute-1 nova_compute[183083]: 2026-01-26 08:46:37.731 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:46:37 compute-1 nova_compute[183083]: 2026-01-26 08:46:37.735 183087 DEBUG nova.virt.libvirt.driver [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:46:37 compute-1 nova_compute[183083]: 2026-01-26 08:46:37.736 183087 DEBUG nova.virt.libvirt.driver [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:46:37 compute-1 nova_compute[183083]: 2026-01-26 08:46:37.736 183087 DEBUG nova.virt.libvirt.driver [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:46:37 compute-1 nova_compute[183083]: 2026-01-26 08:46:37.736 183087 DEBUG nova.virt.libvirt.driver [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:46:37 compute-1 nova_compute[183083]: 2026-01-26 08:46:37.737 183087 DEBUG nova.virt.libvirt.driver [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:46:37 compute-1 nova_compute[183083]: 2026-01-26 08:46:37.737 183087 DEBUG nova.virt.libvirt.driver [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:46:37 compute-1 nova_compute[183083]: 2026-01-26 08:46:37.741 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 08:46:37 compute-1 nova_compute[183083]: 2026-01-26 08:46:37.777 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 08:46:37 compute-1 nova_compute[183083]: 2026-01-26 08:46:37.824 183087 INFO nova.compute.manager [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Took 18.28 seconds to spawn the instance on the hypervisor.
Jan 26 08:46:37 compute-1 nova_compute[183083]: 2026-01-26 08:46:37.824 183087 DEBUG nova.compute.manager [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:46:37 compute-1 nova_compute[183083]: 2026-01-26 08:46:37.898 183087 INFO nova.compute.manager [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Took 18.85 seconds to build instance.
Jan 26 08:46:37 compute-1 nova_compute[183083]: 2026-01-26 08:46:37.914 183087 DEBUG oslo_concurrency.lockutils [None req-d9e0f216-077c-4237-8de1-cdc2f101fed9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.937s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:38 compute-1 nova_compute[183083]: 2026-01-26 08:46:38.827 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:38 compute-1 nova_compute[183083]: 2026-01-26 08:46:38.849 183087 DEBUG nova.network.neutron [req-546b703d-4f48-46be-8aad-708bc069c82f req-d16a9b23-63c5-444e-a177-46b8094c531e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Updated VIF entry in instance network info cache for port 9a499fec-d7c8-40bf-a913-c25d3ee8a2fc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:46:38 compute-1 nova_compute[183083]: 2026-01-26 08:46:38.850 183087 DEBUG nova.network.neutron [req-546b703d-4f48-46be-8aad-708bc069c82f req-d16a9b23-63c5-444e-a177-46b8094c531e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d6141ab1-a933-4709-8f5b-f95abb9902ff] Updating instance_info_cache with network_info: [{"id": "9a499fec-d7c8-40bf-a913-c25d3ee8a2fc", "address": "fa:16:3e:0e:ab:41", "network": {"id": "c1d7226c-03c9-4435-8d74-80a043aa071e", "bridge": "br-int", "label": "tempest-test-network--1841496745", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a3694415e0ac483fa070e7316b146fc1", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a499fec-d7", "ovs_interfaceid": "9a499fec-d7c8-40bf-a913-c25d3ee8a2fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:46:38 compute-1 nova_compute[183083]: 2026-01-26 08:46:38.876 183087 DEBUG oslo_concurrency.lockutils [req-546b703d-4f48-46be-8aad-708bc069c82f req-d16a9b23-63c5-444e-a177-46b8094c531e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-d6141ab1-a933-4709-8f5b-f95abb9902ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:46:39 compute-1 nova_compute[183083]: 2026-01-26 08:46:39.129 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:39 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:39.127 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:46:39 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:39.128 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 08:46:39 compute-1 nova_compute[183083]: 2026-01-26 08:46:39.559 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:39 compute-1 nova_compute[183083]: 2026-01-26 08:46:39.877 183087 DEBUG nova.compute.manager [req-16d29eaf-41db-4289-a3ee-e10109740201 req-008b2e58-0f2b-4a0f-8e80-963d39b52adf 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Received event network-vif-plugged-a08840be-e8e9-48ba-9780-050a896d2732 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:46:39 compute-1 nova_compute[183083]: 2026-01-26 08:46:39.877 183087 DEBUG oslo_concurrency.lockutils [req-16d29eaf-41db-4289-a3ee-e10109740201 req-008b2e58-0f2b-4a0f-8e80-963d39b52adf 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:39 compute-1 nova_compute[183083]: 2026-01-26 08:46:39.878 183087 DEBUG oslo_concurrency.lockutils [req-16d29eaf-41db-4289-a3ee-e10109740201 req-008b2e58-0f2b-4a0f-8e80-963d39b52adf 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:39 compute-1 nova_compute[183083]: 2026-01-26 08:46:39.878 183087 DEBUG oslo_concurrency.lockutils [req-16d29eaf-41db-4289-a3ee-e10109740201 req-008b2e58-0f2b-4a0f-8e80-963d39b52adf 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:39 compute-1 nova_compute[183083]: 2026-01-26 08:46:39.878 183087 DEBUG nova.compute.manager [req-16d29eaf-41db-4289-a3ee-e10109740201 req-008b2e58-0f2b-4a0f-8e80-963d39b52adf 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] No waiting events found dispatching network-vif-plugged-a08840be-e8e9-48ba-9780-050a896d2732 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:46:39 compute-1 nova_compute[183083]: 2026-01-26 08:46:39.878 183087 WARNING nova.compute.manager [req-16d29eaf-41db-4289-a3ee-e10109740201 req-008b2e58-0f2b-4a0f-8e80-963d39b52adf 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Received unexpected event network-vif-plugged-a08840be-e8e9-48ba-9780-050a896d2732 for instance with vm_state active and task_state None.
Jan 26 08:46:39 compute-1 nova_compute[183083]: 2026-01-26 08:46:39.936 183087 INFO nova.compute.manager [None req-d3893a24-b638-45c0-8dec-6af1799a656d 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Get console output
Jan 26 08:46:39 compute-1 nova_compute[183083]: 2026-01-26 08:46:39.940 212375 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 26 08:46:40 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:40.131 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:46:40 compute-1 nova_compute[183083]: 2026-01-26 08:46:40.437 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:42 compute-1 sshd-session[213926]: Connection closed by authenticating user root 159.223.236.81 port 56718 [preauth]
Jan 26 08:46:43 compute-1 nova_compute[183083]: 2026-01-26 08:46:43.830 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:44 compute-1 nova_compute[183083]: 2026-01-26 08:46:44.563 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.101 183087 DEBUG oslo_concurrency.lockutils [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Acquiring lock "7e96739c-7a11-441e-b8cd-282ed2718e81" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.101 183087 DEBUG oslo_concurrency.lockutils [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Lock "7e96739c-7a11-441e-b8cd-282ed2718e81" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.133 183087 DEBUG nova.compute.manager [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.178 183087 INFO nova.compute.manager [None req-44f4a8ae-45fe-4df0-a262-393cef60a3ab 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Get console output
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.184 212375 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.218 183087 DEBUG oslo_concurrency.lockutils [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.219 183087 DEBUG oslo_concurrency.lockutils [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.226 183087 DEBUG nova.virt.hardware [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.226 183087 INFO nova.compute.claims [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.291 183087 DEBUG oslo_concurrency.lockutils [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Acquiring lock "25dc96cb-bdec-4220-933c-db906d35e033" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.291 183087 DEBUG oslo_concurrency.lockutils [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "25dc96cb-bdec-4220-933c-db906d35e033" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.316 183087 DEBUG nova.compute.manager [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.412 183087 DEBUG oslo_concurrency.lockutils [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.434 183087 DEBUG nova.compute.provider_tree [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.449 183087 DEBUG nova.scheduler.client.report [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.475 183087 DEBUG oslo_concurrency.lockutils [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.256s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.475 183087 DEBUG nova.compute.manager [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.477 183087 DEBUG oslo_concurrency.lockutils [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.065s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.483 183087 DEBUG nova.virt.hardware [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.483 183087 INFO nova.compute.claims [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.546 183087 DEBUG nova.compute.manager [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.547 183087 DEBUG nova.network.neutron [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.582 183087 INFO nova.virt.libvirt.driver [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.598 183087 DEBUG nova.compute.manager [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.676 183087 DEBUG nova.compute.provider_tree [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.693 183087 DEBUG nova.compute.manager [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.694 183087 DEBUG nova.virt.libvirt.driver [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.695 183087 INFO nova.virt.libvirt.driver [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Creating image(s)
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.696 183087 DEBUG oslo_concurrency.lockutils [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Acquiring lock "/var/lib/nova/instances/7e96739c-7a11-441e-b8cd-282ed2718e81/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.697 183087 DEBUG oslo_concurrency.lockutils [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Lock "/var/lib/nova/instances/7e96739c-7a11-441e-b8cd-282ed2718e81/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.697 183087 DEBUG oslo_concurrency.lockutils [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Lock "/var/lib/nova/instances/7e96739c-7a11-441e-b8cd-282ed2718e81/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.698 183087 DEBUG oslo_concurrency.lockutils [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.699 183087 DEBUG oslo_concurrency.lockutils [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.705 183087 DEBUG nova.scheduler.client.report [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.730 183087 DEBUG oslo_concurrency.lockutils [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.730 183087 DEBUG nova.compute.manager [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.774 183087 DEBUG nova.compute.manager [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.775 183087 DEBUG nova.network.neutron [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.793 183087 INFO nova.virt.libvirt.driver [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.812 183087 DEBUG nova.compute.manager [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.927 183087 DEBUG nova.compute.manager [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.929 183087 DEBUG nova.virt.libvirt.driver [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.929 183087 INFO nova.virt.libvirt.driver [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Creating image(s)
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.930 183087 DEBUG oslo_concurrency.lockutils [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Acquiring lock "/var/lib/nova/instances/25dc96cb-bdec-4220-933c-db906d35e033/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.931 183087 DEBUG oslo_concurrency.lockutils [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "/var/lib/nova/instances/25dc96cb-bdec-4220-933c-db906d35e033/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.932 183087 DEBUG oslo_concurrency.lockutils [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "/var/lib/nova/instances/25dc96cb-bdec-4220-933c-db906d35e033/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:45 compute-1 nova_compute[183083]: 2026-01-26 08:46:45.932 183087 DEBUG oslo_concurrency.lockutils [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:46 compute-1 nova_compute[183083]: 2026-01-26 08:46:46.083 183087 DEBUG nova.policy [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a7abeebb4e4d469c91e6cee77f6be1c3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b71ae2b9d2fd454b8b3b9aa1a0e5c7e4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:46:46 compute-1 nova_compute[183083]: 2026-01-26 08:46:46.141 183087 DEBUG nova.policy [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '41a09f4d7f034b1c85f20c9512d33411', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '71cced1777f24868932d789154ff04a0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.015 183087 DEBUG nova.network.neutron [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Successfully created port: d3bcb06d-6162-48f9-b729-6d1c166209d0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 DEBUG oslo_concurrency.lockutils [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.952s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Instance failed to spawn: nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Traceback (most recent call last):
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 170, in do_image_deep_inspection
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81]     raise exception.ImageUnacceptable(
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image content does not match disk_format
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] 
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] During handling of the above exception, another exception occurred:
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] 
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Traceback (most recent call last):
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81]     yield resources
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81]     self.driver.spawn(context, instance, image_meta,
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4384, in spawn
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81]     created_instance_dir, created_disks = self._create_image(
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4786, in _create_image
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81]     created_disks = self._create_and_inject_local_root(
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4917, in _create_and_inject_local_root
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81]     self._try_fetch_image_cache(backend, fetch_func, context,
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 10963, in _try_fetch_image_cache
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81]     image.cache(fetch_func=fetch_func,
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 288, in cache
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81]     self.create_image(fetch_func_sync, base, size,
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 663, in create_image
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81]     prepare_template(target=base, *args, **kwargs)
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81]   File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81]     return f(*args, **kwargs)
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 285, in fetch_func_sync
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81]     fetch_func(target=target, *args, **kwargs)
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/utils.py", line 482, in fetch_image
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81]     images.fetch_to_raw(context, image_id, target, trusted_certs)
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 199, in fetch_to_raw
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81]     force_format = do_image_deep_inspection(img, image_href, path_tmp)
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 180, in do_image_deep_inspection
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81]     raise exception.ImageUnacceptable(
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.651 183087 ERROR nova.compute.manager [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] 
Jan 26 08:46:47 compute-1 nova_compute[183083]: 2026-01-26 08:46:47.653 183087 DEBUG oslo_concurrency.lockutils [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 1.721s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:47 compute-1 podman[213928]: 2026-01-26 08:46:47.971472278 +0000 UTC m=+0.079153557 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 26 08:46:47 compute-1 podman[213929]: 2026-01-26 08:46:47.977167399 +0000 UTC m=+0.084819217 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, name=ubi9-minimal, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, architecture=x86_64, version=9.6)
Jan 26 08:46:48 compute-1 nova_compute[183083]: 2026-01-26 08:46:48.478 183087 DEBUG nova.network.neutron [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Successfully updated port: d3bcb06d-6162-48f9-b729-6d1c166209d0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:46:48 compute-1 nova_compute[183083]: 2026-01-26 08:46:48.498 183087 DEBUG oslo_concurrency.lockutils [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Acquiring lock "refresh_cache-25dc96cb-bdec-4220-933c-db906d35e033" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:46:48 compute-1 nova_compute[183083]: 2026-01-26 08:46:48.499 183087 DEBUG oslo_concurrency.lockutils [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Acquired lock "refresh_cache-25dc96cb-bdec-4220-933c-db906d35e033" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:46:48 compute-1 nova_compute[183083]: 2026-01-26 08:46:48.499 183087 DEBUG nova.network.neutron [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:46:48 compute-1 nova_compute[183083]: 2026-01-26 08:46:48.800 183087 DEBUG nova.network.neutron [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:46:48 compute-1 nova_compute[183083]: 2026-01-26 08:46:48.876 183087 DEBUG nova.network.neutron [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Successfully updated port: 6109dbde-e9d0-48fa-ba7a-b9858052f668 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:46:48 compute-1 nova_compute[183083]: 2026-01-26 08:46:48.878 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:48 compute-1 nova_compute[183083]: 2026-01-26 08:46:48.905 183087 DEBUG oslo_concurrency.lockutils [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Acquiring lock "refresh_cache-7e96739c-7a11-441e-b8cd-282ed2718e81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:46:48 compute-1 nova_compute[183083]: 2026-01-26 08:46:48.906 183087 DEBUG oslo_concurrency.lockutils [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Acquired lock "refresh_cache-7e96739c-7a11-441e-b8cd-282ed2718e81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:46:48 compute-1 nova_compute[183083]: 2026-01-26 08:46:48.906 183087 DEBUG nova.network.neutron [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.035 183087 DEBUG nova.compute.manager [req-91d4bb76-6595-41a2-af94-926833bd1689 req-a04e5e34-6702-4a59-8c30-ecc39a851663 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Received event network-changed-d3bcb06d-6162-48f9-b729-6d1c166209d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.035 183087 DEBUG nova.compute.manager [req-91d4bb76-6595-41a2-af94-926833bd1689 req-a04e5e34-6702-4a59-8c30-ecc39a851663 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Refreshing instance network info cache due to event network-changed-d3bcb06d-6162-48f9-b729-6d1c166209d0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.035 183087 DEBUG oslo_concurrency.lockutils [req-91d4bb76-6595-41a2-af94-926833bd1689 req-a04e5e34-6702-4a59-8c30-ecc39a851663 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-25dc96cb-bdec-4220-933c-db906d35e033" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.193 183087 DEBUG nova.compute.manager [req-bc034280-0a4f-456e-aa0d-555fa376c6d5 req-a641cb3f-34ec-499f-b9ab-3fb8cec5bdde 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Received event network-changed-6109dbde-e9d0-48fa-ba7a-b9858052f668 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.193 183087 DEBUG nova.compute.manager [req-bc034280-0a4f-456e-aa0d-555fa376c6d5 req-a641cb3f-34ec-499f-b9ab-3fb8cec5bdde 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Refreshing instance network info cache due to event network-changed-6109dbde-e9d0-48fa-ba7a-b9858052f668. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.194 183087 DEBUG oslo_concurrency.lockutils [req-bc034280-0a4f-456e-aa0d-555fa376c6d5 req-a641cb3f-34ec-499f-b9ab-3fb8cec5bdde 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-7e96739c-7a11-441e-b8cd-282ed2718e81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:46:49 compute-1 ovn_controller[95352]: 2026-01-26T08:46:49Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a2:ff:78 192.168.1.12
Jan 26 08:46:49 compute-1 ovn_controller[95352]: 2026-01-26T08:46:49Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a2:ff:78 192.168.1.12
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.257 183087 DEBUG nova.network.neutron [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 DEBUG oslo_concurrency.lockutils [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.639s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Instance failed to spawn: nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Traceback (most recent call last):
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 170, in do_image_deep_inspection
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033]     raise exception.ImageUnacceptable(
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image content does not match disk_format
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033] 
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033] During handling of the above exception, another exception occurred:
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033] 
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Traceback (most recent call last):
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033]     yield resources
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033]     self.driver.spawn(context, instance, image_meta,
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4384, in spawn
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033]     created_instance_dir, created_disks = self._create_image(
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4786, in _create_image
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033]     created_disks = self._create_and_inject_local_root(
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4917, in _create_and_inject_local_root
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033]     self._try_fetch_image_cache(backend, fetch_func, context,
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 10963, in _try_fetch_image_cache
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033]     image.cache(fetch_func=fetch_func,
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 288, in cache
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033]     self.create_image(fetch_func_sync, base, size,
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 663, in create_image
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033]     prepare_template(target=base, *args, **kwargs)
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033]   File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033]     return f(*args, **kwargs)
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 285, in fetch_func_sync
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033]     fetch_func(target=target, *args, **kwargs)
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/utils.py", line 482, in fetch_image
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033]     images.fetch_to_raw(context, image_id, target, trusted_certs)
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 199, in fetch_to_raw
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033]     force_format = do_image_deep_inspection(img, image_href, path_tmp)
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 180, in do_image_deep_inspection
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033]     raise exception.ImageUnacceptable(
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.292 183087 ERROR nova.compute.manager [instance: 25dc96cb-bdec-4220-933c-db906d35e033] 
Jan 26 08:46:49 compute-1 nova_compute[183083]: 2026-01-26 08:46:49.566 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:49 compute-1 ovn_controller[95352]: 2026-01-26T08:46:49Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:09:0a:fa 192.168.0.164
Jan 26 08:46:49 compute-1 ovn_controller[95352]: 2026-01-26T08:46:49Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:09:0a:fa 192.168.0.164
Jan 26 08:46:49 compute-1 sshd-session[213977]: Accepted publickey for zuul from 38.102.83.66 port 33944 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 08:46:49 compute-1 systemd-logind[788]: New session 34 of user zuul.
Jan 26 08:46:49 compute-1 systemd[1]: Started Session 34 of User zuul.
Jan 26 08:46:49 compute-1 sshd-session[213977]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 08:46:50 compute-1 sshd-session[213980]: Connection closed by 38.102.83.66 port 33944
Jan 26 08:46:50 compute-1 sshd-session[213977]: pam_unix(sshd:session): session closed for user zuul
Jan 26 08:46:50 compute-1 systemd[1]: session-34.scope: Deactivated successfully.
Jan 26 08:46:50 compute-1 systemd-logind[788]: Session 34 logged out. Waiting for processes to exit.
Jan 26 08:46:50 compute-1 systemd-logind[788]: Removed session 34.
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.183 183087 DEBUG nova.network.neutron [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Updating instance_info_cache with network_info: [{"id": "d3bcb06d-6162-48f9-b729-6d1c166209d0", "address": "fa:16:3e:e8:e4:1e", "network": {"id": "d68df36a-1bed-4c39-81f5-d85208215efc", "bridge": "br-int", "label": "tempest-test-network--392260359", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.211", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b71ae2b9d2fd454b8b3b9aa1a0e5c7e4", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3bcb06d-61", "ovs_interfaceid": "d3bcb06d-6162-48f9-b729-6d1c166209d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.210 183087 DEBUG oslo_concurrency.lockutils [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Releasing lock "refresh_cache-25dc96cb-bdec-4220-933c-db906d35e033" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.210 183087 DEBUG nova.compute.manager [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Instance network_info: |[{"id": "d3bcb06d-6162-48f9-b729-6d1c166209d0", "address": "fa:16:3e:e8:e4:1e", "network": {"id": "d68df36a-1bed-4c39-81f5-d85208215efc", "bridge": "br-int", "label": "tempest-test-network--392260359", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.211", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b71ae2b9d2fd454b8b3b9aa1a0e5c7e4", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3bcb06d-61", "ovs_interfaceid": "d3bcb06d-6162-48f9-b729-6d1c166209d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.211 183087 DEBUG oslo_concurrency.lockutils [req-91d4bb76-6595-41a2-af94-926833bd1689 req-a04e5e34-6702-4a59-8c30-ecc39a851663 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-25dc96cb-bdec-4220-933c-db906d35e033" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.211 183087 DEBUG nova.network.neutron [req-91d4bb76-6595-41a2-af94-926833bd1689 req-a04e5e34-6702-4a59-8c30-ecc39a851663 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Refreshing network info cache for port d3bcb06d-6162-48f9-b729-6d1c166209d0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.212 183087 INFO nova.compute.manager [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Terminating instance
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.214 183087 DEBUG nova.compute.manager [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.218 183087 DEBUG nova.virt.libvirt.driver [-] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.218 183087 INFO nova.virt.libvirt.driver [-] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Instance destroyed successfully.
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.219 183087 DEBUG nova.virt.libvirt.vif [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:46:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_dscp_marking_east_west-864444150',display_name='tempest-test_dscp_marking_east_west-864444150',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-dscp-marking-east-west-864444150',id=21,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHVjp+2pOh+xYUkttf/EHrrYAH3LBOn+IKLzf3fiQpiaJslqkY+OmJn6bfd2cX/NEPdTL45qAcY0Zt6OwZRQXbHCoOcvnydr7uXjZCoGXOxoNL1bEhwXU4AaOmmyDzyYAA==',key_name='tempest-keypair-test-1026532318',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b71ae2b9d2fd454b8b3b9aa1a0e5c7e4',ramdisk_id='',reservation_id='r-ovqkayyr',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestCommon-374727467',owner_user_name='tempest-QosTestCommon-374727467-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:46:45Z,user_data=None,user_id='a7abeebb4e4d469c91e6cee77f6be1c3',uuid=25dc96cb-bdec-4220-933c-db906d35e033,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d3bcb06d-6162-48f9-b729-6d1c166209d0", "address": "fa:16:3e:e8:e4:1e", "network": {"id": "d68df36a-1bed-4c39-81f5-d85208215efc", "bridge": "br-int", "label": "tempest-test-network--392260359", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.211", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b71ae2b9d2fd454b8b3b9aa1a0e5c7e4", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3bcb06d-61", "ovs_interfaceid": "d3bcb06d-6162-48f9-b729-6d1c166209d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.219 183087 DEBUG nova.network.os_vif_util [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Converting VIF {"id": "d3bcb06d-6162-48f9-b729-6d1c166209d0", "address": "fa:16:3e:e8:e4:1e", "network": {"id": "d68df36a-1bed-4c39-81f5-d85208215efc", "bridge": "br-int", "label": "tempest-test-network--392260359", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.211", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b71ae2b9d2fd454b8b3b9aa1a0e5c7e4", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3bcb06d-61", "ovs_interfaceid": "d3bcb06d-6162-48f9-b729-6d1c166209d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.220 183087 DEBUG nova.network.os_vif_util [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e8:e4:1e,bridge_name='br-int',has_traffic_filtering=True,id=d3bcb06d-6162-48f9-b729-6d1c166209d0,network=Network(d68df36a-1bed-4c39-81f5-d85208215efc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3bcb06d-61') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.221 183087 DEBUG os_vif [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e8:e4:1e,bridge_name='br-int',has_traffic_filtering=True,id=d3bcb06d-6162-48f9-b729-6d1c166209d0,network=Network(d68df36a-1bed-4c39-81f5-d85208215efc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3bcb06d-61') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.223 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.223 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd3bcb06d-61, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.224 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.227 183087 INFO os_vif [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e8:e4:1e,bridge_name='br-int',has_traffic_filtering=True,id=d3bcb06d-6162-48f9-b729-6d1c166209d0,network=Network(d68df36a-1bed-4c39-81f5-d85208215efc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3bcb06d-61')
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.227 183087 INFO nova.virt.libvirt.driver [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Deleting instance files /var/lib/nova/instances/25dc96cb-bdec-4220-933c-db906d35e033_del
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.228 183087 INFO nova.virt.libvirt.driver [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Deletion of /var/lib/nova/instances/25dc96cb-bdec-4220-933c-db906d35e033_del complete
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.291 183087 INFO nova.compute.manager [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Took 0.08 seconds to destroy the instance on the hypervisor.
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.292 183087 DEBUG nova.compute.claims [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Aborting claim: <nova.compute.claims.Claim object at 0x7f6c9849fcd0> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.293 183087 DEBUG oslo_concurrency.lockutils [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.293 183087 DEBUG oslo_concurrency.lockutils [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.440 183087 DEBUG nova.compute.provider_tree [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.454 183087 DEBUG nova.scheduler.client.report [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.476 183087 DEBUG oslo_concurrency.lockutils [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.477 183087 DEBUG nova.compute.utils [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.477 183087 ERROR nova.compute.manager [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Build of instance 25dc96cb-bdec-4220-933c-db906d35e033 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format: nova.exception.BuildAbortException: Build of instance 25dc96cb-bdec-4220-933c-db906d35e033 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.478 183087 DEBUG nova.compute.manager [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.479 183087 DEBUG nova.virt.libvirt.vif [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-26T08:46:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_dscp_marking_east_west-864444150',display_name='tempest-test_dscp_marking_east_west-864444150',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host=None,hostname='tempest-test-dscp-marking-east-west-864444150',id=21,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHVjp+2pOh+xYUkttf/EHrrYAH3LBOn+IKLzf3fiQpiaJslqkY+OmJn6bfd2cX/NEPdTL45qAcY0Zt6OwZRQXbHCoOcvnydr7uXjZCoGXOxoNL1bEhwXU4AaOmmyDzyYAA==',key_name='tempest-keypair-test-1026532318',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node=None,numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b71ae2b9d2fd454b8b3b9aa1a0e5c7e4',ramdisk_id='',reservation_id='r-ovqkayyr',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestCommon-374727467',owner_user_name='tempest-QosTestCommon-374727467-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:46:50Z,user_data=None,user_id='a7abeebb4e4d469c91e6cee77f6be1c3',uuid=25dc96cb-bdec-4220-933c-db906d35e033,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d3bcb06d-6162-48f9-b729-6d1c166209d0", "address": "fa:16:3e:e8:e4:1e", "network": {"id": "d68df36a-1bed-4c39-81f5-d85208215efc", "bridge": "br-int", "label": "tempest-test-network--392260359", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.211", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b71ae2b9d2fd454b8b3b9aa1a0e5c7e4", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3bcb06d-61", "ovs_interfaceid": "d3bcb06d-6162-48f9-b729-6d1c166209d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.479 183087 DEBUG nova.network.os_vif_util [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Converting VIF {"id": "d3bcb06d-6162-48f9-b729-6d1c166209d0", "address": "fa:16:3e:e8:e4:1e", "network": {"id": "d68df36a-1bed-4c39-81f5-d85208215efc", "bridge": "br-int", "label": "tempest-test-network--392260359", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.211", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b71ae2b9d2fd454b8b3b9aa1a0e5c7e4", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3bcb06d-61", "ovs_interfaceid": "d3bcb06d-6162-48f9-b729-6d1c166209d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.480 183087 DEBUG nova.network.os_vif_util [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e8:e4:1e,bridge_name='br-int',has_traffic_filtering=True,id=d3bcb06d-6162-48f9-b729-6d1c166209d0,network=Network(d68df36a-1bed-4c39-81f5-d85208215efc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3bcb06d-61') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.480 183087 DEBUG os_vif [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e8:e4:1e,bridge_name='br-int',has_traffic_filtering=True,id=d3bcb06d-6162-48f9-b729-6d1c166209d0,network=Network(d68df36a-1bed-4c39-81f5-d85208215efc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3bcb06d-61') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.481 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.482 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd3bcb06d-61, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.482 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.485 183087 INFO os_vif [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e8:e4:1e,bridge_name='br-int',has_traffic_filtering=True,id=d3bcb06d-6162-48f9-b729-6d1c166209d0,network=Network(d68df36a-1bed-4c39-81f5-d85208215efc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3bcb06d-61')
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.485 183087 DEBUG nova.compute.manager [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.486 183087 DEBUG nova.compute.manager [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.486 183087 DEBUG nova.network.neutron [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:46:50 compute-1 ovn_controller[95352]: 2026-01-26T08:46:50Z|00106|binding|INFO|Releasing lport 809259ab-8ea4-4909-92b4-4ee536a51482 from this chassis (sb_readonly=0)
Jan 26 08:46:50 compute-1 ovn_controller[95352]: 2026-01-26T08:46:50Z|00107|binding|INFO|Releasing lport bfb9744a-58f0-4145-9e28-9c13225b3407 from this chassis (sb_readonly=0)
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.682 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.932 183087 INFO nova.compute.manager [None req-91224b48-97eb-424a-855e-ff7fbcce908c 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Get console output
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.939 212375 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.951 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 26 08:46:50 compute-1 nova_compute[183083]: 2026-01-26 08:46:50.969 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 26 08:46:52 compute-1 nova_compute[183083]: 2026-01-26 08:46:52.923 183087 DEBUG nova.network.neutron [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Updating instance_info_cache with network_info: [{"id": "6109dbde-e9d0-48fa-ba7a-b9858052f668", "address": "fa:16:3e:77:04:f2", "network": {"id": "1cdf657e-d1f0-4974-87c2-d94f12081012", "bridge": "br-int", "label": "tempest-OvnExtraDhcpOptionsTest-154566894-test_extra_dhcp_opts_ipv4_ipv6_stateful", "subnets": [{"cidr": "2001:5::/64", "dns": [], "gateway": {"address": "2001:5::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:5::112", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateful"}}, {"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.120", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71cced1777f24868932d789154ff04a0", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6109dbde-e9", "ovs_interfaceid": "6109dbde-e9d0-48fa-ba7a-b9858052f668", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:46:52 compute-1 nova_compute[183083]: 2026-01-26 08:46:52.949 183087 DEBUG oslo_concurrency.lockutils [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Releasing lock "refresh_cache-7e96739c-7a11-441e-b8cd-282ed2718e81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:46:52 compute-1 nova_compute[183083]: 2026-01-26 08:46:52.950 183087 DEBUG nova.compute.manager [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Instance network_info: |[{"id": "6109dbde-e9d0-48fa-ba7a-b9858052f668", "address": "fa:16:3e:77:04:f2", "network": {"id": "1cdf657e-d1f0-4974-87c2-d94f12081012", "bridge": "br-int", "label": "tempest-OvnExtraDhcpOptionsTest-154566894-test_extra_dhcp_opts_ipv4_ipv6_stateful", "subnets": [{"cidr": "2001:5::/64", "dns": [], "gateway": {"address": "2001:5::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:5::112", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateful"}}, {"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.120", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71cced1777f24868932d789154ff04a0", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6109dbde-e9", "ovs_interfaceid": "6109dbde-e9d0-48fa-ba7a-b9858052f668", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:46:52 compute-1 nova_compute[183083]: 2026-01-26 08:46:52.950 183087 DEBUG oslo_concurrency.lockutils [req-bc034280-0a4f-456e-aa0d-555fa376c6d5 req-a641cb3f-34ec-499f-b9ab-3fb8cec5bdde 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-7e96739c-7a11-441e-b8cd-282ed2718e81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:46:52 compute-1 nova_compute[183083]: 2026-01-26 08:46:52.951 183087 DEBUG nova.network.neutron [req-bc034280-0a4f-456e-aa0d-555fa376c6d5 req-a641cb3f-34ec-499f-b9ab-3fb8cec5bdde 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Refreshing network info cache for port 6109dbde-e9d0-48fa-ba7a-b9858052f668 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:46:52 compute-1 nova_compute[183083]: 2026-01-26 08:46:52.952 183087 INFO nova.compute.manager [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Terminating instance
Jan 26 08:46:52 compute-1 nova_compute[183083]: 2026-01-26 08:46:52.954 183087 DEBUG nova.compute.manager [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:46:52 compute-1 nova_compute[183083]: 2026-01-26 08:46:52.958 183087 DEBUG nova.virt.libvirt.driver [-] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527
Jan 26 08:46:52 compute-1 nova_compute[183083]: 2026-01-26 08:46:52.959 183087 INFO nova.virt.libvirt.driver [-] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Instance destroyed successfully.
Jan 26 08:46:52 compute-1 nova_compute[183083]: 2026-01-26 08:46:52.960 183087 DEBUG nova.virt.libvirt.vif [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:46:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-OvnExtraDhcpOptionsTest-154566894-test_extra_dhcp_opts_ipv4_ipv6_stateful',display_name='tempest-OvnExtraDhcpOptionsTest-154566894-test_extra_dhcp_opts_ipv4_ipv6_stateful',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-ovnextradhcpoptionstest-154566894-test-extra-dhcp-opts',id=20,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPlGW+ytSlZhYgiyJ2VQXbYJFNaJudegvaO8arsRLbrIovURYus5bHVwpbhRN5uNXuMSQSnvVfbQff2hzPqkCc7D0963u0+MNy7sXLIRMgXOJArypDGQ7Qj3yNlotCHO0A==',key_name='tempest-OvnExtraDhcpOptionsTest-154566894',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71cced1777f24868932d789154ff04a0',ramdisk_id='',reservation_id='r-hou7azyr',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-OvnExtraDhcpOptionsTest-1195747792',owner_user_name='tempest-OvnExtraDhcpOptionsTest-1195747792-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:46:45Z,user_data=None,user_id='41a09f4d7f034b1c85f20c9512d33411',uuid=7e96739c-7a11-441e-b8cd-282ed2718e81,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6109dbde-e9d0-48fa-ba7a-b9858052f668", "address": "fa:16:3e:77:04:f2", "network": {"id": "1cdf657e-d1f0-4974-87c2-d94f12081012", "bridge": "br-int", "label": "tempest-OvnExtraDhcpOptionsTest-154566894-test_extra_dhcp_opts_ipv4_ipv6_stateful", "subnets": [{"cidr": "2001:5::/64", "dns": [], "gateway": {"address": "2001:5::1", "type": 
"gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:5::112", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateful"}}, {"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.120", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71cced1777f24868932d789154ff04a0", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6109dbde-e9", "ovs_interfaceid": "6109dbde-e9d0-48fa-ba7a-b9858052f668", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:46:52 compute-1 nova_compute[183083]: 2026-01-26 08:46:52.960 183087 DEBUG nova.network.os_vif_util [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Converting VIF {"id": "6109dbde-e9d0-48fa-ba7a-b9858052f668", "address": "fa:16:3e:77:04:f2", "network": {"id": "1cdf657e-d1f0-4974-87c2-d94f12081012", "bridge": "br-int", "label": "tempest-OvnExtraDhcpOptionsTest-154566894-test_extra_dhcp_opts_ipv4_ipv6_stateful", "subnets": [{"cidr": "2001:5::/64", "dns": [], "gateway": {"address": "2001:5::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:5::112", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateful"}}, {"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.120", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71cced1777f24868932d789154ff04a0", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6109dbde-e9", "ovs_interfaceid": "6109dbde-e9d0-48fa-ba7a-b9858052f668", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:46:52 compute-1 nova_compute[183083]: 2026-01-26 08:46:52.961 183087 DEBUG nova.network.os_vif_util [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:04:f2,bridge_name='br-int',has_traffic_filtering=True,id=6109dbde-e9d0-48fa-ba7a-b9858052f668,network=Network(1cdf657e-d1f0-4974-87c2-d94f12081012),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6109dbde-e9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:46:52 compute-1 nova_compute[183083]: 2026-01-26 08:46:52.961 183087 DEBUG os_vif [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:04:f2,bridge_name='br-int',has_traffic_filtering=True,id=6109dbde-e9d0-48fa-ba7a-b9858052f668,network=Network(1cdf657e-d1f0-4974-87c2-d94f12081012),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6109dbde-e9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:46:52 compute-1 nova_compute[183083]: 2026-01-26 08:46:52.963 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:52 compute-1 nova_compute[183083]: 2026-01-26 08:46:52.963 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6109dbde-e9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:46:52 compute-1 nova_compute[183083]: 2026-01-26 08:46:52.963 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:46:52 compute-1 nova_compute[183083]: 2026-01-26 08:46:52.965 183087 INFO os_vif [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:04:f2,bridge_name='br-int',has_traffic_filtering=True,id=6109dbde-e9d0-48fa-ba7a-b9858052f668,network=Network(1cdf657e-d1f0-4974-87c2-d94f12081012),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6109dbde-e9')
Jan 26 08:46:52 compute-1 nova_compute[183083]: 2026-01-26 08:46:52.967 183087 INFO nova.virt.libvirt.driver [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Deleting instance files /var/lib/nova/instances/7e96739c-7a11-441e-b8cd-282ed2718e81_del
Jan 26 08:46:52 compute-1 nova_compute[183083]: 2026-01-26 08:46:52.967 183087 INFO nova.virt.libvirt.driver [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Deletion of /var/lib/nova/instances/7e96739c-7a11-441e-b8cd-282ed2718e81_del complete
Jan 26 08:46:53 compute-1 nova_compute[183083]: 2026-01-26 08:46:53.021 183087 INFO nova.compute.manager [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Took 0.07 seconds to destroy the instance on the hypervisor.
Jan 26 08:46:53 compute-1 nova_compute[183083]: 2026-01-26 08:46:53.022 183087 DEBUG nova.compute.claims [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Aborting claim: <nova.compute.claims.Claim object at 0x7f6c985114f0> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85
Jan 26 08:46:53 compute-1 nova_compute[183083]: 2026-01-26 08:46:53.023 183087 DEBUG oslo_concurrency.lockutils [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:53 compute-1 nova_compute[183083]: 2026-01-26 08:46:53.023 183087 DEBUG oslo_concurrency.lockutils [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:53 compute-1 nova_compute[183083]: 2026-01-26 08:46:53.198 183087 DEBUG nova.compute.provider_tree [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:46:53 compute-1 nova_compute[183083]: 2026-01-26 08:46:53.215 183087 DEBUG nova.scheduler.client.report [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:46:53 compute-1 nova_compute[183083]: 2026-01-26 08:46:53.247 183087 DEBUG oslo_concurrency.lockutils [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.224s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:53 compute-1 nova_compute[183083]: 2026-01-26 08:46:53.248 183087 DEBUG nova.compute.utils [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430
Jan 26 08:46:53 compute-1 nova_compute[183083]: 2026-01-26 08:46:53.248 183087 ERROR nova.compute.manager [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Build of instance 7e96739c-7a11-441e-b8cd-282ed2718e81 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format: nova.exception.BuildAbortException: Build of instance 7e96739c-7a11-441e-b8cd-282ed2718e81 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:46:53 compute-1 nova_compute[183083]: 2026-01-26 08:46:53.249 183087 DEBUG nova.compute.manager [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976
Jan 26 08:46:53 compute-1 nova_compute[183083]: 2026-01-26 08:46:53.249 183087 DEBUG nova.virt.libvirt.vif [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-26T08:46:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-OvnExtraDhcpOptionsTest-154566894-test_extra_dhcp_opts_ipv4_ipv6_stateful',display_name='tempest-OvnExtraDhcpOptionsTest-154566894-test_extra_dhcp_opts_ipv4_ipv6_stateful',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host=None,hostname='tempest-ovnextradhcpoptionstest-154566894-test-extra-dhcp-opts',id=20,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPlGW+ytSlZhYgiyJ2VQXbYJFNaJudegvaO8arsRLbrIovURYus5bHVwpbhRN5uNXuMSQSnvVfbQff2hzPqkCc7D0963u0+MNy7sXLIRMgXOJArypDGQ7Qj3yNlotCHO0A==',key_name='tempest-OvnExtraDhcpOptionsTest-154566894',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node=None,numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71cced1777f24868932d789154ff04a0',ramdisk_id='',reservation_id='r-hou7azyr',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-OvnExtraDhcpOptionsTest-1195747792',owner_user_name='tempest-OvnExtraDhcpOptionsTest-1195747792-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:46:52Z,user_data=None,user_id='41a09f4d7f034b1c85f20c9512d33411',uuid=7e96739c-7a11-441e-b8cd-282ed2718e81,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6109dbde-e9d0-48fa-ba7a-b9858052f668", "address": "fa:16:3e:77:04:f2", "network": {"id": "1cdf657e-d1f0-4974-87c2-d94f12081012", "bridge": "br-int", "label": "tempest-OvnExtraDhcpOptionsTest-154566894-test_extra_dhcp_opts_ipv4_ipv6_stateful", "subnets": [{"cidr": "2001:5::/64", "dns": [], "gateway": {"address": "2001:5::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:5::112", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateful"}}, {"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.120", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71cced1777f24868932d789154ff04a0", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6109dbde-e9", "ovs_interfaceid": "6109dbde-e9d0-48fa-ba7a-b9858052f668", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:46:53 compute-1 nova_compute[183083]: 2026-01-26 08:46:53.250 183087 DEBUG nova.network.os_vif_util [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Converting VIF {"id": "6109dbde-e9d0-48fa-ba7a-b9858052f668", "address": "fa:16:3e:77:04:f2", "network": {"id": "1cdf657e-d1f0-4974-87c2-d94f12081012", "bridge": "br-int", "label": "tempest-OvnExtraDhcpOptionsTest-154566894-test_extra_dhcp_opts_ipv4_ipv6_stateful", "subnets": [{"cidr": "2001:5::/64", "dns": [], "gateway": {"address": "2001:5::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:5::112", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateful"}}, {"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.120", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71cced1777f24868932d789154ff04a0", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6109dbde-e9", "ovs_interfaceid": "6109dbde-e9d0-48fa-ba7a-b9858052f668", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:46:53 compute-1 nova_compute[183083]: 2026-01-26 08:46:53.250 183087 DEBUG nova.network.os_vif_util [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:04:f2,bridge_name='br-int',has_traffic_filtering=True,id=6109dbde-e9d0-48fa-ba7a-b9858052f668,network=Network(1cdf657e-d1f0-4974-87c2-d94f12081012),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6109dbde-e9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:46:53 compute-1 nova_compute[183083]: 2026-01-26 08:46:53.250 183087 DEBUG os_vif [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:04:f2,bridge_name='br-int',has_traffic_filtering=True,id=6109dbde-e9d0-48fa-ba7a-b9858052f668,network=Network(1cdf657e-d1f0-4974-87c2-d94f12081012),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6109dbde-e9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:46:53 compute-1 nova_compute[183083]: 2026-01-26 08:46:53.251 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:53 compute-1 nova_compute[183083]: 2026-01-26 08:46:53.252 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6109dbde-e9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:46:53 compute-1 nova_compute[183083]: 2026-01-26 08:46:53.252 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:46:53 compute-1 nova_compute[183083]: 2026-01-26 08:46:53.254 183087 INFO os_vif [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:04:f2,bridge_name='br-int',has_traffic_filtering=True,id=6109dbde-e9d0-48fa-ba7a-b9858052f668,network=Network(1cdf657e-d1f0-4974-87c2-d94f12081012),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6109dbde-e9')
Jan 26 08:46:53 compute-1 nova_compute[183083]: 2026-01-26 08:46:53.254 183087 DEBUG nova.compute.manager [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012
Jan 26 08:46:53 compute-1 nova_compute[183083]: 2026-01-26 08:46:53.254 183087 DEBUG nova.compute.manager [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:46:53 compute-1 nova_compute[183083]: 2026-01-26 08:46:53.255 183087 DEBUG nova.network.neutron [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:46:53 compute-1 nova_compute[183083]: 2026-01-26 08:46:53.756 183087 DEBUG nova.network.neutron [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:46:53 compute-1 nova_compute[183083]: 2026-01-26 08:46:53.774 183087 INFO nova.compute.manager [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Took 3.29 seconds to deallocate network for instance.
Jan 26 08:46:53 compute-1 nova_compute[183083]: 2026-01-26 08:46:53.882 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:53 compute-1 nova_compute[183083]: 2026-01-26 08:46:53.943 183087 INFO nova.scheduler.client.report [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Deleted allocations for instance 25dc96cb-bdec-4220-933c-db906d35e033
Jan 26 08:46:53 compute-1 nova_compute[183083]: 2026-01-26 08:46:53.944 183087 DEBUG oslo_concurrency.lockutils [None req-c0645b68-f9ff-4b38-9131-97f85feabb25 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "25dc96cb-bdec-4220-933c-db906d35e033" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.653s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:54 compute-1 nova_compute[183083]: 2026-01-26 08:46:54.045 183087 DEBUG nova.network.neutron [req-91d4bb76-6595-41a2-af94-926833bd1689 req-a04e5e34-6702-4a59-8c30-ecc39a851663 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Updated VIF entry in instance network info cache for port d3bcb06d-6162-48f9-b729-6d1c166209d0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:46:54 compute-1 nova_compute[183083]: 2026-01-26 08:46:54.046 183087 DEBUG nova.network.neutron [req-91d4bb76-6595-41a2-af94-926833bd1689 req-a04e5e34-6702-4a59-8c30-ecc39a851663 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 25dc96cb-bdec-4220-933c-db906d35e033] Updating instance_info_cache with network_info: [{"id": "d3bcb06d-6162-48f9-b729-6d1c166209d0", "address": "fa:16:3e:e8:e4:1e", "network": {"id": "d68df36a-1bed-4c39-81f5-d85208215efc", "bridge": "br-int", "label": "tempest-test-network--392260359", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.211", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b71ae2b9d2fd454b8b3b9aa1a0e5c7e4", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3bcb06d-61", "ovs_interfaceid": "d3bcb06d-6162-48f9-b729-6d1c166209d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:46:54 compute-1 nova_compute[183083]: 2026-01-26 08:46:54.063 183087 DEBUG oslo_concurrency.lockutils [req-91d4bb76-6595-41a2-af94-926833bd1689 req-a04e5e34-6702-4a59-8c30-ecc39a851663 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-25dc96cb-bdec-4220-933c-db906d35e033" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:46:54 compute-1 nova_compute[183083]: 2026-01-26 08:46:54.592 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:54 compute-1 nova_compute[183083]: 2026-01-26 08:46:54.834 183087 DEBUG nova.network.neutron [req-bc034280-0a4f-456e-aa0d-555fa376c6d5 req-a641cb3f-34ec-499f-b9ab-3fb8cec5bdde 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Updated VIF entry in instance network info cache for port 6109dbde-e9d0-48fa-ba7a-b9858052f668. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:46:54 compute-1 nova_compute[183083]: 2026-01-26 08:46:54.834 183087 DEBUG nova.network.neutron [req-bc034280-0a4f-456e-aa0d-555fa376c6d5 req-a641cb3f-34ec-499f-b9ab-3fb8cec5bdde 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Updating instance_info_cache with network_info: [{"id": "6109dbde-e9d0-48fa-ba7a-b9858052f668", "address": "fa:16:3e:77:04:f2", "network": {"id": "1cdf657e-d1f0-4974-87c2-d94f12081012", "bridge": "br-int", "label": "tempest-OvnExtraDhcpOptionsTest-154566894-test_extra_dhcp_opts_ipv4_ipv6_stateful", "subnets": [{"cidr": "2001:5::/64", "dns": [], "gateway": {"address": "2001:5::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:5::112", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateful"}}, {"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.120", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71cced1777f24868932d789154ff04a0", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6109dbde-e9", "ovs_interfaceid": "6109dbde-e9d0-48fa-ba7a-b9858052f668", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:46:54 compute-1 nova_compute[183083]: 2026-01-26 08:46:54.850 183087 DEBUG oslo_concurrency.lockutils [req-bc034280-0a4f-456e-aa0d-555fa376c6d5 req-a641cb3f-34ec-499f-b9ab-3fb8cec5bdde 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-7e96739c-7a11-441e-b8cd-282ed2718e81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:46:54 compute-1 podman[214004]: 2026-01-26 08:46:54.853955468 +0000 UTC m=+0.107616554 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible)
Jan 26 08:46:54 compute-1 podman[214005]: 2026-01-26 08:46:54.859898817 +0000 UTC m=+0.111231557 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 08:46:56 compute-1 podman[214051]: 2026-01-26 08:46:56.827273719 +0000 UTC m=+0.077332876 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 08:46:56 compute-1 nova_compute[183083]: 2026-01-26 08:46:56.921 183087 DEBUG nova.network.neutron [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:46:56 compute-1 nova_compute[183083]: 2026-01-26 08:46:56.961 183087 INFO nova.compute.manager [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] [instance: 7e96739c-7a11-441e-b8cd-282ed2718e81] Took 3.71 seconds to deallocate network for instance.
Jan 26 08:46:56 compute-1 nova_compute[183083]: 2026-01-26 08:46:56.969 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:46:56 compute-1 nova_compute[183083]: 2026-01-26 08:46:56.970 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:46:57 compute-1 nova_compute[183083]: 2026-01-26 08:46:57.219 183087 INFO nova.scheduler.client.report [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Deleted allocations for instance 7e96739c-7a11-441e-b8cd-282ed2718e81
Jan 26 08:46:57 compute-1 nova_compute[183083]: 2026-01-26 08:46:57.220 183087 DEBUG oslo_concurrency.lockutils [None req-8b08f687-b6df-4786-949e-04b24d61932d 41a09f4d7f034b1c85f20c9512d33411 71cced1777f24868932d789154ff04a0 - - default default] Lock "7e96739c-7a11-441e-b8cd-282ed2718e81" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:57 compute-1 nova_compute[183083]: 2026-01-26 08:46:57.537 183087 DEBUG nova.compute.manager [req-90364428-272c-4bc5-b411-84fd009d1362 req-79f66bcd-4a55-46c2-a9d1-c617ad8206a5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Received event network-changed-dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:46:57 compute-1 nova_compute[183083]: 2026-01-26 08:46:57.537 183087 DEBUG nova.compute.manager [req-90364428-272c-4bc5-b411-84fd009d1362 req-79f66bcd-4a55-46c2-a9d1-c617ad8206a5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Refreshing instance network info cache due to event network-changed-dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:46:57 compute-1 nova_compute[183083]: 2026-01-26 08:46:57.538 183087 DEBUG oslo_concurrency.lockutils [req-90364428-272c-4bc5-b411-84fd009d1362 req-79f66bcd-4a55-46c2-a9d1-c617ad8206a5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-f2de61ba-4b20-445c-bdbb-44bb79fb58c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:46:57 compute-1 nova_compute[183083]: 2026-01-26 08:46:57.538 183087 DEBUG oslo_concurrency.lockutils [req-90364428-272c-4bc5-b411-84fd009d1362 req-79f66bcd-4a55-46c2-a9d1-c617ad8206a5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-f2de61ba-4b20-445c-bdbb-44bb79fb58c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:46:57 compute-1 nova_compute[183083]: 2026-01-26 08:46:57.538 183087 DEBUG nova.network.neutron [req-90364428-272c-4bc5-b411-84fd009d1362 req-79f66bcd-4a55-46c2-a9d1-c617ad8206a5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Refreshing network info cache for port dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:46:57 compute-1 ovn_controller[95352]: 2026-01-26T08:46:57Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a2:ff:78 192.168.1.12
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.333 183087 DEBUG oslo_concurrency.lockutils [None req-6a471a31-778a-4970-9235-83955f6e85e9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.334 183087 DEBUG oslo_concurrency.lockutils [None req-6a471a31-778a-4970-9235-83955f6e85e9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.334 183087 DEBUG oslo_concurrency.lockutils [None req-6a471a31-778a-4970-9235-83955f6e85e9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.335 183087 DEBUG oslo_concurrency.lockutils [None req-6a471a31-778a-4970-9235-83955f6e85e9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.335 183087 DEBUG oslo_concurrency.lockutils [None req-6a471a31-778a-4970-9235-83955f6e85e9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.336 183087 INFO nova.compute.manager [None req-6a471a31-778a-4970-9235-83955f6e85e9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Terminating instance
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.337 183087 DEBUG nova.compute.manager [None req-6a471a31-778a-4970-9235-83955f6e85e9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:46:58 compute-1 kernel: tapdd638f7c-8f (unregistering): left promiscuous mode
Jan 26 08:46:58 compute-1 NetworkManager[55451]: <info>  [1769417218.3728] device (tapdd638f7c-8f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 08:46:58 compute-1 ovn_controller[95352]: 2026-01-26T08:46:58Z|00108|binding|INFO|Releasing lport dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd from this chassis (sb_readonly=0)
Jan 26 08:46:58 compute-1 ovn_controller[95352]: 2026-01-26T08:46:58Z|00109|binding|INFO|Setting lport dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd down in Southbound
Jan 26 08:46:58 compute-1 ovn_controller[95352]: 2026-01-26T08:46:58Z|00110|binding|INFO|Removing iface tapdd638f7c-8f ovn-installed in OVS
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.434 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:58.440 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:09:0a:fa 192.168.0.164', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.164/24', 'neutron:device_id': 'f2de61ba-4b20-445c-bdbb-44bb79fb58c3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bad39ade-29c7-41d5-89dd-fc1845e5f3f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4a559c36b13649d98b2995c099340eb9', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.175'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7264cb73-6ef7-4995-bc02-8c0dee738bd8, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:46:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:58.441 104632 INFO neutron.agent.ovn.metadata.agent [-] Port dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd in datapath bad39ade-29c7-41d5-89dd-fc1845e5f3f2 unbound from our chassis
Jan 26 08:46:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:58.444 104632 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bad39ade-29c7-41d5-89dd-fc1845e5f3f2
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.453 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:58 compute-1 kernel: tapa08840be-e8 (unregistering): left promiscuous mode
Jan 26 08:46:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:58.460 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[17bc17b5-d3bb-4b2e-94e5-08233eaebb01]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:46:58 compute-1 NetworkManager[55451]: <info>  [1769417218.4635] device (tapa08840be-e8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 08:46:58 compute-1 ovn_controller[95352]: 2026-01-26T08:46:58Z|00111|binding|INFO|Releasing lport a08840be-e8e9-48ba-9780-050a896d2732 from this chassis (sb_readonly=0)
Jan 26 08:46:58 compute-1 ovn_controller[95352]: 2026-01-26T08:46:58Z|00112|binding|INFO|Setting lport a08840be-e8e9-48ba-9780-050a896d2732 down in Southbound
Jan 26 08:46:58 compute-1 ovn_controller[95352]: 2026-01-26T08:46:58Z|00113|binding|INFO|Removing iface tapa08840be-e8 ovn-installed in OVS
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.473 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.474 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:58.482 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a2:ff:78 192.168.1.12', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.1.12/24', 'neutron:device_id': 'f2de61ba-4b20-445c-bdbb-44bb79fb58c3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-410ad2c8-60c1-40d5-855c-7deeb749f0fe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4a559c36b13649d98b2995c099340eb9', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db99b112-04a6-4be6-8e9e-7db1f7ce0209, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=a08840be-e8e9-48ba-9780-050a896d2732) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.488 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:58.496 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[f3dcb55f-faa9-4a3a-8bd4-c277e99c64b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:46:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:58.498 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[d2f6aba9-c21b-408c-a720-b0bee93697bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:46:58 compute-1 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000012.scope: Deactivated successfully.
Jan 26 08:46:58 compute-1 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000012.scope: Consumed 12.903s CPU time.
Jan 26 08:46:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:58.516 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[332341c4-a250-4416-9d1b-764bbdf776a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:46:58 compute-1 systemd-machined[154360]: Machine qemu-5-instance-00000012 terminated.
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.527 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:46:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:58.530 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[be29af5d-c8b9-48b6-a133-342bc02351c1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbad39ade-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:84:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 22, 'tx_packets': 7, 'rx_bytes': 1372, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 22, 'tx_packets': 7, 'rx_bytes': 1372, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347684, 'reachable_time': 21006, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 12, 'inoctets': 784, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 12, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 784, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 12, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214088, 'error': None, 'target': 'ovnmeta-bad39ade-29c7-41d5-89dd-fc1845e5f3f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:46:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:58.541 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[b864f9d7-ff08-4b2d-b246-96d01ca8e8ce]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapbad39ade-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 347698, 'tstamp': 347698}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214089, 'error': None, 'target': 'ovnmeta-bad39ade-29c7-41d5-89dd-fc1845e5f3f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '192.168.0.2'], ['IFA_LOCAL', '192.168.0.2'], ['IFA_BROADCAST', '192.168.0.255'], ['IFA_LABEL', 'tapbad39ade-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 347701, 'tstamp': 347701}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214089, 'error': None, 'target': 'ovnmeta-bad39ade-29c7-41d5-89dd-fc1845e5f3f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:46:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:58.542 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbad39ade-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.544 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.550 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:58.550 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbad39ade-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:46:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:58.551 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:46:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:58.551 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbad39ade-20, col_values=(('external_ids', {'iface-id': '809259ab-8ea4-4909-92b4-4ee536a51482'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:46:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:58.551 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:46:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:58.553 104632 INFO neutron.agent.ovn.metadata.agent [-] Port a08840be-e8e9-48ba-9780-050a896d2732 in datapath 410ad2c8-60c1-40d5-855c-7deeb749f0fe unbound from our chassis
Jan 26 08:46:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:58.555 104632 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 410ad2c8-60c1-40d5-855c-7deeb749f0fe
Jan 26 08:46:58 compute-1 NetworkManager[55451]: <info>  [1769417218.5725] manager: (tapa08840be-e8): new Tun device (/org/freedesktop/NetworkManager/Devices/49)
Jan 26 08:46:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:58.575 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[431a66bc-284d-499d-9d57-68877a0df2cb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:46:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:58.609 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[3c94747c-9404-4b3c-829c-de822e37da80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:46:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:58.613 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[9e529ec5-e515-444d-aa47-0cdb8d8bcaed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.612 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Triggering sync for uuid 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.612 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Triggering sync for uuid f2de61ba-4b20-445c-bdbb-44bb79fb58c3 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.612 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.613 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.613 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.613 183087 INFO nova.virt.libvirt.driver [-] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Instance destroyed successfully.
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.614 183087 DEBUG nova.objects.instance [None req-6a471a31-778a-4970-9235-83955f6e85e9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lazy-loading 'resources' on Instance uuid f2de61ba-4b20-445c-bdbb-44bb79fb58c3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:46:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:58.639 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[fe52e17f-f8c9-45da-8d95-9cda568f9b5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:46:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:58.655 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[10252257-79c9-4257-82b7-83b7d698541c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap410ad2c8-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b0:bf:54'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 27, 'tx_packets': 8, 'rx_bytes': 1654, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 27, 'tx_packets': 8, 'rx_bytes': 1654, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347787, 'reachable_time': 26714, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 14, 'inoctets': 912, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 14, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 912, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 14, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214124, 'error': None, 'target': 'ovnmeta-410ad2c8-60c1-40d5-855c-7deeb749f0fe', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.662 183087 DEBUG nova.virt.libvirt.vif [None req-6a471a31-778a-4970-9235-83955f6e85e9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T08:46:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1494741620',display_name='tempest-server-test-1494741620',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1494741620',id=18,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBONV6jGPWIKrQk26JKZ8H9h2iNqZUCOpmao7Jq+9fEiq3iYmdJdZreCUr9V3PbbF1TPAOou07OOnfHkvbrEzcfNM5ieiMGZPqHbawuPIe3wilad9S814UZ1oxvh/DW+nZg==',key_name='tempest-keypair-test-867467517',keypairs=<?>,launch_index=0,launched_at=2026-01-26T08:46:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4a559c36b13649d98b2995c099340eb9',ramdisk_id='',reservation_id='r-bo90ukz1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',imag
e_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-PortSecurityTest-508365101',owner_user_name='tempest-PortSecurityTest-508365101-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T08:46:37Z,user_data=None,user_id='52d582094c584036ba3e04c9da69ee02',uuid=f2de61ba-4b20-445c-bdbb-44bb79fb58c3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd", "address": "fa:16:3e:09:0a:fa", "network": {"id": "bad39ade-29c7-41d5-89dd-fc1845e5f3f2", "bridge": "br-int", "label": "tempest-test-network--1588377042", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.164", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd638f7c-8f", "ovs_interfaceid": "dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.663 183087 DEBUG nova.network.os_vif_util [None req-6a471a31-778a-4970-9235-83955f6e85e9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converting VIF {"id": "dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd", "address": "fa:16:3e:09:0a:fa", "network": {"id": "bad39ade-29c7-41d5-89dd-fc1845e5f3f2", "bridge": "br-int", "label": "tempest-test-network--1588377042", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.164", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd638f7c-8f", "ovs_interfaceid": "dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.663 183087 DEBUG nova.network.os_vif_util [None req-6a471a31-778a-4970-9235-83955f6e85e9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:09:0a:fa,bridge_name='br-int',has_traffic_filtering=True,id=dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd,network=Network(bad39ade-29c7-41d5-89dd-fc1845e5f3f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdd638f7c-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.664 183087 DEBUG os_vif [None req-6a471a31-778a-4970-9235-83955f6e85e9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:09:0a:fa,bridge_name='br-int',has_traffic_filtering=True,id=dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd,network=Network(bad39ade-29c7-41d5-89dd-fc1845e5f3f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdd638f7c-8f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.665 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.665 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd638f7c-8f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.667 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.054s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.667 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.668 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.670 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.672 183087 INFO os_vif [None req-6a471a31-778a-4970-9235-83955f6e85e9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:09:0a:fa,bridge_name='br-int',has_traffic_filtering=True,id=dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd,network=Network(bad39ade-29c7-41d5-89dd-fc1845e5f3f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapdd638f7c-8f')
Jan 26 08:46:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:58.672 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[8c49a967-1655-44bc-b692-e920885fb821]: (4, ({'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '192.168.1.1'], ['IFA_LOCAL', '192.168.1.1'], ['IFA_BROADCAST', '192.168.1.255'], ['IFA_LABEL', 'tap410ad2c8-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 347803, 'tstamp': 347803}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214125, 'error': None, 'target': 'ovnmeta-410ad2c8-60c1-40d5-855c-7deeb749f0fe', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap410ad2c8-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 347808, 'tstamp': 347808}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214125, 'error': None, 'target': 'ovnmeta-410ad2c8-60c1-40d5-855c-7deeb749f0fe', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.673 183087 DEBUG nova.virt.libvirt.vif [None req-6a471a31-778a-4970-9235-83955f6e85e9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T08:46:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1494741620',display_name='tempest-server-test-1494741620',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1494741620',id=18,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBONV6jGPWIKrQk26JKZ8H9h2iNqZUCOpmao7Jq+9fEiq3iYmdJdZreCUr9V3PbbF1TPAOou07OOnfHkvbrEzcfNM5ieiMGZPqHbawuPIe3wilad9S814UZ1oxvh/DW+nZg==',key_name='tempest-keypair-test-867467517',keypairs=<?>,launch_index=0,launched_at=2026-01-26T08:46:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4a559c36b13649d98b2995c099340eb9',ramdisk_id='',reservation_id='r-bo90ukz1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-PortSecurityTest-508365101',owner_user_name='tempest-PortSecurityTest-508365101-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T08:46:37Z,user_data=None,user_id='52d582094c584036ba3e04c9da69ee02',uuid=f2de61ba-4b20-445c-bdbb-44bb79fb58c3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a08840be-e8e9-48ba-9780-050a896d2732", "address": "fa:16:3e:a2:ff:78", "network": {"id": "410ad2c8-60c1-40d5-855c-7deeb749f0fe", "bridge": "br-int", "label": "tempest-test-network--1714243150", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "192.168.1.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa08840be-e8", "ovs_interfaceid": "a08840be-e8e9-48ba-9780-050a896d2732", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:46:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:58.673 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap410ad2c8-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.674 183087 DEBUG nova.network.os_vif_util [None req-6a471a31-778a-4970-9235-83955f6e85e9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converting VIF {"id": "a08840be-e8e9-48ba-9780-050a896d2732", "address": "fa:16:3e:a2:ff:78", "network": {"id": "410ad2c8-60c1-40d5-855c-7deeb749f0fe", "bridge": "br-int", "label": "tempest-test-network--1714243150", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "192.168.1.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa08840be-e8", "ovs_interfaceid": "a08840be-e8e9-48ba-9780-050a896d2732", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.674 183087 DEBUG nova.network.os_vif_util [None req-6a471a31-778a-4970-9235-83955f6e85e9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a2:ff:78,bridge_name='br-int',has_traffic_filtering=True,id=a08840be-e8e9-48ba-9780-050a896d2732,network=Network(410ad2c8-60c1-40d5-855c-7deeb749f0fe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa08840be-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.675 183087 DEBUG os_vif [None req-6a471a31-778a-4970-9235-83955f6e85e9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:ff:78,bridge_name='br-int',has_traffic_filtering=True,id=a08840be-e8e9-48ba-9780-050a896d2732,network=Network(410ad2c8-60c1-40d5-855c-7deeb749f0fe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa08840be-e8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.675 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.676 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa08840be-e8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.677 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.678 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 08:46:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:58.679 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap410ad2c8-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:46:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:58.679 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:46:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:58.679 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap410ad2c8-60, col_values=(('external_ids', {'iface-id': 'bfb9744a-58f0-4145-9e28-9c13225b3407'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.679 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:46:58.680 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.681 183087 INFO os_vif [None req-6a471a31-778a-4970-9235-83955f6e85e9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:ff:78,bridge_name='br-int',has_traffic_filtering=True,id=a08840be-e8e9-48ba-9780-050a896d2732,network=Network(410ad2c8-60c1-40d5-855c-7deeb749f0fe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa08840be-e8')
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.682 183087 INFO nova.virt.libvirt.driver [None req-6a471a31-778a-4970-9235-83955f6e85e9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Deleting instance files /var/lib/nova/instances/f2de61ba-4b20-445c-bdbb-44bb79fb58c3_del
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.682 183087 INFO nova.virt.libvirt.driver [None req-6a471a31-778a-4970-9235-83955f6e85e9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Deletion of /var/lib/nova/instances/f2de61ba-4b20-445c-bdbb-44bb79fb58c3_del complete
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.752 183087 INFO nova.compute.manager [None req-6a471a31-778a-4970-9235-83955f6e85e9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Took 0.41 seconds to destroy the instance on the hypervisor.
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.753 183087 DEBUG oslo.service.loopingcall [None req-6a471a31-778a-4970-9235-83955f6e85e9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.754 183087 DEBUG nova.compute.manager [-] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.754 183087 DEBUG nova.network.neutron [-] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:46:58 compute-1 nova_compute[183083]: 2026-01-26 08:46:58.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 26 08:46:59 compute-1 nova_compute[183083]: 2026-01-26 08:46:59.218 183087 DEBUG nova.compute.manager [req-15de3165-7154-4199-9d07-7595146cc79f req-c5b8978f-ac23-417e-b9b7-f957824c09d7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Received event network-vif-unplugged-dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:46:59 compute-1 nova_compute[183083]: 2026-01-26 08:46:59.218 183087 DEBUG oslo_concurrency.lockutils [req-15de3165-7154-4199-9d07-7595146cc79f req-c5b8978f-ac23-417e-b9b7-f957824c09d7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:46:59 compute-1 nova_compute[183083]: 2026-01-26 08:46:59.219 183087 DEBUG oslo_concurrency.lockutils [req-15de3165-7154-4199-9d07-7595146cc79f req-c5b8978f-ac23-417e-b9b7-f957824c09d7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:46:59 compute-1 nova_compute[183083]: 2026-01-26 08:46:59.219 183087 DEBUG oslo_concurrency.lockutils [req-15de3165-7154-4199-9d07-7595146cc79f req-c5b8978f-ac23-417e-b9b7-f957824c09d7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:46:59 compute-1 nova_compute[183083]: 2026-01-26 08:46:59.220 183087 DEBUG nova.compute.manager [req-15de3165-7154-4199-9d07-7595146cc79f req-c5b8978f-ac23-417e-b9b7-f957824c09d7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] No waiting events found dispatching network-vif-unplugged-dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:46:59 compute-1 nova_compute[183083]: 2026-01-26 08:46:59.220 183087 DEBUG nova.compute.manager [req-15de3165-7154-4199-9d07-7595146cc79f req-c5b8978f-ac23-417e-b9b7-f957824c09d7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Received event network-vif-unplugged-dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 26 08:46:59 compute-1 nova_compute[183083]: 2026-01-26 08:46:59.627 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:46:59 compute-1 nova_compute[183083]: 2026-01-26 08:46:59.648 183087 DEBUG nova.network.neutron [req-90364428-272c-4bc5-b411-84fd009d1362 req-79f66bcd-4a55-46c2-a9d1-c617ad8206a5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Updated VIF entry in instance network info cache for port dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:46:59 compute-1 nova_compute[183083]: 2026-01-26 08:46:59.649 183087 DEBUG nova.network.neutron [req-90364428-272c-4bc5-b411-84fd009d1362 req-79f66bcd-4a55-46c2-a9d1-c617ad8206a5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Updating instance_info_cache with network_info: [{"id": "dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd", "address": "fa:16:3e:09:0a:fa", "network": {"id": "bad39ade-29c7-41d5-89dd-fc1845e5f3f2", "bridge": "br-int", "label": "tempest-test-network--1588377042", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.164", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd638f7c-8f", "ovs_interfaceid": "dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "a08840be-e8e9-48ba-9780-050a896d2732", "address": "fa:16:3e:a2:ff:78", "network": {"id": "410ad2c8-60c1-40d5-855c-7deeb749f0fe", "bridge": "br-int", "label": "tempest-test-network--1714243150", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "192.168.1.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa08840be-e8", "ovs_interfaceid": "a08840be-e8e9-48ba-9780-050a896d2732", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:46:59 compute-1 nova_compute[183083]: 2026-01-26 08:46:59.683 183087 DEBUG oslo_concurrency.lockutils [req-90364428-272c-4bc5-b411-84fd009d1362 req-79f66bcd-4a55-46c2-a9d1-c617ad8206a5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-f2de61ba-4b20-445c-bdbb-44bb79fb58c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:46:59 compute-1 sshd-session[214128]: Accepted publickey for zuul from 38.102.83.66 port 57300 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 08:46:59 compute-1 systemd-logind[788]: New session 35 of user zuul.
Jan 26 08:46:59 compute-1 systemd[1]: Started Session 35 of User zuul.
Jan 26 08:46:59 compute-1 sshd-session[214128]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 08:46:59 compute-1 sshd-session[214131]: Connection closed by 38.102.83.66 port 57300
Jan 26 08:46:59 compute-1 sshd-session[214128]: pam_unix(sshd:session): session closed for user zuul
Jan 26 08:46:59 compute-1 systemd[1]: session-35.scope: Deactivated successfully.
Jan 26 08:46:59 compute-1 systemd-logind[788]: Session 35 logged out. Waiting for processes to exit.
Jan 26 08:46:59 compute-1 systemd-logind[788]: Removed session 35.
Jan 26 08:46:59 compute-1 nova_compute[183083]: 2026-01-26 08:46:59.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:46:59 compute-1 nova_compute[183083]: 2026-01-26 08:46:59.951 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 08:47:00 compute-1 nova_compute[183083]: 2026-01-26 08:47:00.285 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "refresh_cache-7fb5f66c-db87-49bd-8c08-1c21b7ea58e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:47:00 compute-1 nova_compute[183083]: 2026-01-26 08:47:00.286 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquired lock "refresh_cache-7fb5f66c-db87-49bd-8c08-1c21b7ea58e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:47:00 compute-1 nova_compute[183083]: 2026-01-26 08:47:00.286 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 26 08:47:01 compute-1 ovn_controller[95352]: 2026-01-26T08:47:01Z|00114|binding|INFO|Releasing lport 809259ab-8ea4-4909-92b4-4ee536a51482 from this chassis (sb_readonly=0)
Jan 26 08:47:01 compute-1 ovn_controller[95352]: 2026-01-26T08:47:01Z|00115|binding|INFO|Releasing lport bfb9744a-58f0-4145-9e28-9c13225b3407 from this chassis (sb_readonly=0)
Jan 26 08:47:01 compute-1 nova_compute[183083]: 2026-01-26 08:47:01.232 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:01 compute-1 nova_compute[183083]: 2026-01-26 08:47:01.506 183087 DEBUG nova.compute.manager [req-4f1e19d7-beb5-4654-9af0-f2fe4de9d3c8 req-4b95af5c-934e-4a5a-97c7-637325320241 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Received event network-vif-plugged-dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:47:01 compute-1 nova_compute[183083]: 2026-01-26 08:47:01.507 183087 DEBUG oslo_concurrency.lockutils [req-4f1e19d7-beb5-4654-9af0-f2fe4de9d3c8 req-4b95af5c-934e-4a5a-97c7-637325320241 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:01 compute-1 nova_compute[183083]: 2026-01-26 08:47:01.507 183087 DEBUG oslo_concurrency.lockutils [req-4f1e19d7-beb5-4654-9af0-f2fe4de9d3c8 req-4b95af5c-934e-4a5a-97c7-637325320241 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:01 compute-1 nova_compute[183083]: 2026-01-26 08:47:01.507 183087 DEBUG oslo_concurrency.lockutils [req-4f1e19d7-beb5-4654-9af0-f2fe4de9d3c8 req-4b95af5c-934e-4a5a-97c7-637325320241 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:01 compute-1 nova_compute[183083]: 2026-01-26 08:47:01.507 183087 DEBUG nova.compute.manager [req-4f1e19d7-beb5-4654-9af0-f2fe4de9d3c8 req-4b95af5c-934e-4a5a-97c7-637325320241 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] No waiting events found dispatching network-vif-plugged-dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:47:01 compute-1 nova_compute[183083]: 2026-01-26 08:47:01.507 183087 WARNING nova.compute.manager [req-4f1e19d7-beb5-4654-9af0-f2fe4de9d3c8 req-4b95af5c-934e-4a5a-97c7-637325320241 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Received unexpected event network-vif-plugged-dd638f7c-8f4d-4a2a-b6b8-b9bab731e9fd for instance with vm_state active and task_state deleting.
Jan 26 08:47:01 compute-1 nova_compute[183083]: 2026-01-26 08:47:01.508 183087 DEBUG nova.compute.manager [req-4f1e19d7-beb5-4654-9af0-f2fe4de9d3c8 req-4b95af5c-934e-4a5a-97c7-637325320241 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Received event network-vif-unplugged-a08840be-e8e9-48ba-9780-050a896d2732 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:47:01 compute-1 nova_compute[183083]: 2026-01-26 08:47:01.508 183087 DEBUG oslo_concurrency.lockutils [req-4f1e19d7-beb5-4654-9af0-f2fe4de9d3c8 req-4b95af5c-934e-4a5a-97c7-637325320241 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:01 compute-1 nova_compute[183083]: 2026-01-26 08:47:01.508 183087 DEBUG oslo_concurrency.lockutils [req-4f1e19d7-beb5-4654-9af0-f2fe4de9d3c8 req-4b95af5c-934e-4a5a-97c7-637325320241 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:01 compute-1 nova_compute[183083]: 2026-01-26 08:47:01.508 183087 DEBUG oslo_concurrency.lockutils [req-4f1e19d7-beb5-4654-9af0-f2fe4de9d3c8 req-4b95af5c-934e-4a5a-97c7-637325320241 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:01 compute-1 nova_compute[183083]: 2026-01-26 08:47:01.508 183087 DEBUG nova.compute.manager [req-4f1e19d7-beb5-4654-9af0-f2fe4de9d3c8 req-4b95af5c-934e-4a5a-97c7-637325320241 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] No waiting events found dispatching network-vif-unplugged-a08840be-e8e9-48ba-9780-050a896d2732 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:47:01 compute-1 nova_compute[183083]: 2026-01-26 08:47:01.509 183087 DEBUG nova.compute.manager [req-4f1e19d7-beb5-4654-9af0-f2fe4de9d3c8 req-4b95af5c-934e-4a5a-97c7-637325320241 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Received event network-vif-unplugged-a08840be-e8e9-48ba-9780-050a896d2732 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 26 08:47:01 compute-1 nova_compute[183083]: 2026-01-26 08:47:01.509 183087 DEBUG nova.compute.manager [req-4f1e19d7-beb5-4654-9af0-f2fe4de9d3c8 req-4b95af5c-934e-4a5a-97c7-637325320241 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Received event network-vif-plugged-a08840be-e8e9-48ba-9780-050a896d2732 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:47:01 compute-1 nova_compute[183083]: 2026-01-26 08:47:01.509 183087 DEBUG oslo_concurrency.lockutils [req-4f1e19d7-beb5-4654-9af0-f2fe4de9d3c8 req-4b95af5c-934e-4a5a-97c7-637325320241 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:01 compute-1 nova_compute[183083]: 2026-01-26 08:47:01.509 183087 DEBUG oslo_concurrency.lockutils [req-4f1e19d7-beb5-4654-9af0-f2fe4de9d3c8 req-4b95af5c-934e-4a5a-97c7-637325320241 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:01 compute-1 nova_compute[183083]: 2026-01-26 08:47:01.509 183087 DEBUG oslo_concurrency.lockutils [req-4f1e19d7-beb5-4654-9af0-f2fe4de9d3c8 req-4b95af5c-934e-4a5a-97c7-637325320241 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:01 compute-1 nova_compute[183083]: 2026-01-26 08:47:01.510 183087 DEBUG nova.compute.manager [req-4f1e19d7-beb5-4654-9af0-f2fe4de9d3c8 req-4b95af5c-934e-4a5a-97c7-637325320241 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] No waiting events found dispatching network-vif-plugged-a08840be-e8e9-48ba-9780-050a896d2732 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:47:01 compute-1 nova_compute[183083]: 2026-01-26 08:47:01.510 183087 WARNING nova.compute.manager [req-4f1e19d7-beb5-4654-9af0-f2fe4de9d3c8 req-4b95af5c-934e-4a5a-97c7-637325320241 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Received unexpected event network-vif-plugged-a08840be-e8e9-48ba-9780-050a896d2732 for instance with vm_state active and task_state deleting.
Jan 26 08:47:02 compute-1 nova_compute[183083]: 2026-01-26 08:47:02.804 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:03 compute-1 nova_compute[183083]: 2026-01-26 08:47:03.211 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:03 compute-1 nova_compute[183083]: 2026-01-26 08:47:03.676 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:03 compute-1 ovn_controller[95352]: 2026-01-26T08:47:03Z|00116|pinctrl|WARN|Dropped 13611 log messages in last 60 seconds (most recently, 0 seconds ago) due to excessive rate
Jan 26 08:47:03 compute-1 ovn_controller[95352]: 2026-01-26T08:47:03Z|00117|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 08:47:04 compute-1 nova_compute[183083]: 2026-01-26 08:47:04.448 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Updating instance_info_cache with network_info: [{"id": "30b45c93-b3bb-44e6-8e4a-6903a631c773", "address": "fa:16:3e:dd:ab:11", "network": {"id": "bad39ade-29c7-41d5-89dd-fc1845e5f3f2", "bridge": "br-int", "label": "tempest-test-network--1588377042", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.175", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30b45c93-b3", "ovs_interfaceid": "30b45c93-b3bb-44e6-8e4a-6903a631c773", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "71bea873-e0d4-4d4e-b05e-ff7415434c11", "address": "fa:16:3e:81:c4:17", "network": {"id": "410ad2c8-60c1-40d5-855c-7deeb749f0fe", "bridge": "br-int", "label": "tempest-test-network--1714243150", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "192.168.1.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": 
true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71bea873-e0", "ovs_interfaceid": "71bea873-e0d4-4d4e-b05e-ff7415434c11", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:47:04 compute-1 nova_compute[183083]: 2026-01-26 08:47:04.484 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Releasing lock "refresh_cache-7fb5f66c-db87-49bd-8c08-1c21b7ea58e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:47:04 compute-1 nova_compute[183083]: 2026-01-26 08:47:04.485 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 26 08:47:04 compute-1 nova_compute[183083]: 2026-01-26 08:47:04.485 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:47:04 compute-1 nova_compute[183083]: 2026-01-26 08:47:04.486 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:47:04 compute-1 nova_compute[183083]: 2026-01-26 08:47:04.486 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:47:04 compute-1 nova_compute[183083]: 2026-01-26 08:47:04.487 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 08:47:04 compute-1 nova_compute[183083]: 2026-01-26 08:47:04.487 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:47:04 compute-1 nova_compute[183083]: 2026-01-26 08:47:04.512 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:04 compute-1 nova_compute[183083]: 2026-01-26 08:47:04.513 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:04 compute-1 nova_compute[183083]: 2026-01-26 08:47:04.513 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:04 compute-1 nova_compute[183083]: 2026-01-26 08:47:04.514 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 08:47:04 compute-1 nova_compute[183083]: 2026-01-26 08:47:04.607 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:47:04 compute-1 nova_compute[183083]: 2026-01-26 08:47:04.685 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:04 compute-1 nova_compute[183083]: 2026-01-26 08:47:04.699 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:47:04 compute-1 nova_compute[183083]: 2026-01-26 08:47:04.701 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:47:04 compute-1 nova_compute[183083]: 2026-01-26 08:47:04.786 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7fb5f66c-db87-49bd-8c08-1c21b7ea58e8/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:47:04 compute-1 nova_compute[183083]: 2026-01-26 08:47:04.935 183087 DEBUG nova.network.neutron [-] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:47:04 compute-1 nova_compute[183083]: 2026-01-26 08:47:04.954 183087 INFO nova.compute.manager [-] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Took 6.20 seconds to deallocate network for instance.
Jan 26 08:47:05 compute-1 nova_compute[183083]: 2026-01-26 08:47:05.006 183087 DEBUG oslo_concurrency.lockutils [None req-6a471a31-778a-4970-9235-83955f6e85e9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:05 compute-1 nova_compute[183083]: 2026-01-26 08:47:05.006 183087 DEBUG oslo_concurrency.lockutils [None req-6a471a31-778a-4970-9235-83955f6e85e9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:05 compute-1 nova_compute[183083]: 2026-01-26 08:47:05.025 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:47:05 compute-1 nova_compute[183083]: 2026-01-26 08:47:05.026 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13618MB free_disk=113.07183837890625GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 08:47:05 compute-1 nova_compute[183083]: 2026-01-26 08:47:05.027 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:05 compute-1 nova_compute[183083]: 2026-01-26 08:47:05.275 183087 DEBUG nova.compute.provider_tree [None req-6a471a31-778a-4970-9235-83955f6e85e9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:47:05 compute-1 nova_compute[183083]: 2026-01-26 08:47:05.290 183087 DEBUG nova.scheduler.client.report [None req-6a471a31-778a-4970-9235-83955f6e85e9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:47:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:05.298 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:05.299 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:05.300 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:05 compute-1 nova_compute[183083]: 2026-01-26 08:47:05.330 183087 DEBUG oslo_concurrency.lockutils [None req-6a471a31-778a-4970-9235-83955f6e85e9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.324s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:05 compute-1 nova_compute[183083]: 2026-01-26 08:47:05.334 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.307s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:05 compute-1 nova_compute[183083]: 2026-01-26 08:47:05.362 183087 INFO nova.scheduler.client.report [None req-6a471a31-778a-4970-9235-83955f6e85e9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Deleted allocations for instance f2de61ba-4b20-445c-bdbb-44bb79fb58c3
Jan 26 08:47:05 compute-1 nova_compute[183083]: 2026-01-26 08:47:05.448 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Instance 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 08:47:05 compute-1 nova_compute[183083]: 2026-01-26 08:47:05.448 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 08:47:05 compute-1 nova_compute[183083]: 2026-01-26 08:47:05.449 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=640MB phys_disk=119GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 08:47:05 compute-1 nova_compute[183083]: 2026-01-26 08:47:05.467 183087 DEBUG oslo_concurrency.lockutils [None req-6a471a31-778a-4970-9235-83955f6e85e9 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:05 compute-1 nova_compute[183083]: 2026-01-26 08:47:05.468 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 6.856s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:05 compute-1 nova_compute[183083]: 2026-01-26 08:47:05.469 183087 INFO nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] During sync_power_state the instance has a pending task (deleting). Skip.
Jan 26 08:47:05 compute-1 nova_compute[183083]: 2026-01-26 08:47:05.469 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "f2de61ba-4b20-445c-bdbb-44bb79fb58c3" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:05 compute-1 nova_compute[183083]: 2026-01-26 08:47:05.506 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:47:05 compute-1 nova_compute[183083]: 2026-01-26 08:47:05.524 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:47:05 compute-1 nova_compute[183083]: 2026-01-26 08:47:05.549 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 08:47:05 compute-1 nova_compute[183083]: 2026-01-26 08:47:05.550 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.215s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:05 compute-1 nova_compute[183083]: 2026-01-26 08:47:05.550 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:47:05 compute-1 podman[214163]: 2026-01-26 08:47:05.833353432 +0000 UTC m=+0.084549060 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 26 08:47:06 compute-1 nova_compute[183083]: 2026-01-26 08:47:06.558 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:47:06 compute-1 nova_compute[183083]: 2026-01-26 08:47:06.558 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:47:06 compute-1 nova_compute[183083]: 2026-01-26 08:47:06.623 183087 DEBUG oslo_concurrency.lockutils [None req-1203d985-68e2-4a08-b67e-6988cd130708 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:06 compute-1 nova_compute[183083]: 2026-01-26 08:47:06.623 183087 DEBUG oslo_concurrency.lockutils [None req-1203d985-68e2-4a08-b67e-6988cd130708 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:06 compute-1 nova_compute[183083]: 2026-01-26 08:47:06.623 183087 DEBUG oslo_concurrency.lockutils [None req-1203d985-68e2-4a08-b67e-6988cd130708 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:06 compute-1 nova_compute[183083]: 2026-01-26 08:47:06.624 183087 DEBUG oslo_concurrency.lockutils [None req-1203d985-68e2-4a08-b67e-6988cd130708 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:06 compute-1 nova_compute[183083]: 2026-01-26 08:47:06.624 183087 DEBUG oslo_concurrency.lockutils [None req-1203d985-68e2-4a08-b67e-6988cd130708 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:06 compute-1 nova_compute[183083]: 2026-01-26 08:47:06.625 183087 INFO nova.compute.manager [None req-1203d985-68e2-4a08-b67e-6988cd130708 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Terminating instance
Jan 26 08:47:06 compute-1 nova_compute[183083]: 2026-01-26 08:47:06.626 183087 DEBUG nova.compute.manager [None req-1203d985-68e2-4a08-b67e-6988cd130708 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:47:06 compute-1 kernel: tap30b45c93-b3 (unregistering): left promiscuous mode
Jan 26 08:47:06 compute-1 NetworkManager[55451]: <info>  [1769417226.6576] device (tap30b45c93-b3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 08:47:06 compute-1 nova_compute[183083]: 2026-01-26 08:47:06.661 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:06 compute-1 ovn_controller[95352]: 2026-01-26T08:47:06Z|00118|binding|INFO|Releasing lport 30b45c93-b3bb-44e6-8e4a-6903a631c773 from this chassis (sb_readonly=0)
Jan 26 08:47:06 compute-1 ovn_controller[95352]: 2026-01-26T08:47:06Z|00119|binding|INFO|Setting lport 30b45c93-b3bb-44e6-8e4a-6903a631c773 down in Southbound
Jan 26 08:47:06 compute-1 ovn_controller[95352]: 2026-01-26T08:47:06Z|00120|binding|INFO|Removing iface tap30b45c93-b3 ovn-installed in OVS
Jan 26 08:47:06 compute-1 nova_compute[183083]: 2026-01-26 08:47:06.666 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:06 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:06.674 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:ab:11 192.168.0.175', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.175/24', 'neutron:device_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bad39ade-29c7-41d5-89dd-fc1845e5f3f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4a559c36b13649d98b2995c099340eb9', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.249'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7264cb73-6ef7-4995-bc02-8c0dee738bd8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=30b45c93-b3bb-44e6-8e4a-6903a631c773) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:47:06 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:06.676 104632 INFO neutron.agent.ovn.metadata.agent [-] Port 30b45c93-b3bb-44e6-8e4a-6903a631c773 in datapath bad39ade-29c7-41d5-89dd-fc1845e5f3f2 unbound from our chassis
Jan 26 08:47:06 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:06.680 104632 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bad39ade-29c7-41d5-89dd-fc1845e5f3f2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 26 08:47:06 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:06.681 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[9dcc3431-8197-47a4-a4de-e32cb2a92dab]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:06 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:06.682 104632 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-bad39ade-29c7-41d5-89dd-fc1845e5f3f2 namespace which is not needed anymore
Jan 26 08:47:06 compute-1 nova_compute[183083]: 2026-01-26 08:47:06.692 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:06 compute-1 kernel: tap71bea873-e0 (unregistering): left promiscuous mode
Jan 26 08:47:06 compute-1 NetworkManager[55451]: <info>  [1769417226.7117] device (tap71bea873-e0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 08:47:06 compute-1 ovn_controller[95352]: 2026-01-26T08:47:06Z|00121|binding|INFO|Releasing lport 71bea873-e0d4-4d4e-b05e-ff7415434c11 from this chassis (sb_readonly=0)
Jan 26 08:47:06 compute-1 ovn_controller[95352]: 2026-01-26T08:47:06Z|00122|binding|INFO|Setting lport 71bea873-e0d4-4d4e-b05e-ff7415434c11 down in Southbound
Jan 26 08:47:06 compute-1 ovn_controller[95352]: 2026-01-26T08:47:06Z|00123|binding|INFO|Removing iface tap71bea873-e0 ovn-installed in OVS
Jan 26 08:47:06 compute-1 nova_compute[183083]: 2026-01-26 08:47:06.719 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:06 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:06.730 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:81:c4:17 192.168.1.11', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.1.11/24', 'neutron:device_id': '7fb5f66c-db87-49bd-8c08-1c21b7ea58e8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-410ad2c8-60c1-40d5-855c-7deeb749f0fe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4a559c36b13649d98b2995c099340eb9', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db99b112-04a6-4be6-8e9e-7db1f7ce0209, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=71bea873-e0d4-4d4e-b05e-ff7415434c11) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:47:06 compute-1 nova_compute[183083]: 2026-01-26 08:47:06.722 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:06 compute-1 nova_compute[183083]: 2026-01-26 08:47:06.741 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:06 compute-1 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Jan 26 08:47:06 compute-1 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d0000000b.scope: Consumed 17.388s CPU time.
Jan 26 08:47:06 compute-1 systemd-machined[154360]: Machine qemu-3-instance-0000000b terminated.
Jan 26 08:47:06 compute-1 neutron-haproxy-ovnmeta-bad39ade-29c7-41d5-89dd-fc1845e5f3f2[213151]: [NOTICE]   (213155) : haproxy version is 2.8.14-c23fe91
Jan 26 08:47:06 compute-1 neutron-haproxy-ovnmeta-bad39ade-29c7-41d5-89dd-fc1845e5f3f2[213151]: [NOTICE]   (213155) : path to executable is /usr/sbin/haproxy
Jan 26 08:47:06 compute-1 neutron-haproxy-ovnmeta-bad39ade-29c7-41d5-89dd-fc1845e5f3f2[213151]: [WARNING]  (213155) : Exiting Master process...
Jan 26 08:47:06 compute-1 neutron-haproxy-ovnmeta-bad39ade-29c7-41d5-89dd-fc1845e5f3f2[213151]: [ALERT]    (213155) : Current worker (213157) exited with code 143 (Terminated)
Jan 26 08:47:06 compute-1 neutron-haproxy-ovnmeta-bad39ade-29c7-41d5-89dd-fc1845e5f3f2[213151]: [WARNING]  (213155) : All workers exited. Exiting... (0)
Jan 26 08:47:06 compute-1 systemd[1]: libpod-43d5fb1d3c43f170d0f894c9c62be394998e190bd972dedaad238e2ede3e095c.scope: Deactivated successfully.
Jan 26 08:47:06 compute-1 podman[214218]: 2026-01-26 08:47:06.825366679 +0000 UTC m=+0.049868176 container died 43d5fb1d3c43f170d0f894c9c62be394998e190bd972dedaad238e2ede3e095c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bad39ade-29c7-41d5-89dd-fc1845e5f3f2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 08:47:06 compute-1 NetworkManager[55451]: <info>  [1769417226.8448] manager: (tap30b45c93-b3): new Tun device (/org/freedesktop/NetworkManager/Devices/50)
Jan 26 08:47:06 compute-1 NetworkManager[55451]: <info>  [1769417226.8601] manager: (tap71bea873-e0): new Tun device (/org/freedesktop/NetworkManager/Devices/51)
Jan 26 08:47:06 compute-1 nova_compute[183083]: 2026-01-26 08:47:06.898 183087 INFO nova.virt.libvirt.driver [-] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Instance destroyed successfully.
Jan 26 08:47:06 compute-1 nova_compute[183083]: 2026-01-26 08:47:06.899 183087 DEBUG nova.objects.instance [None req-1203d985-68e2-4a08-b67e-6988cd130708 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lazy-loading 'resources' on Instance uuid 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:47:06 compute-1 nova_compute[183083]: 2026-01-26 08:47:06.931 183087 DEBUG nova.virt.libvirt.vif [None req-1203d985-68e2-4a08-b67e-6988cd130708 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T08:45:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1154658508',display_name='tempest-server-test-1154658508',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1154658508',id=11,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBONV6jGPWIKrQk26JKZ8H9h2iNqZUCOpmao7Jq+9fEiq3iYmdJdZreCUr9V3PbbF1TPAOou07OOnfHkvbrEzcfNM5ieiMGZPqHbawuPIe3wilad9S814UZ1oxvh/DW+nZg==',key_name='tempest-keypair-test-867467517',keypairs=<?>,launch_index=0,launched_at=2026-01-26T08:45:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4a559c36b13649d98b2995c099340eb9',ramdisk_id='',reservation_id='r-lygcy6jy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-PortSecurityTest-508365101',owner_user_name='tempest-PortSecurityTest-508365101-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T08:45:50Z,user_data=None,user_id='52d582094c584036ba3e04c9da69ee02',uuid=7fb5f66c-db87-49bd-8c08-1c21b7ea58e8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "30b45c93-b3bb-44e6-8e4a-6903a631c773", "address": "fa:16:3e:dd:ab:11", "network": {"id": "bad39ade-29c7-41d5-89dd-fc1845e5f3f2", "bridge": "br-int", "label": "tempest-test-network--1588377042", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.175", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30b45c93-b3", "ovs_interfaceid": "30b45c93-b3bb-44e6-8e4a-6903a631c773", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:47:06 compute-1 nova_compute[183083]: 2026-01-26 08:47:06.932 183087 DEBUG nova.network.os_vif_util [None req-1203d985-68e2-4a08-b67e-6988cd130708 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converting VIF {"id": "30b45c93-b3bb-44e6-8e4a-6903a631c773", "address": "fa:16:3e:dd:ab:11", "network": {"id": "bad39ade-29c7-41d5-89dd-fc1845e5f3f2", "bridge": "br-int", "label": "tempest-test-network--1588377042", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.175", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30b45c93-b3", "ovs_interfaceid": "30b45c93-b3bb-44e6-8e4a-6903a631c773", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:47:06 compute-1 nova_compute[183083]: 2026-01-26 08:47:06.933 183087 DEBUG nova.network.os_vif_util [None req-1203d985-68e2-4a08-b67e-6988cd130708 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:dd:ab:11,bridge_name='br-int',has_traffic_filtering=True,id=30b45c93-b3bb-44e6-8e4a-6903a631c773,network=Network(bad39ade-29c7-41d5-89dd-fc1845e5f3f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap30b45c93-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:47:06 compute-1 nova_compute[183083]: 2026-01-26 08:47:06.934 183087 DEBUG os_vif [None req-1203d985-68e2-4a08-b67e-6988cd130708 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:ab:11,bridge_name='br-int',has_traffic_filtering=True,id=30b45c93-b3bb-44e6-8e4a-6903a631c773,network=Network(bad39ade-29c7-41d5-89dd-fc1845e5f3f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap30b45c93-b3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:47:06 compute-1 nova_compute[183083]: 2026-01-26 08:47:06.935 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:06 compute-1 nova_compute[183083]: 2026-01-26 08:47:06.935 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap30b45c93-b3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:47:06 compute-1 nova_compute[183083]: 2026-01-26 08:47:06.937 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:06 compute-1 nova_compute[183083]: 2026-01-26 08:47:06.939 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 08:47:06 compute-1 nova_compute[183083]: 2026-01-26 08:47:06.942 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:06 compute-1 nova_compute[183083]: 2026-01-26 08:47:06.945 183087 INFO os_vif [None req-1203d985-68e2-4a08-b67e-6988cd130708 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:ab:11,bridge_name='br-int',has_traffic_filtering=True,id=30b45c93-b3bb-44e6-8e4a-6903a631c773,network=Network(bad39ade-29c7-41d5-89dd-fc1845e5f3f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap30b45c93-b3')
Jan 26 08:47:06 compute-1 nova_compute[183083]: 2026-01-26 08:47:06.946 183087 DEBUG nova.virt.libvirt.vif [None req-1203d985-68e2-4a08-b67e-6988cd130708 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T08:45:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1154658508',display_name='tempest-server-test-1154658508',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1154658508',id=11,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBONV6jGPWIKrQk26JKZ8H9h2iNqZUCOpmao7Jq+9fEiq3iYmdJdZreCUr9V3PbbF1TPAOou07OOnfHkvbrEzcfNM5ieiMGZPqHbawuPIe3wilad9S814UZ1oxvh/DW+nZg==',key_name='tempest-keypair-test-867467517',keypairs=<?>,launch_index=0,launched_at=2026-01-26T08:45:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4a559c36b13649d98b2995c099340eb9',ramdisk_id='',reservation_id='r-lygcy6jy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-PortSecurityTest-508365101',owner_user_name='tempest-PortSecurityTest-508365101-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T08:45:50Z,user_data=None,user_id='52d582094c584036ba3e04c9da69ee02',uuid=7fb5f66c-db87-49bd-8c08-1c21b7ea58e8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "71bea873-e0d4-4d4e-b05e-ff7415434c11", "address": "fa:16:3e:81:c4:17", "network": {"id": "410ad2c8-60c1-40d5-855c-7deeb749f0fe", "bridge": "br-int", "label": "tempest-test-network--1714243150", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "192.168.1.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71bea873-e0", "ovs_interfaceid": "71bea873-e0d4-4d4e-b05e-ff7415434c11", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:47:06 compute-1 nova_compute[183083]: 2026-01-26 08:47:06.947 183087 DEBUG nova.network.os_vif_util [None req-1203d985-68e2-4a08-b67e-6988cd130708 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converting VIF {"id": "71bea873-e0d4-4d4e-b05e-ff7415434c11", "address": "fa:16:3e:81:c4:17", "network": {"id": "410ad2c8-60c1-40d5-855c-7deeb749f0fe", "bridge": "br-int", "label": "tempest-test-network--1714243150", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "192.168.1.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71bea873-e0", "ovs_interfaceid": "71bea873-e0d4-4d4e-b05e-ff7415434c11", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:47:06 compute-1 nova_compute[183083]: 2026-01-26 08:47:06.947 183087 DEBUG nova.network.os_vif_util [None req-1203d985-68e2-4a08-b67e-6988cd130708 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:81:c4:17,bridge_name='br-int',has_traffic_filtering=True,id=71bea873-e0d4-4d4e-b05e-ff7415434c11,network=Network(410ad2c8-60c1-40d5-855c-7deeb749f0fe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap71bea873-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:47:06 compute-1 nova_compute[183083]: 2026-01-26 08:47:06.948 183087 DEBUG os_vif [None req-1203d985-68e2-4a08-b67e-6988cd130708 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:81:c4:17,bridge_name='br-int',has_traffic_filtering=True,id=71bea873-e0d4-4d4e-b05e-ff7415434c11,network=Network(410ad2c8-60c1-40d5-855c-7deeb749f0fe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap71bea873-e0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:47:06 compute-1 nova_compute[183083]: 2026-01-26 08:47:06.950 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:06 compute-1 nova_compute[183083]: 2026-01-26 08:47:06.950 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap71bea873-e0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:47:07 compute-1 nova_compute[183083]: 2026-01-26 08:47:07.013 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:07 compute-1 nova_compute[183083]: 2026-01-26 08:47:07.015 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:07 compute-1 nova_compute[183083]: 2026-01-26 08:47:07.017 183087 INFO os_vif [None req-1203d985-68e2-4a08-b67e-6988cd130708 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:81:c4:17,bridge_name='br-int',has_traffic_filtering=True,id=71bea873-e0d4-4d4e-b05e-ff7415434c11,network=Network(410ad2c8-60c1-40d5-855c-7deeb749f0fe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap71bea873-e0')
Jan 26 08:47:07 compute-1 nova_compute[183083]: 2026-01-26 08:47:07.018 183087 INFO nova.virt.libvirt.driver [None req-1203d985-68e2-4a08-b67e-6988cd130708 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Deleting instance files /var/lib/nova/instances/7fb5f66c-db87-49bd-8c08-1c21b7ea58e8_del
Jan 26 08:47:07 compute-1 nova_compute[183083]: 2026-01-26 08:47:07.019 183087 INFO nova.virt.libvirt.driver [None req-1203d985-68e2-4a08-b67e-6988cd130708 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Deletion of /var/lib/nova/instances/7fb5f66c-db87-49bd-8c08-1c21b7ea58e8_del complete
Jan 26 08:47:07 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-43d5fb1d3c43f170d0f894c9c62be394998e190bd972dedaad238e2ede3e095c-userdata-shm.mount: Deactivated successfully.
Jan 26 08:47:07 compute-1 systemd[1]: var-lib-containers-storage-overlay-7ff6c543088ae657c42fba9e9a5abc3bede918a7d8f9abb8b05ec5694f65febd-merged.mount: Deactivated successfully.
Jan 26 08:47:07 compute-1 podman[214218]: 2026-01-26 08:47:07.229009672 +0000 UTC m=+0.453511199 container cleanup 43d5fb1d3c43f170d0f894c9c62be394998e190bd972dedaad238e2ede3e095c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bad39ade-29c7-41d5-89dd-fc1845e5f3f2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 08:47:07 compute-1 podman[214275]: 2026-01-26 08:47:07.315159036 +0000 UTC m=+0.058598513 container remove 43d5fb1d3c43f170d0f894c9c62be394998e190bd972dedaad238e2ede3e095c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bad39ade-29c7-41d5-89dd-fc1845e5f3f2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 26 08:47:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:07.321 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[12389413-68c8-44a6-9d34-916bf1ada3b0]: (4, ('Mon Jan 26 08:47:06 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-bad39ade-29c7-41d5-89dd-fc1845e5f3f2 (43d5fb1d3c43f170d0f894c9c62be394998e190bd972dedaad238e2ede3e095c)\n43d5fb1d3c43f170d0f894c9c62be394998e190bd972dedaad238e2ede3e095c\nMon Jan 26 08:47:07 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-bad39ade-29c7-41d5-89dd-fc1845e5f3f2 (43d5fb1d3c43f170d0f894c9c62be394998e190bd972dedaad238e2ede3e095c)\n43d5fb1d3c43f170d0f894c9c62be394998e190bd972dedaad238e2ede3e095c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:07.324 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[8f1b9859-d788-4691-946b-44405f452045]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:07.325 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbad39ade-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:47:07 compute-1 nova_compute[183083]: 2026-01-26 08:47:07.327 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:07 compute-1 kernel: tapbad39ade-20: left promiscuous mode
Jan 26 08:47:07 compute-1 nova_compute[183083]: 2026-01-26 08:47:07.352 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:07.356 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[5501db29-f6d5-4cb6-b32a-bccabd966251]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:07.373 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[4c87c079-b64b-4047-b35b-94a2e8a63174]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:07.374 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[8f373de6-15f9-420f-983b-ceb8a9bdeddd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:07.399 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[39764c5e-b3d8-4b9e-b25f-09f9763ae6e5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347675, 'reachable_time': 41625, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214290, 'error': None, 'target': 'ovnmeta-bad39ade-29c7-41d5-89dd-fc1845e5f3f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:07.403 105024 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-bad39ade-29c7-41d5-89dd-fc1845e5f3f2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 26 08:47:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:07.403 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[e36ad17d-0412-4376-9747-50ac2267c886]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:07.405 104632 INFO neutron.agent.ovn.metadata.agent [-] Port 71bea873-e0d4-4d4e-b05e-ff7415434c11 in datapath 410ad2c8-60c1-40d5-855c-7deeb749f0fe unbound from our chassis
Jan 26 08:47:07 compute-1 systemd[1]: run-netns-ovnmeta\x2dbad39ade\x2d29c7\x2d41d5\x2d89dd\x2dfc1845e5f3f2.mount: Deactivated successfully.
Jan 26 08:47:07 compute-1 systemd[1]: libpod-conmon-43d5fb1d3c43f170d0f894c9c62be394998e190bd972dedaad238e2ede3e095c.scope: Deactivated successfully.
Jan 26 08:47:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:07.411 104632 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 410ad2c8-60c1-40d5-855c-7deeb749f0fe, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 26 08:47:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:07.412 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[9144bfd2-9311-4e4f-9bff-4f5307f8bb8f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:07.413 104632 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-410ad2c8-60c1-40d5-855c-7deeb749f0fe namespace which is not needed anymore
Jan 26 08:47:07 compute-1 nova_compute[183083]: 2026-01-26 08:47:07.468 183087 INFO nova.compute.manager [None req-1203d985-68e2-4a08-b67e-6988cd130708 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Took 0.84 seconds to destroy the instance on the hypervisor.
Jan 26 08:47:07 compute-1 nova_compute[183083]: 2026-01-26 08:47:07.469 183087 DEBUG oslo.service.loopingcall [None req-1203d985-68e2-4a08-b67e-6988cd130708 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 26 08:47:07 compute-1 nova_compute[183083]: 2026-01-26 08:47:07.469 183087 DEBUG nova.compute.manager [-] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:47:07 compute-1 nova_compute[183083]: 2026-01-26 08:47:07.469 183087 DEBUG nova.network.neutron [-] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:47:07 compute-1 neutron-haproxy-ovnmeta-410ad2c8-60c1-40d5-855c-7deeb749f0fe[213234]: [NOTICE]   (213238) : haproxy version is 2.8.14-c23fe91
Jan 26 08:47:07 compute-1 neutron-haproxy-ovnmeta-410ad2c8-60c1-40d5-855c-7deeb749f0fe[213234]: [NOTICE]   (213238) : path to executable is /usr/sbin/haproxy
Jan 26 08:47:07 compute-1 neutron-haproxy-ovnmeta-410ad2c8-60c1-40d5-855c-7deeb749f0fe[213234]: [WARNING]  (213238) : Exiting Master process...
Jan 26 08:47:07 compute-1 neutron-haproxy-ovnmeta-410ad2c8-60c1-40d5-855c-7deeb749f0fe[213234]: [WARNING]  (213238) : Exiting Master process...
Jan 26 08:47:07 compute-1 neutron-haproxy-ovnmeta-410ad2c8-60c1-40d5-855c-7deeb749f0fe[213234]: [ALERT]    (213238) : Current worker (213240) exited with code 143 (Terminated)
Jan 26 08:47:07 compute-1 neutron-haproxy-ovnmeta-410ad2c8-60c1-40d5-855c-7deeb749f0fe[213234]: [WARNING]  (213238) : All workers exited. Exiting... (0)
Jan 26 08:47:07 compute-1 systemd[1]: libpod-9898a7d0a553e098aa126af9aa87911661805ac1072fac374f8918ebe28d79c4.scope: Deactivated successfully.
Jan 26 08:47:07 compute-1 podman[214308]: 2026-01-26 08:47:07.578303473 +0000 UTC m=+0.044029340 container died 9898a7d0a553e098aa126af9aa87911661805ac1072fac374f8918ebe28d79c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-410ad2c8-60c1-40d5-855c-7deeb749f0fe, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 26 08:47:07 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9898a7d0a553e098aa126af9aa87911661805ac1072fac374f8918ebe28d79c4-userdata-shm.mount: Deactivated successfully.
Jan 26 08:47:07 compute-1 systemd[1]: var-lib-containers-storage-overlay-7bf5f2fe0ea35302d45407473bfbbe722823e2594571d7ebf1abdfb33ec8332f-merged.mount: Deactivated successfully.
Jan 26 08:47:07 compute-1 podman[214308]: 2026-01-26 08:47:07.715736252 +0000 UTC m=+0.181462169 container cleanup 9898a7d0a553e098aa126af9aa87911661805ac1072fac374f8918ebe28d79c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-410ad2c8-60c1-40d5-855c-7deeb749f0fe, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 26 08:47:07 compute-1 systemd[1]: libpod-conmon-9898a7d0a553e098aa126af9aa87911661805ac1072fac374f8918ebe28d79c4.scope: Deactivated successfully.
Jan 26 08:47:07 compute-1 nova_compute[183083]: 2026-01-26 08:47:07.728 183087 DEBUG oslo_concurrency.lockutils [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Acquiring lock "95882579-9646-426f-9499-78b4305abe99" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:07 compute-1 nova_compute[183083]: 2026-01-26 08:47:07.729 183087 DEBUG oslo_concurrency.lockutils [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "95882579-9646-426f-9499-78b4305abe99" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:07 compute-1 nova_compute[183083]: 2026-01-26 08:47:07.773 183087 DEBUG nova.compute.manager [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:47:07 compute-1 podman[214340]: 2026-01-26 08:47:07.818860778 +0000 UTC m=+0.077536921 container remove 9898a7d0a553e098aa126af9aa87911661805ac1072fac374f8918ebe28d79c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-410ad2c8-60c1-40d5-855c-7deeb749f0fe, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 08:47:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:07.825 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[eddab3d8-b859-4934-ba13-98b180bb9aef]: (4, ('Mon Jan 26 08:47:07 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-410ad2c8-60c1-40d5-855c-7deeb749f0fe (9898a7d0a553e098aa126af9aa87911661805ac1072fac374f8918ebe28d79c4)\n9898a7d0a553e098aa126af9aa87911661805ac1072fac374f8918ebe28d79c4\nMon Jan 26 08:47:07 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-410ad2c8-60c1-40d5-855c-7deeb749f0fe (9898a7d0a553e098aa126af9aa87911661805ac1072fac374f8918ebe28d79c4)\n9898a7d0a553e098aa126af9aa87911661805ac1072fac374f8918ebe28d79c4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:07.827 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[ac47da52-e8c5-4f7c-973a-a80ca48e8664]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:07.828 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap410ad2c8-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:47:07 compute-1 nova_compute[183083]: 2026-01-26 08:47:07.830 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:07 compute-1 kernel: tap410ad2c8-60: left promiscuous mode
Jan 26 08:47:07 compute-1 nova_compute[183083]: 2026-01-26 08:47:07.856 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:07.859 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[e85d8644-c9fa-43e5-9b65-76e29d401d04]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:07.871 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[add9d016-c38f-48c2-a775-2d00ae410198]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:07.872 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[bfc49282-7ddc-4713-bd89-275bf08f0992]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:07 compute-1 nova_compute[183083]: 2026-01-26 08:47:07.881 183087 DEBUG oslo_concurrency.lockutils [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:07 compute-1 nova_compute[183083]: 2026-01-26 08:47:07.882 183087 DEBUG oslo_concurrency.lockutils [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:07 compute-1 nova_compute[183083]: 2026-01-26 08:47:07.888 183087 DEBUG nova.virt.hardware [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:47:07 compute-1 nova_compute[183083]: 2026-01-26 08:47:07.889 183087 INFO nova.compute.claims [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:47:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:07.891 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[ade29786-0fb9-4c1e-9407-e3aed0748194]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347776, 'reachable_time': 29846, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214355, 'error': None, 'target': 'ovnmeta-410ad2c8-60c1-40d5-855c-7deeb749f0fe', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:07.893 105024 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-410ad2c8-60c1-40d5-855c-7deeb749f0fe deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 26 08:47:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:07.894 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[418fdd44-58a4-4b77-abf3-60f019acb2f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:08 compute-1 nova_compute[183083]: 2026-01-26 08:47:08.049 183087 DEBUG nova.compute.provider_tree [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:47:08 compute-1 nova_compute[183083]: 2026-01-26 08:47:08.071 183087 DEBUG nova.scheduler.client.report [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:47:08 compute-1 nova_compute[183083]: 2026-01-26 08:47:08.104 183087 DEBUG oslo_concurrency.lockutils [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.222s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:08 compute-1 nova_compute[183083]: 2026-01-26 08:47:08.105 183087 DEBUG nova.compute.manager [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:47:08 compute-1 systemd[1]: run-netns-ovnmeta\x2d410ad2c8\x2d60c1\x2d40d5\x2d855c\x2d7deeb749f0fe.mount: Deactivated successfully.
Jan 26 08:47:08 compute-1 nova_compute[183083]: 2026-01-26 08:47:08.234 183087 DEBUG nova.compute.manager [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:47:08 compute-1 nova_compute[183083]: 2026-01-26 08:47:08.235 183087 DEBUG nova.network.neutron [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:47:08 compute-1 nova_compute[183083]: 2026-01-26 08:47:08.336 183087 INFO nova.virt.libvirt.driver [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:47:08 compute-1 nova_compute[183083]: 2026-01-26 08:47:08.366 183087 DEBUG nova.compute.manager [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:47:08 compute-1 nova_compute[183083]: 2026-01-26 08:47:08.546 183087 DEBUG nova.policy [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a7abeebb4e4d469c91e6cee77f6be1c3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b71ae2b9d2fd454b8b3b9aa1a0e5c7e4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:47:08 compute-1 nova_compute[183083]: 2026-01-26 08:47:08.604 183087 DEBUG nova.compute.manager [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:47:08 compute-1 nova_compute[183083]: 2026-01-26 08:47:08.605 183087 DEBUG nova.virt.libvirt.driver [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:47:08 compute-1 nova_compute[183083]: 2026-01-26 08:47:08.606 183087 INFO nova.virt.libvirt.driver [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] Creating image(s)
Jan 26 08:47:08 compute-1 nova_compute[183083]: 2026-01-26 08:47:08.606 183087 DEBUG oslo_concurrency.lockutils [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Acquiring lock "/var/lib/nova/instances/95882579-9646-426f-9499-78b4305abe99/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:08 compute-1 nova_compute[183083]: 2026-01-26 08:47:08.607 183087 DEBUG oslo_concurrency.lockutils [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "/var/lib/nova/instances/95882579-9646-426f-9499-78b4305abe99/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:08 compute-1 nova_compute[183083]: 2026-01-26 08:47:08.607 183087 DEBUG oslo_concurrency.lockutils [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "/var/lib/nova/instances/95882579-9646-426f-9499-78b4305abe99/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:08 compute-1 nova_compute[183083]: 2026-01-26 08:47:08.608 183087 DEBUG oslo_concurrency.lockutils [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:08 compute-1 nova_compute[183083]: 2026-01-26 08:47:08.608 183087 DEBUG oslo_concurrency.lockutils [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:09 compute-1 nova_compute[183083]: 2026-01-26 08:47:09.688 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.048 183087 DEBUG oslo_concurrency.lockutils [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.440s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] Instance failed to spawn: nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99] Traceback (most recent call last):
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 170, in do_image_deep_inspection
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99]     raise exception.ImageUnacceptable(
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image content does not match disk_format
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99] 
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99] During handling of the above exception, another exception occurred:
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99] 
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99] Traceback (most recent call last):
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99]     yield resources
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99]     self.driver.spawn(context, instance, image_meta,
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4384, in spawn
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99]     created_instance_dir, created_disks = self._create_image(
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4786, in _create_image
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99]     created_disks = self._create_and_inject_local_root(
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4917, in _create_and_inject_local_root
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99]     self._try_fetch_image_cache(backend, fetch_func, context,
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 10963, in _try_fetch_image_cache
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99]     image.cache(fetch_func=fetch_func,
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 288, in cache
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99]     self.create_image(fetch_func_sync, base, size,
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 663, in create_image
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99]     prepare_template(target=base, *args, **kwargs)
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99]   File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99]     return f(*args, **kwargs)
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 285, in fetch_func_sync
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99]     fetch_func(target=target, *args, **kwargs)
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/utils.py", line 482, in fetch_image
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99]     images.fetch_to_raw(context, image_id, target, trusted_certs)
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 199, in fetch_to_raw
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99]     force_format = do_image_deep_inspection(img, image_href, path_tmp)
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 180, in do_image_deep_inspection
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99]     raise exception.ImageUnacceptable(
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.049 183087 ERROR nova.compute.manager [instance: 95882579-9646-426f-9499-78b4305abe99] 
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.527 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:10 compute-1 nova_compute[183083]: 2026-01-26 08:47:10.806 183087 DEBUG nova.network.neutron [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] Successfully created port: b472601f-9e90-40b2-9847-0b3e1a7a4d99 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 08:47:11 compute-1 nova_compute[183083]: 2026-01-26 08:47:11.051 183087 DEBUG nova.compute.manager [req-85456790-05ff-486d-baa1-29da6be49b1d req-35704f17-4062-4ec9-aa0e-2dd6bbec1d29 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Received event network-vif-plugged-30b45c93-b3bb-44e6-8e4a-6903a631c773 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:47:11 compute-1 nova_compute[183083]: 2026-01-26 08:47:11.052 183087 DEBUG oslo_concurrency.lockutils [req-85456790-05ff-486d-baa1-29da6be49b1d req-35704f17-4062-4ec9-aa0e-2dd6bbec1d29 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:11 compute-1 nova_compute[183083]: 2026-01-26 08:47:11.052 183087 DEBUG oslo_concurrency.lockutils [req-85456790-05ff-486d-baa1-29da6be49b1d req-35704f17-4062-4ec9-aa0e-2dd6bbec1d29 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:11 compute-1 nova_compute[183083]: 2026-01-26 08:47:11.053 183087 DEBUG oslo_concurrency.lockutils [req-85456790-05ff-486d-baa1-29da6be49b1d req-35704f17-4062-4ec9-aa0e-2dd6bbec1d29 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:11 compute-1 nova_compute[183083]: 2026-01-26 08:47:11.053 183087 DEBUG nova.compute.manager [req-85456790-05ff-486d-baa1-29da6be49b1d req-35704f17-4062-4ec9-aa0e-2dd6bbec1d29 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] No waiting events found dispatching network-vif-plugged-30b45c93-b3bb-44e6-8e4a-6903a631c773 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:47:11 compute-1 nova_compute[183083]: 2026-01-26 08:47:11.053 183087 WARNING nova.compute.manager [req-85456790-05ff-486d-baa1-29da6be49b1d req-35704f17-4062-4ec9-aa0e-2dd6bbec1d29 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Received unexpected event network-vif-plugged-30b45c93-b3bb-44e6-8e4a-6903a631c773 for instance with vm_state active and task_state deleting.
Jan 26 08:47:11 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:11.306 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8d:c4:3f'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-9d2c9283-4c3d-4a45-a920-3cc48d5615dc', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9d2c9283-4c3d-4a45-a920-3cc48d5615dc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a3694415e0ac483fa070e7316b146fc1', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=de20ec35-7108-465a-a8b0-4f26ef1c24ed, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=17c08760-1e61-4068-a6f8-c1529ba343f1) old=Port_Binding(mac=['fa:16:3e:8d:c4:3f 10.10.2.2'], external_ids={'neutron:cidrs': '10.10.2.2/24', 'neutron:device_id': 'ovnmeta-9d2c9283-4c3d-4a45-a920-3cc48d5615dc', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9d2c9283-4c3d-4a45-a920-3cc48d5615dc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a3694415e0ac483fa070e7316b146fc1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:47:11 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:11.308 104632 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 17c08760-1e61-4068-a6f8-c1529ba343f1 in datapath 9d2c9283-4c3d-4a45-a920-3cc48d5615dc updated
Jan 26 08:47:11 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:11.311 104632 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 9d2c9283-4c3d-4a45-a920-3cc48d5615dc or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 26 08:47:11 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:11.312 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[b1f096f9-8b17-45c3-b6bd-5b397f22faad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:11 compute-1 nova_compute[183083]: 2026-01-26 08:47:11.616 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:12 compute-1 nova_compute[183083]: 2026-01-26 08:47:12.015 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:12 compute-1 nova_compute[183083]: 2026-01-26 08:47:12.588 183087 DEBUG nova.network.neutron [-] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:47:12 compute-1 nova_compute[183083]: 2026-01-26 08:47:12.629 183087 INFO nova.compute.manager [-] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Took 5.16 seconds to deallocate network for instance.
Jan 26 08:47:12 compute-1 nova_compute[183083]: 2026-01-26 08:47:12.678 183087 DEBUG oslo_concurrency.lockutils [None req-1203d985-68e2-4a08-b67e-6988cd130708 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:12 compute-1 nova_compute[183083]: 2026-01-26 08:47:12.679 183087 DEBUG oslo_concurrency.lockutils [None req-1203d985-68e2-4a08-b67e-6988cd130708 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:12 compute-1 nova_compute[183083]: 2026-01-26 08:47:12.803 183087 DEBUG nova.compute.provider_tree [None req-1203d985-68e2-4a08-b67e-6988cd130708 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:47:12 compute-1 nova_compute[183083]: 2026-01-26 08:47:12.819 183087 DEBUG nova.scheduler.client.report [None req-1203d985-68e2-4a08-b67e-6988cd130708 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:47:12 compute-1 nova_compute[183083]: 2026-01-26 08:47:12.845 183087 DEBUG oslo_concurrency.lockutils [None req-1203d985-68e2-4a08-b67e-6988cd130708 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:12 compute-1 nova_compute[183083]: 2026-01-26 08:47:12.866 183087 INFO nova.scheduler.client.report [None req-1203d985-68e2-4a08-b67e-6988cd130708 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Deleted allocations for instance 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8
Jan 26 08:47:12 compute-1 nova_compute[183083]: 2026-01-26 08:47:12.959 183087 DEBUG oslo_concurrency.lockutils [None req-1203d985-68e2-4a08-b67e-6988cd130708 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.336s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:12 compute-1 nova_compute[183083]: 2026-01-26 08:47:12.964 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:13 compute-1 nova_compute[183083]: 2026-01-26 08:47:13.177 183087 DEBUG nova.compute.manager [req-4c0a51bc-0472-4b6d-8c5a-844ef391aef7 req-a7600a4e-3d0f-46b5-a625-872f9d4926e5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Received event network-vif-unplugged-71bea873-e0d4-4d4e-b05e-ff7415434c11 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:47:13 compute-1 nova_compute[183083]: 2026-01-26 08:47:13.178 183087 DEBUG oslo_concurrency.lockutils [req-4c0a51bc-0472-4b6d-8c5a-844ef391aef7 req-a7600a4e-3d0f-46b5-a625-872f9d4926e5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:13 compute-1 nova_compute[183083]: 2026-01-26 08:47:13.179 183087 DEBUG oslo_concurrency.lockutils [req-4c0a51bc-0472-4b6d-8c5a-844ef391aef7 req-a7600a4e-3d0f-46b5-a625-872f9d4926e5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:13 compute-1 nova_compute[183083]: 2026-01-26 08:47:13.180 183087 DEBUG oslo_concurrency.lockutils [req-4c0a51bc-0472-4b6d-8c5a-844ef391aef7 req-a7600a4e-3d0f-46b5-a625-872f9d4926e5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:13 compute-1 nova_compute[183083]: 2026-01-26 08:47:13.180 183087 DEBUG nova.compute.manager [req-4c0a51bc-0472-4b6d-8c5a-844ef391aef7 req-a7600a4e-3d0f-46b5-a625-872f9d4926e5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] No waiting events found dispatching network-vif-unplugged-71bea873-e0d4-4d4e-b05e-ff7415434c11 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:47:13 compute-1 nova_compute[183083]: 2026-01-26 08:47:13.181 183087 WARNING nova.compute.manager [req-4c0a51bc-0472-4b6d-8c5a-844ef391aef7 req-a7600a4e-3d0f-46b5-a625-872f9d4926e5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Received unexpected event network-vif-unplugged-71bea873-e0d4-4d4e-b05e-ff7415434c11 for instance with vm_state deleted and task_state None.
Jan 26 08:47:13 compute-1 nova_compute[183083]: 2026-01-26 08:47:13.181 183087 DEBUG nova.compute.manager [req-4c0a51bc-0472-4b6d-8c5a-844ef391aef7 req-a7600a4e-3d0f-46b5-a625-872f9d4926e5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Received event network-vif-plugged-71bea873-e0d4-4d4e-b05e-ff7415434c11 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:47:13 compute-1 nova_compute[183083]: 2026-01-26 08:47:13.182 183087 DEBUG oslo_concurrency.lockutils [req-4c0a51bc-0472-4b6d-8c5a-844ef391aef7 req-a7600a4e-3d0f-46b5-a625-872f9d4926e5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:13 compute-1 nova_compute[183083]: 2026-01-26 08:47:13.182 183087 DEBUG oslo_concurrency.lockutils [req-4c0a51bc-0472-4b6d-8c5a-844ef391aef7 req-a7600a4e-3d0f-46b5-a625-872f9d4926e5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:13 compute-1 nova_compute[183083]: 2026-01-26 08:47:13.183 183087 DEBUG oslo_concurrency.lockutils [req-4c0a51bc-0472-4b6d-8c5a-844ef391aef7 req-a7600a4e-3d0f-46b5-a625-872f9d4926e5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "7fb5f66c-db87-49bd-8c08-1c21b7ea58e8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:13 compute-1 nova_compute[183083]: 2026-01-26 08:47:13.183 183087 DEBUG nova.compute.manager [req-4c0a51bc-0472-4b6d-8c5a-844ef391aef7 req-a7600a4e-3d0f-46b5-a625-872f9d4926e5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] No waiting events found dispatching network-vif-plugged-71bea873-e0d4-4d4e-b05e-ff7415434c11 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:47:13 compute-1 nova_compute[183083]: 2026-01-26 08:47:13.184 183087 WARNING nova.compute.manager [req-4c0a51bc-0472-4b6d-8c5a-844ef391aef7 req-a7600a4e-3d0f-46b5-a625-872f9d4926e5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Received unexpected event network-vif-plugged-71bea873-e0d4-4d4e-b05e-ff7415434c11 for instance with vm_state deleted and task_state None.
Jan 26 08:47:13 compute-1 nova_compute[183083]: 2026-01-26 08:47:13.350 183087 DEBUG nova.network.neutron [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] Successfully updated port: b472601f-9e90-40b2-9847-0b3e1a7a4d99 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:47:13 compute-1 nova_compute[183083]: 2026-01-26 08:47:13.369 183087 DEBUG oslo_concurrency.lockutils [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Acquiring lock "refresh_cache-95882579-9646-426f-9499-78b4305abe99" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:47:13 compute-1 nova_compute[183083]: 2026-01-26 08:47:13.370 183087 DEBUG oslo_concurrency.lockutils [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Acquired lock "refresh_cache-95882579-9646-426f-9499-78b4305abe99" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:47:13 compute-1 nova_compute[183083]: 2026-01-26 08:47:13.370 183087 DEBUG nova.network.neutron [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:47:13 compute-1 nova_compute[183083]: 2026-01-26 08:47:13.612 183087 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769417218.6106327, f2de61ba-4b20-445c-bdbb-44bb79fb58c3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:47:13 compute-1 nova_compute[183083]: 2026-01-26 08:47:13.612 183087 INFO nova.compute.manager [-] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] VM Stopped (Lifecycle Event)
Jan 26 08:47:13 compute-1 nova_compute[183083]: 2026-01-26 08:47:13.634 183087 DEBUG nova.compute.manager [None req-57607a19-6201-4636-8e0f-35933c06c9bc - - - - - -] [instance: f2de61ba-4b20-445c-bdbb-44bb79fb58c3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:47:13 compute-1 nova_compute[183083]: 2026-01-26 08:47:13.893 183087 DEBUG nova.network.neutron [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:47:14 compute-1 nova_compute[183083]: 2026-01-26 08:47:14.691 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.147 183087 DEBUG nova.network.neutron [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] Updating instance_info_cache with network_info: [{"id": "b472601f-9e90-40b2-9847-0b3e1a7a4d99", "address": "fa:16:3e:7c:63:d0", "network": {"id": "badf05d3-4d1f-4b97-ad5f-3ca57765729f", "bridge": "br-int", "label": "tempest-test-network--227464583", "subnets": [{"cidr": "192.168.4.0/24", "dns": [], "gateway": {"address": "192.168.4.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.4.56", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b71ae2b9d2fd454b8b3b9aa1a0e5c7e4", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb472601f-9e", "ovs_interfaceid": "b472601f-9e90-40b2-9847-0b3e1a7a4d99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.170 183087 DEBUG oslo_concurrency.lockutils [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Releasing lock "refresh_cache-95882579-9646-426f-9499-78b4305abe99" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.170 183087 DEBUG nova.compute.manager [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] Instance network_info: |[{"id": "b472601f-9e90-40b2-9847-0b3e1a7a4d99", "address": "fa:16:3e:7c:63:d0", "network": {"id": "badf05d3-4d1f-4b97-ad5f-3ca57765729f", "bridge": "br-int", "label": "tempest-test-network--227464583", "subnets": [{"cidr": "192.168.4.0/24", "dns": [], "gateway": {"address": "192.168.4.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.4.56", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b71ae2b9d2fd454b8b3b9aa1a0e5c7e4", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb472601f-9e", "ovs_interfaceid": "b472601f-9e90-40b2-9847-0b3e1a7a4d99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.171 183087 INFO nova.compute.manager [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] Terminating instance
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.173 183087 DEBUG nova.compute.manager [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.177 183087 DEBUG nova.virt.libvirt.driver [-] [instance: 95882579-9646-426f-9499-78b4305abe99] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.178 183087 INFO nova.virt.libvirt.driver [-] [instance: 95882579-9646-426f-9499-78b4305abe99] Instance destroyed successfully.
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.179 183087 DEBUG nova.virt.libvirt.vif [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:47:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_dscp_marking_tenant_network-243479984',display_name='tempest-test_dscp_marking_tenant_network-243479984',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-dscp-marking-tenant-network-243479984',id=22,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHVjp+2pOh+xYUkttf/EHrrYAH3LBOn+IKLzf3fiQpiaJslqkY+OmJn6bfd2cX/NEPdTL45qAcY0Zt6OwZRQXbHCoOcvnydr7uXjZCoGXOxoNL1bEhwXU4AaOmmyDzyYAA==',key_name='tempest-keypair-test-1026532318',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b71ae2b9d2fd454b8b3b9aa1a0e5c7e4',ramdisk_id='',reservation_id='r-h00pnr7a',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestCommon-374727467',owner_user_name='tempest-QosTestCommon-374727467-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:47:08Z,user_data=None,user_id='a7abeebb4e4d469c91e6cee77f6be1c3',uuid=95882579-9646-426f-9499-78b4305abe99,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b472601f-9e90-40b2-9847-0b3e1a7a4d99", "address": "fa:16:3e:7c:63:d0", "network": {"id": "badf05d3-4d1f-4b97-ad5f-3ca57765729f", "bridge": "br-int", "label": "tempest-test-network--227464583", "subnets": [{"cidr": "192.168.4.0/24", "dns": [], "gateway": {"address": "192.168.4.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.4.56", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b71ae2b9d2fd454b8b3b9aa1a0e5c7e4", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb472601f-9e", "ovs_interfaceid": "b472601f-9e90-40b2-9847-0b3e1a7a4d99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.179 183087 DEBUG nova.network.os_vif_util [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Converting VIF {"id": "b472601f-9e90-40b2-9847-0b3e1a7a4d99", "address": "fa:16:3e:7c:63:d0", "network": {"id": "badf05d3-4d1f-4b97-ad5f-3ca57765729f", "bridge": "br-int", "label": "tempest-test-network--227464583", "subnets": [{"cidr": "192.168.4.0/24", "dns": [], "gateway": {"address": "192.168.4.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.4.56", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b71ae2b9d2fd454b8b3b9aa1a0e5c7e4", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb472601f-9e", "ovs_interfaceid": "b472601f-9e90-40b2-9847-0b3e1a7a4d99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.180 183087 DEBUG nova.network.os_vif_util [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:63:d0,bridge_name='br-int',has_traffic_filtering=True,id=b472601f-9e90-40b2-9847-0b3e1a7a4d99,network=Network(badf05d3-4d1f-4b97-ad5f-3ca57765729f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb472601f-9e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.180 183087 DEBUG os_vif [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:63:d0,bridge_name='br-int',has_traffic_filtering=True,id=b472601f-9e90-40b2-9847-0b3e1a7a4d99,network=Network(badf05d3-4d1f-4b97-ad5f-3ca57765729f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb472601f-9e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.182 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.183 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb472601f-9e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.183 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.185 183087 INFO os_vif [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:63:d0,bridge_name='br-int',has_traffic_filtering=True,id=b472601f-9e90-40b2-9847-0b3e1a7a4d99,network=Network(badf05d3-4d1f-4b97-ad5f-3ca57765729f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb472601f-9e')
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.186 183087 INFO nova.virt.libvirt.driver [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] Deleting instance files /var/lib/nova/instances/95882579-9646-426f-9499-78b4305abe99_del
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.187 183087 INFO nova.virt.libvirt.driver [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] Deletion of /var/lib/nova/instances/95882579-9646-426f-9499-78b4305abe99_del complete
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.239 183087 INFO nova.compute.manager [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] Took 0.07 seconds to destroy the instance on the hypervisor.
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.240 183087 DEBUG nova.compute.claims [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] Aborting claim: <nova.compute.claims.Claim object at 0x7f6c982337c0> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.240 183087 DEBUG oslo_concurrency.lockutils [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.241 183087 DEBUG oslo_concurrency.lockutils [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.290 183087 DEBUG nova.compute.manager [req-ad396c70-ed70-4002-8ea6-31f04716e4a1 req-265f6133-fa84-4509-8fbe-a0147ffcf98d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] Received event network-changed-b472601f-9e90-40b2-9847-0b3e1a7a4d99 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.291 183087 DEBUG nova.compute.manager [req-ad396c70-ed70-4002-8ea6-31f04716e4a1 req-265f6133-fa84-4509-8fbe-a0147ffcf98d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] Refreshing instance network info cache due to event network-changed-b472601f-9e90-40b2-9847-0b3e1a7a4d99. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.291 183087 DEBUG oslo_concurrency.lockutils [req-ad396c70-ed70-4002-8ea6-31f04716e4a1 req-265f6133-fa84-4509-8fbe-a0147ffcf98d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-95882579-9646-426f-9499-78b4305abe99" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.291 183087 DEBUG oslo_concurrency.lockutils [req-ad396c70-ed70-4002-8ea6-31f04716e4a1 req-265f6133-fa84-4509-8fbe-a0147ffcf98d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-95882579-9646-426f-9499-78b4305abe99" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.292 183087 DEBUG nova.network.neutron [req-ad396c70-ed70-4002-8ea6-31f04716e4a1 req-265f6133-fa84-4509-8fbe-a0147ffcf98d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] Refreshing network info cache for port b472601f-9e90-40b2-9847-0b3e1a7a4d99 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.346 183087 DEBUG nova.compute.provider_tree [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.366 183087 DEBUG nova.scheduler.client.report [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.387 183087 DEBUG oslo_concurrency.lockutils [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.388 183087 DEBUG nova.compute.utils [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.388 183087 ERROR nova.compute.manager [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] Build of instance 95882579-9646-426f-9499-78b4305abe99 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format: nova.exception.BuildAbortException: Build of instance 95882579-9646-426f-9499-78b4305abe99 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.389 183087 DEBUG nova.compute.manager [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.390 183087 DEBUG nova.virt.libvirt.vif [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-26T08:47:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_dscp_marking_tenant_network-243479984',display_name='tempest-test_dscp_marking_tenant_network-243479984',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host=None,hostname='tempest-test-dscp-marking-tenant-network-243479984',id=22,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHVjp+2pOh+xYUkttf/EHrrYAH3LBOn+IKLzf3fiQpiaJslqkY+OmJn6bfd2cX/NEPdTL45qAcY0Zt6OwZRQXbHCoOcvnydr7uXjZCoGXOxoNL1bEhwXU4AaOmmyDzyYAA==',key_name='tempest-keypair-test-1026532318',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node=None,numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b71ae2b9d2fd454b8b3b9aa1a0e5c7e4',ramdisk_id='',reservation_id='r-h00pnr7a',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestCommon-374727467',owner_user_name='tempest-QosTestCommon-374727467-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:47:15Z,user_data=None,user_id='a7abeebb4e4d469c91e6cee77f6be1c3',uuid=95882579-9646-426f-9499-78b4305abe99,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b472601f-9e90-40b2-9847-0b3e1a7a4d99", "address": "fa:16:3e:7c:63:d0", "network": {"id": "badf05d3-4d1f-4b97-ad5f-3ca57765729f", "bridge": "br-int", "label": "tempest-test-network--227464583", "subnets": [{"cidr": "192.168.4.0/24", "dns": [], "gateway": {"address": "192.168.4.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.4.56", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b71ae2b9d2fd454b8b3b9aa1a0e5c7e4", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb472601f-9e", "ovs_interfaceid": "b472601f-9e90-40b2-9847-0b3e1a7a4d99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.390 183087 DEBUG nova.network.os_vif_util [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Converting VIF {"id": "b472601f-9e90-40b2-9847-0b3e1a7a4d99", "address": "fa:16:3e:7c:63:d0", "network": {"id": "badf05d3-4d1f-4b97-ad5f-3ca57765729f", "bridge": "br-int", "label": "tempest-test-network--227464583", "subnets": [{"cidr": "192.168.4.0/24", "dns": [], "gateway": {"address": "192.168.4.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.4.56", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b71ae2b9d2fd454b8b3b9aa1a0e5c7e4", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb472601f-9e", "ovs_interfaceid": "b472601f-9e90-40b2-9847-0b3e1a7a4d99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.390 183087 DEBUG nova.network.os_vif_util [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:63:d0,bridge_name='br-int',has_traffic_filtering=True,id=b472601f-9e90-40b2-9847-0b3e1a7a4d99,network=Network(badf05d3-4d1f-4b97-ad5f-3ca57765729f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb472601f-9e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.391 183087 DEBUG os_vif [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:63:d0,bridge_name='br-int',has_traffic_filtering=True,id=b472601f-9e90-40b2-9847-0b3e1a7a4d99,network=Network(badf05d3-4d1f-4b97-ad5f-3ca57765729f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb472601f-9e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.393 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.393 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb472601f-9e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.394 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.395 183087 INFO os_vif [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:63:d0,bridge_name='br-int',has_traffic_filtering=True,id=b472601f-9e90-40b2-9847-0b3e1a7a4d99,network=Network(badf05d3-4d1f-4b97-ad5f-3ca57765729f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb472601f-9e')
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.396 183087 DEBUG nova.compute.manager [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.396 183087 DEBUG nova.compute.manager [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:47:15 compute-1 nova_compute[183083]: 2026-01-26 08:47:15.396 183087 DEBUG nova.network.neutron [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:47:16 compute-1 nova_compute[183083]: 2026-01-26 08:47:16.188 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:16 compute-1 nova_compute[183083]: 2026-01-26 08:47:16.414 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:17 compute-1 nova_compute[183083]: 2026-01-26 08:47:17.038 183087 DEBUG nova.network.neutron [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:47:17 compute-1 nova_compute[183083]: 2026-01-26 08:47:17.050 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:17 compute-1 nova_compute[183083]: 2026-01-26 08:47:17.060 183087 INFO nova.compute.manager [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] Took 1.66 seconds to deallocate network for instance.
Jan 26 08:47:17 compute-1 nova_compute[183083]: 2026-01-26 08:47:17.219 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:17 compute-1 nova_compute[183083]: 2026-01-26 08:47:17.258 183087 INFO nova.scheduler.client.report [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Deleted allocations for instance 95882579-9646-426f-9499-78b4305abe99
Jan 26 08:47:17 compute-1 nova_compute[183083]: 2026-01-26 08:47:17.258 183087 DEBUG oslo_concurrency.lockutils [None req-202bf173-03d9-4a16-8c99-eef229aed01f a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "95882579-9646-426f-9499-78b4305abe99" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.530s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:18 compute-1 nova_compute[183083]: 2026-01-26 08:47:18.454 183087 DEBUG nova.network.neutron [req-ad396c70-ed70-4002-8ea6-31f04716e4a1 req-265f6133-fa84-4509-8fbe-a0147ffcf98d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] Updated VIF entry in instance network info cache for port b472601f-9e90-40b2-9847-0b3e1a7a4d99. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:47:18 compute-1 nova_compute[183083]: 2026-01-26 08:47:18.455 183087 DEBUG nova.network.neutron [req-ad396c70-ed70-4002-8ea6-31f04716e4a1 req-265f6133-fa84-4509-8fbe-a0147ffcf98d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 95882579-9646-426f-9499-78b4305abe99] Updating instance_info_cache with network_info: [{"id": "b472601f-9e90-40b2-9847-0b3e1a7a4d99", "address": "fa:16:3e:7c:63:d0", "network": {"id": "badf05d3-4d1f-4b97-ad5f-3ca57765729f", "bridge": "br-int", "label": "tempest-test-network--227464583", "subnets": [{"cidr": "192.168.4.0/24", "dns": [], "gateway": {"address": "192.168.4.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.4.56", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b71ae2b9d2fd454b8b3b9aa1a0e5c7e4", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb472601f-9e", "ovs_interfaceid": "b472601f-9e90-40b2-9847-0b3e1a7a4d99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:47:18 compute-1 nova_compute[183083]: 2026-01-26 08:47:18.473 183087 DEBUG oslo_concurrency.lockutils [req-ad396c70-ed70-4002-8ea6-31f04716e4a1 req-265f6133-fa84-4509-8fbe-a0147ffcf98d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-95882579-9646-426f-9499-78b4305abe99" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:47:18 compute-1 podman[214356]: 2026-01-26 08:47:18.849886367 +0000 UTC m=+0.103131667 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 26 08:47:18 compute-1 podman[214357]: 2026-01-26 08:47:18.849031343 +0000 UTC m=+0.099988458 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=9.6, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-type=git, architecture=x86_64, distribution-scope=public)
Jan 26 08:47:19 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:19.495 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:35:c2 192.168.7.2 2001:7::f816:3eff:fe82:35c2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.7.2/24 2001:7::f816:3eff:fe82:35c2/64', 'neutron:device_id': 'ovnmeta-cefb6a0c-33c1-4909-ac56-bbfd65612a52', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cefb6a0c-33c1-4909-ac56-bbfd65612a52', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71cced1777f24868932d789154ff04a0', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d704a136-b8e5-4296-8e5d-118a96fdba90, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=2bec4fdb-0b3f-4aba-9774-1a8afabf6309) old=Port_Binding(mac=['fa:16:3e:82:35:c2 192.168.7.2'], external_ids={'neutron:cidrs': '192.168.7.2/24', 'neutron:device_id': 'ovnmeta-cefb6a0c-33c1-4909-ac56-bbfd65612a52', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cefb6a0c-33c1-4909-ac56-bbfd65612a52', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71cced1777f24868932d789154ff04a0', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:47:19 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:19.497 104632 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 2bec4fdb-0b3f-4aba-9774-1a8afabf6309 in datapath cefb6a0c-33c1-4909-ac56-bbfd65612a52 updated
Jan 26 08:47:19 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:19.501 104632 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cefb6a0c-33c1-4909-ac56-bbfd65612a52, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 26 08:47:19 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:19.502 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[67501bbb-56dc-4c5f-8055-5bce37f85614]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:19 compute-1 nova_compute[183083]: 2026-01-26 08:47:19.736 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:21 compute-1 nova_compute[183083]: 2026-01-26 08:47:21.778 183087 DEBUG oslo_concurrency.lockutils [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Acquiring lock "5ea5b3e6-d6ee-4984-b938-32f34a5c3307" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:21 compute-1 nova_compute[183083]: 2026-01-26 08:47:21.779 183087 DEBUG oslo_concurrency.lockutils [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Lock "5ea5b3e6-d6ee-4984-b938-32f34a5c3307" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:21 compute-1 nova_compute[183083]: 2026-01-26 08:47:21.805 183087 DEBUG nova.compute.manager [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:47:21 compute-1 nova_compute[183083]: 2026-01-26 08:47:21.880 183087 DEBUG oslo_concurrency.lockutils [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:21 compute-1 nova_compute[183083]: 2026-01-26 08:47:21.881 183087 DEBUG oslo_concurrency.lockutils [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:21 compute-1 nova_compute[183083]: 2026-01-26 08:47:21.891 183087 DEBUG nova.virt.hardware [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:47:21 compute-1 nova_compute[183083]: 2026-01-26 08:47:21.891 183087 INFO nova.compute.claims [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:47:21 compute-1 nova_compute[183083]: 2026-01-26 08:47:21.896 183087 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769417226.8962612, 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:47:21 compute-1 nova_compute[183083]: 2026-01-26 08:47:21.897 183087 INFO nova.compute.manager [-] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] VM Stopped (Lifecycle Event)
Jan 26 08:47:21 compute-1 nova_compute[183083]: 2026-01-26 08:47:21.935 183087 DEBUG nova.compute.manager [None req-c0bd65ea-81a6-4d4c-9fea-5978a853fc27 - - - - - -] [instance: 7fb5f66c-db87-49bd-8c08-1c21b7ea58e8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.018 183087 DEBUG nova.compute.provider_tree [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.034 183087 DEBUG nova.scheduler.client.report [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.060 183087 DEBUG oslo_concurrency.lockutils [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.060 183087 DEBUG nova.compute.manager [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.092 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.114 183087 DEBUG nova.compute.manager [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.114 183087 DEBUG nova.network.neutron [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.136 183087 INFO nova.virt.libvirt.driver [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.159 183087 DEBUG nova.compute.manager [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.284 183087 DEBUG nova.compute.manager [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.286 183087 DEBUG nova.virt.libvirt.driver [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.287 183087 INFO nova.virt.libvirt.driver [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Creating image(s)
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.287 183087 DEBUG oslo_concurrency.lockutils [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Acquiring lock "/var/lib/nova/instances/5ea5b3e6-d6ee-4984-b938-32f34a5c3307/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.288 183087 DEBUG oslo_concurrency.lockutils [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Lock "/var/lib/nova/instances/5ea5b3e6-d6ee-4984-b938-32f34a5c3307/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.289 183087 DEBUG oslo_concurrency.lockutils [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Lock "/var/lib/nova/instances/5ea5b3e6-d6ee-4984-b938-32f34a5c3307/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.312 183087 DEBUG oslo_concurrency.processutils [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.400 183087 DEBUG oslo_concurrency.processutils [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.402 183087 DEBUG oslo_concurrency.lockutils [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Acquiring lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.403 183087 DEBUG oslo_concurrency.lockutils [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.426 183087 DEBUG oslo_concurrency.processutils [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.506 183087 DEBUG oslo_concurrency.processutils [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.508 183087 DEBUG oslo_concurrency.processutils [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/5ea5b3e6-d6ee-4984-b938-32f34a5c3307/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.537 183087 DEBUG nova.policy [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6b6bdb69392c4caba079d21935c2fe08', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5cb899ce02444af2a1e102e390417350', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.562 183087 DEBUG oslo_concurrency.processutils [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/5ea5b3e6-d6ee-4984-b938-32f34a5c3307/disk 1073741824" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.564 183087 DEBUG oslo_concurrency.lockutils [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.564 183087 DEBUG oslo_concurrency.processutils [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.652 183087 DEBUG oslo_concurrency.processutils [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.654 183087 DEBUG nova.virt.disk.api [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Checking if we can resize image /var/lib/nova/instances/5ea5b3e6-d6ee-4984-b938-32f34a5c3307/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.654 183087 DEBUG oslo_concurrency.processutils [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ea5b3e6-d6ee-4984-b938-32f34a5c3307/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.759 183087 DEBUG oslo_concurrency.processutils [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ea5b3e6-d6ee-4984-b938-32f34a5c3307/disk --force-share --output=json" returned: 0 in 0.104s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.760 183087 DEBUG nova.virt.disk.api [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Cannot resize image /var/lib/nova/instances/5ea5b3e6-d6ee-4984-b938-32f34a5c3307/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.761 183087 DEBUG nova.objects.instance [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Lazy-loading 'migration_context' on Instance uuid 5ea5b3e6-d6ee-4984-b938-32f34a5c3307 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.783 183087 DEBUG nova.virt.libvirt.driver [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.784 183087 DEBUG nova.virt.libvirt.driver [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Ensure instance console log exists: /var/lib/nova/instances/5ea5b3e6-d6ee-4984-b938-32f34a5c3307/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.785 183087 DEBUG oslo_concurrency.lockutils [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.785 183087 DEBUG oslo_concurrency.lockutils [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:22 compute-1 nova_compute[183083]: 2026-01-26 08:47:22.786 183087 DEBUG oslo_concurrency.lockutils [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:23 compute-1 nova_compute[183083]: 2026-01-26 08:47:23.509 183087 DEBUG oslo_concurrency.lockutils [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Acquiring lock "b5659d91-1ee7-4f5f-a30e-2aa14d866ceb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:23 compute-1 nova_compute[183083]: 2026-01-26 08:47:23.509 183087 DEBUG oslo_concurrency.lockutils [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "b5659d91-1ee7-4f5f-a30e-2aa14d866ceb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:23 compute-1 nova_compute[183083]: 2026-01-26 08:47:23.534 183087 DEBUG nova.compute.manager [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:47:23 compute-1 nova_compute[183083]: 2026-01-26 08:47:23.550 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:23 compute-1 nova_compute[183083]: 2026-01-26 08:47:23.600 183087 DEBUG oslo_concurrency.lockutils [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:23 compute-1 nova_compute[183083]: 2026-01-26 08:47:23.600 183087 DEBUG oslo_concurrency.lockutils [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:23 compute-1 nova_compute[183083]: 2026-01-26 08:47:23.607 183087 DEBUG nova.virt.hardware [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:47:23 compute-1 nova_compute[183083]: 2026-01-26 08:47:23.608 183087 INFO nova.compute.claims [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:47:23 compute-1 nova_compute[183083]: 2026-01-26 08:47:23.714 183087 DEBUG nova.compute.provider_tree [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:47:23 compute-1 nova_compute[183083]: 2026-01-26 08:47:23.726 183087 DEBUG nova.scheduler.client.report [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:47:23 compute-1 nova_compute[183083]: 2026-01-26 08:47:23.747 183087 DEBUG oslo_concurrency.lockutils [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:23 compute-1 nova_compute[183083]: 2026-01-26 08:47:23.747 183087 DEBUG nova.compute.manager [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:47:23 compute-1 nova_compute[183083]: 2026-01-26 08:47:23.788 183087 DEBUG nova.compute.manager [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:47:23 compute-1 nova_compute[183083]: 2026-01-26 08:47:23.789 183087 DEBUG nova.network.neutron [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:47:23 compute-1 nova_compute[183083]: 2026-01-26 08:47:23.817 183087 INFO nova.virt.libvirt.driver [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:47:23 compute-1 nova_compute[183083]: 2026-01-26 08:47:23.937 183087 DEBUG nova.network.neutron [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Successfully updated port: 2fe37e18-bc67-418d-b6bb-db4ed40f6645 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:47:23 compute-1 nova_compute[183083]: 2026-01-26 08:47:23.938 183087 DEBUG nova.compute.manager [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:47:23 compute-1 nova_compute[183083]: 2026-01-26 08:47:23.969 183087 DEBUG oslo_concurrency.lockutils [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Acquiring lock "refresh_cache-5ea5b3e6-d6ee-4984-b938-32f34a5c3307" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:47:23 compute-1 nova_compute[183083]: 2026-01-26 08:47:23.969 183087 DEBUG oslo_concurrency.lockutils [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Acquired lock "refresh_cache-5ea5b3e6-d6ee-4984-b938-32f34a5c3307" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:47:23 compute-1 nova_compute[183083]: 2026-01-26 08:47:23.969 183087 DEBUG nova.network.neutron [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:47:24 compute-1 nova_compute[183083]: 2026-01-26 08:47:24.015 183087 DEBUG nova.compute.manager [req-d197d813-84ec-4b88-891f-455d1d5902b5 req-4513d8d2-d120-4719-a30a-7e5e604f2662 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Received event network-changed-2fe37e18-bc67-418d-b6bb-db4ed40f6645 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:47:24 compute-1 nova_compute[183083]: 2026-01-26 08:47:24.015 183087 DEBUG nova.compute.manager [req-d197d813-84ec-4b88-891f-455d1d5902b5 req-4513d8d2-d120-4719-a30a-7e5e604f2662 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Refreshing instance network info cache due to event network-changed-2fe37e18-bc67-418d-b6bb-db4ed40f6645. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:47:24 compute-1 nova_compute[183083]: 2026-01-26 08:47:24.015 183087 DEBUG oslo_concurrency.lockutils [req-d197d813-84ec-4b88-891f-455d1d5902b5 req-4513d8d2-d120-4719-a30a-7e5e604f2662 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-5ea5b3e6-d6ee-4984-b938-32f34a5c3307" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:47:24 compute-1 nova_compute[183083]: 2026-01-26 08:47:24.078 183087 DEBUG nova.compute.manager [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:47:24 compute-1 nova_compute[183083]: 2026-01-26 08:47:24.079 183087 DEBUG nova.virt.libvirt.driver [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:47:24 compute-1 nova_compute[183083]: 2026-01-26 08:47:24.079 183087 INFO nova.virt.libvirt.driver [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] Creating image(s)
Jan 26 08:47:24 compute-1 nova_compute[183083]: 2026-01-26 08:47:24.080 183087 DEBUG oslo_concurrency.lockutils [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Acquiring lock "/var/lib/nova/instances/b5659d91-1ee7-4f5f-a30e-2aa14d866ceb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:24 compute-1 nova_compute[183083]: 2026-01-26 08:47:24.080 183087 DEBUG oslo_concurrency.lockutils [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "/var/lib/nova/instances/b5659d91-1ee7-4f5f-a30e-2aa14d866ceb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:24 compute-1 nova_compute[183083]: 2026-01-26 08:47:24.081 183087 DEBUG oslo_concurrency.lockutils [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "/var/lib/nova/instances/b5659d91-1ee7-4f5f-a30e-2aa14d866ceb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:24 compute-1 nova_compute[183083]: 2026-01-26 08:47:24.081 183087 DEBUG oslo_concurrency.lockutils [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:24 compute-1 nova_compute[183083]: 2026-01-26 08:47:24.082 183087 DEBUG oslo_concurrency.lockutils [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:24 compute-1 nova_compute[183083]: 2026-01-26 08:47:24.186 183087 DEBUG nova.network.neutron [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:47:24 compute-1 nova_compute[183083]: 2026-01-26 08:47:24.337 183087 DEBUG nova.policy [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e29b04a4a66d43aaa5e5c4f38eeb59c4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5c33bd5e85114c868a4e91d997a5ceec', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:47:24 compute-1 nova_compute[183083]: 2026-01-26 08:47:24.739 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:24 compute-1 nova_compute[183083]: 2026-01-26 08:47:24.970 183087 DEBUG nova.network.neutron [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Updating instance_info_cache with network_info: [{"id": "2fe37e18-bc67-418d-b6bb-db4ed40f6645", "address": "fa:16:3e:80:c5:e2", "network": {"id": "4a023433-18d0-4d94-b2a4-84ce73a46ce0", "bridge": "br-int", "label": "tempest-internal-dns-test-shared-network-1072359363", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5cb899ce02444af2a1e102e390417350", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fe37e18-bc", "ovs_interfaceid": "2fe37e18-bc67-418d-b6bb-db4ed40f6645", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:47:24 compute-1 nova_compute[183083]: 2026-01-26 08:47:24.992 183087 DEBUG oslo_concurrency.lockutils [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Releasing lock "refresh_cache-5ea5b3e6-d6ee-4984-b938-32f34a5c3307" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:47:24 compute-1 nova_compute[183083]: 2026-01-26 08:47:24.993 183087 DEBUG nova.compute.manager [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Instance network_info: |[{"id": "2fe37e18-bc67-418d-b6bb-db4ed40f6645", "address": "fa:16:3e:80:c5:e2", "network": {"id": "4a023433-18d0-4d94-b2a4-84ce73a46ce0", "bridge": "br-int", "label": "tempest-internal-dns-test-shared-network-1072359363", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5cb899ce02444af2a1e102e390417350", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fe37e18-bc", "ovs_interfaceid": "2fe37e18-bc67-418d-b6bb-db4ed40f6645", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:47:24 compute-1 nova_compute[183083]: 2026-01-26 08:47:24.994 183087 DEBUG oslo_concurrency.lockutils [req-d197d813-84ec-4b88-891f-455d1d5902b5 req-4513d8d2-d120-4719-a30a-7e5e604f2662 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-5ea5b3e6-d6ee-4984-b938-32f34a5c3307" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:47:24 compute-1 nova_compute[183083]: 2026-01-26 08:47:24.995 183087 DEBUG nova.network.neutron [req-d197d813-84ec-4b88-891f-455d1d5902b5 req-4513d8d2-d120-4719-a30a-7e5e604f2662 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Refreshing network info cache for port 2fe37e18-bc67-418d-b6bb-db4ed40f6645 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.000 183087 DEBUG nova.virt.libvirt.driver [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Start _get_guest_xml network_info=[{"id": "2fe37e18-bc67-418d-b6bb-db4ed40f6645", "address": "fa:16:3e:80:c5:e2", "network": {"id": "4a023433-18d0-4d94-b2a4-84ce73a46ce0", "bridge": "br-int", "label": "tempest-internal-dns-test-shared-network-1072359363", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5cb899ce02444af2a1e102e390417350", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fe37e18-bc", "ovs_interfaceid": "2fe37e18-bc67-418d-b6bb-db4ed40f6645", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T08:43:05Z,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aa88264c487a43b2855a58eb0dd042c9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T08:43:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encryption_options': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.013 183087 WARNING nova.virt.libvirt.driver [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.023 183087 DEBUG nova.virt.libvirt.host [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.024 183087 DEBUG nova.virt.libvirt.host [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.029 183087 DEBUG nova.virt.libvirt.host [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.030 183087 DEBUG nova.virt.libvirt.host [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.031 183087 DEBUG nova.virt.libvirt.driver [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.031 183087 DEBUG nova.virt.hardware [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T08:43:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a7017323-cd35-470d-a718-faa6c6e97277',id=2,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T08:43:05Z,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aa88264c487a43b2855a58eb0dd042c9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T08:43:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.032 183087 DEBUG nova.virt.hardware [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.033 183087 DEBUG nova.virt.hardware [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.033 183087 DEBUG nova.virt.hardware [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.034 183087 DEBUG nova.virt.hardware [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.035 183087 DEBUG nova.virt.hardware [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.035 183087 DEBUG nova.virt.hardware [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.036 183087 DEBUG nova.virt.hardware [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.037 183087 DEBUG nova.virt.hardware [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.038 183087 DEBUG nova.virt.hardware [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.039 183087 DEBUG nova.virt.hardware [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.046 183087 DEBUG nova.virt.libvirt.vif [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:47:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-internal-dns-test-vm-147244364',display_name='tempest-internal-dns-test-vm-147244364',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-internal-dns-test-vm-147244364',id=23,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNwasPBLZFj5TPdNWQK0zigHHODYc4f/t8DVnPh2+8sHe1pRQgl5SU5AsNYCK+hfXouKyyxOmTCGDhQxnfKOl2CR4mUnhy56u5TB5HoFqRICBu27dyxnXxwslEVaAoSfxw==',key_name='tempest-internal-dns-test-shared-keypair-1237484570',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5cb899ce02444af2a1e102e390417350',ramdisk_id='',reservation_id='r-ibp6cyse',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InternalDNSTestOvn-655560763',owner_user_name='tempest-InternalDNSTestOvn-655560763-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:47:22Z,user_data=None,user_id='6b6bdb69392c4caba079d21935c2fe08',uuid=5ea5b3e6-d6ee-4984-b938-32f34a5c3307,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2fe37e18-bc67-418d-b6bb-db4ed40f6645", "address": "fa:16:3e:80:c5:e2", "network": {"id": "4a023433-18d0-4d94-b2a4-84ce73a46ce0", "bridge": "br-int", "label": "tempest-internal-dns-test-shared-network-1072359363", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "5cb899ce02444af2a1e102e390417350", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fe37e18-bc", "ovs_interfaceid": "2fe37e18-bc67-418d-b6bb-db4ed40f6645", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.047 183087 DEBUG nova.network.os_vif_util [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Converting VIF {"id": "2fe37e18-bc67-418d-b6bb-db4ed40f6645", "address": "fa:16:3e:80:c5:e2", "network": {"id": "4a023433-18d0-4d94-b2a4-84ce73a46ce0", "bridge": "br-int", "label": "tempest-internal-dns-test-shared-network-1072359363", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5cb899ce02444af2a1e102e390417350", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fe37e18-bc", "ovs_interfaceid": "2fe37e18-bc67-418d-b6bb-db4ed40f6645", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.049 183087 DEBUG nova.network.os_vif_util [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:c5:e2,bridge_name='br-int',has_traffic_filtering=True,id=2fe37e18-bc67-418d-b6bb-db4ed40f6645,network=Network(4a023433-18d0-4d94-b2a4-84ce73a46ce0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap2fe37e18-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.050 183087 DEBUG nova.objects.instance [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5ea5b3e6-d6ee-4984-b938-32f34a5c3307 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.067 183087 DEBUG nova.virt.libvirt.driver [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] End _get_guest_xml xml=<domain type="kvm">
Jan 26 08:47:25 compute-1 nova_compute[183083]:   <uuid>5ea5b3e6-d6ee-4984-b938-32f34a5c3307</uuid>
Jan 26 08:47:25 compute-1 nova_compute[183083]:   <name>instance-00000017</name>
Jan 26 08:47:25 compute-1 nova_compute[183083]:   <memory>131072</memory>
Jan 26 08:47:25 compute-1 nova_compute[183083]:   <vcpu>1</vcpu>
Jan 26 08:47:25 compute-1 nova_compute[183083]:   <metadata>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 08:47:25 compute-1 nova_compute[183083]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:       <nova:name>tempest-internal-dns-test-vm-147244364</nova:name>
Jan 26 08:47:25 compute-1 nova_compute[183083]:       <nova:creationTime>2026-01-26 08:47:25</nova:creationTime>
Jan 26 08:47:25 compute-1 nova_compute[183083]:       <nova:flavor name="m1.nano">
Jan 26 08:47:25 compute-1 nova_compute[183083]:         <nova:memory>128</nova:memory>
Jan 26 08:47:25 compute-1 nova_compute[183083]:         <nova:disk>1</nova:disk>
Jan 26 08:47:25 compute-1 nova_compute[183083]:         <nova:swap>0</nova:swap>
Jan 26 08:47:25 compute-1 nova_compute[183083]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 08:47:25 compute-1 nova_compute[183083]:         <nova:vcpus>1</nova:vcpus>
Jan 26 08:47:25 compute-1 nova_compute[183083]:       </nova:flavor>
Jan 26 08:47:25 compute-1 nova_compute[183083]:       <nova:owner>
Jan 26 08:47:25 compute-1 nova_compute[183083]:         <nova:user uuid="6b6bdb69392c4caba079d21935c2fe08">tempest-InternalDNSTestOvn-655560763-project-member</nova:user>
Jan 26 08:47:25 compute-1 nova_compute[183083]:         <nova:project uuid="5cb899ce02444af2a1e102e390417350">tempest-InternalDNSTestOvn-655560763</nova:project>
Jan 26 08:47:25 compute-1 nova_compute[183083]:       </nova:owner>
Jan 26 08:47:25 compute-1 nova_compute[183083]:       <nova:root type="image" uuid="13d1a20a-8003-4f19-aba7-ccbd9eff9b82"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:       <nova:ports>
Jan 26 08:47:25 compute-1 nova_compute[183083]:         <nova:port uuid="2fe37e18-bc67-418d-b6bb-db4ed40f6645">
Jan 26 08:47:25 compute-1 nova_compute[183083]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:         </nova:port>
Jan 26 08:47:25 compute-1 nova_compute[183083]:       </nova:ports>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     </nova:instance>
Jan 26 08:47:25 compute-1 nova_compute[183083]:   </metadata>
Jan 26 08:47:25 compute-1 nova_compute[183083]:   <sysinfo type="smbios">
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <system>
Jan 26 08:47:25 compute-1 nova_compute[183083]:       <entry name="manufacturer">RDO</entry>
Jan 26 08:47:25 compute-1 nova_compute[183083]:       <entry name="product">OpenStack Compute</entry>
Jan 26 08:47:25 compute-1 nova_compute[183083]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 08:47:25 compute-1 nova_compute[183083]:       <entry name="serial">5ea5b3e6-d6ee-4984-b938-32f34a5c3307</entry>
Jan 26 08:47:25 compute-1 nova_compute[183083]:       <entry name="uuid">5ea5b3e6-d6ee-4984-b938-32f34a5c3307</entry>
Jan 26 08:47:25 compute-1 nova_compute[183083]:       <entry name="family">Virtual Machine</entry>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     </system>
Jan 26 08:47:25 compute-1 nova_compute[183083]:   </sysinfo>
Jan 26 08:47:25 compute-1 nova_compute[183083]:   <os>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <boot dev="hd"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <smbios mode="sysinfo"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:   </os>
Jan 26 08:47:25 compute-1 nova_compute[183083]:   <features>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <acpi/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <apic/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <vmcoreinfo/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:   </features>
Jan 26 08:47:25 compute-1 nova_compute[183083]:   <clock offset="utc">
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <timer name="hpet" present="no"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:   </clock>
Jan 26 08:47:25 compute-1 nova_compute[183083]:   <cpu mode="host-model" match="exact">
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:   </cpu>
Jan 26 08:47:25 compute-1 nova_compute[183083]:   <devices>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <disk type="file" device="disk">
Jan 26 08:47:25 compute-1 nova_compute[183083]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/5ea5b3e6-d6ee-4984-b938-32f34a5c3307/disk"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:       <target dev="vda" bus="virtio"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     </disk>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <disk type="file" device="cdrom">
Jan 26 08:47:25 compute-1 nova_compute[183083]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/5ea5b3e6-d6ee-4984-b938-32f34a5c3307/disk.config"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:       <target dev="sda" bus="sata"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     </disk>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <interface type="ethernet">
Jan 26 08:47:25 compute-1 nova_compute[183083]:       <mac address="fa:16:3e:80:c5:e2"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:       <mtu size="1342"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:       <target dev="tap2fe37e18-bc"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     </interface>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <serial type="pty">
Jan 26 08:47:25 compute-1 nova_compute[183083]:       <log file="/var/lib/nova/instances/5ea5b3e6-d6ee-4984-b938-32f34a5c3307/console.log" append="off"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     </serial>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <video>
Jan 26 08:47:25 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     </video>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <input type="tablet" bus="usb"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <rng model="virtio">
Jan 26 08:47:25 compute-1 nova_compute[183083]:       <backend model="random">/dev/urandom</backend>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     </rng>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <controller type="usb" index="0"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     <memballoon model="virtio">
Jan 26 08:47:25 compute-1 nova_compute[183083]:       <stats period="10"/>
Jan 26 08:47:25 compute-1 nova_compute[183083]:     </memballoon>
Jan 26 08:47:25 compute-1 nova_compute[183083]:   </devices>
Jan 26 08:47:25 compute-1 nova_compute[183083]: </domain>
Jan 26 08:47:25 compute-1 nova_compute[183083]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.070 183087 DEBUG nova.compute.manager [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Preparing to wait for external event network-vif-plugged-2fe37e18-bc67-418d-b6bb-db4ed40f6645 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.071 183087 DEBUG oslo_concurrency.lockutils [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Acquiring lock "5ea5b3e6-d6ee-4984-b938-32f34a5c3307-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.071 183087 DEBUG oslo_concurrency.lockutils [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Lock "5ea5b3e6-d6ee-4984-b938-32f34a5c3307-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.071 183087 DEBUG oslo_concurrency.lockutils [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Lock "5ea5b3e6-d6ee-4984-b938-32f34a5c3307-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.072 183087 DEBUG nova.virt.libvirt.vif [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:47:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-internal-dns-test-vm-147244364',display_name='tempest-internal-dns-test-vm-147244364',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-internal-dns-test-vm-147244364',id=23,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNwasPBLZFj5TPdNWQK0zigHHODYc4f/t8DVnPh2+8sHe1pRQgl5SU5AsNYCK+hfXouKyyxOmTCGDhQxnfKOl2CR4mUnhy56u5TB5HoFqRICBu27dyxnXxwslEVaAoSfxw==',key_name='tempest-internal-dns-test-shared-keypair-1237484570',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5cb899ce02444af2a1e102e390417350',ramdisk_id='',reservation_id='r-ibp6cyse',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InternalDNSTestOvn-655560763',owner_user_name='tempest-InternalDNSTestOvn-655560763-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:47:22Z,user_data=None,user_id='6b6bdb69392c4caba079d21935c2fe08',uuid=5ea5b3e6-d6ee-4984-b938-32f34a5c3307,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2fe37e18-bc67-418d-b6bb-db4ed40f6645", "address": "fa:16:3e:80:c5:e2", "network": {"id": "4a023433-18d0-4d94-b2a4-84ce73a46ce0", "bridge": "br-int", "label": "tempest-internal-dns-test-shared-network-1072359363", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5cb899ce02444af2a1e102e390417350", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fe37e18-bc", "ovs_interfaceid": "2fe37e18-bc67-418d-b6bb-db4ed40f6645", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.073 183087 DEBUG nova.network.os_vif_util [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Converting VIF {"id": "2fe37e18-bc67-418d-b6bb-db4ed40f6645", "address": "fa:16:3e:80:c5:e2", "network": {"id": "4a023433-18d0-4d94-b2a4-84ce73a46ce0", "bridge": "br-int", "label": "tempest-internal-dns-test-shared-network-1072359363", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5cb899ce02444af2a1e102e390417350", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fe37e18-bc", "ovs_interfaceid": "2fe37e18-bc67-418d-b6bb-db4ed40f6645", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.073 183087 DEBUG nova.network.os_vif_util [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:c5:e2,bridge_name='br-int',has_traffic_filtering=True,id=2fe37e18-bc67-418d-b6bb-db4ed40f6645,network=Network(4a023433-18d0-4d94-b2a4-84ce73a46ce0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap2fe37e18-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.074 183087 DEBUG os_vif [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:c5:e2,bridge_name='br-int',has_traffic_filtering=True,id=2fe37e18-bc67-418d-b6bb-db4ed40f6645,network=Network(4a023433-18d0-4d94-b2a4-84ce73a46ce0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap2fe37e18-bc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.075 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.075 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.076 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.079 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.079 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2fe37e18-bc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.080 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2fe37e18-bc, col_values=(('external_ids', {'iface-id': '2fe37e18-bc67-418d-b6bb-db4ed40f6645', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:80:c5:e2', 'vm-uuid': '5ea5b3e6-d6ee-4984-b938-32f34a5c3307'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.082 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:25 compute-1 NetworkManager[55451]: <info>  [1769417245.0839] manager: (tap2fe37e18-bc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.089 183087 DEBUG oslo_concurrency.lockutils [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.008s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] Instance failed to spawn: nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] Traceback (most recent call last):
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 170, in do_image_deep_inspection
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb]     raise exception.ImageUnacceptable(
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image content does not match disk_format
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] 
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] During handling of the above exception, another exception occurred:
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] 
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] Traceback (most recent call last):
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb]     yield resources
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb]     self.driver.spawn(context, instance, image_meta,
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4384, in spawn
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb]     created_instance_dir, created_disks = self._create_image(
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4786, in _create_image
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb]     created_disks = self._create_and_inject_local_root(
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4917, in _create_and_inject_local_root
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb]     self._try_fetch_image_cache(backend, fetch_func, context,
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 10963, in _try_fetch_image_cache
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb]     image.cache(fetch_func=fetch_func,
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 288, in cache
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb]     self.create_image(fetch_func_sync, base, size,
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 663, in create_image
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb]     prepare_template(target=base, *args, **kwargs)
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb]   File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb]     return f(*args, **kwargs)
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 285, in fetch_func_sync
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb]     fetch_func(target=target, *args, **kwargs)
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/utils.py", line 482, in fetch_image
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb]     images.fetch_to_raw(context, image_id, target, trusted_certs)
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 199, in fetch_to_raw
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb]     force_format = do_image_deep_inspection(img, image_href, path_tmp)
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 180, in do_image_deep_inspection
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb]     raise exception.ImageUnacceptable(
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.091 183087 ERROR nova.compute.manager [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] 
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.093 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.094 183087 INFO os_vif [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:c5:e2,bridge_name='br-int',has_traffic_filtering=True,id=2fe37e18-bc67-418d-b6bb-db4ed40f6645,network=Network(4a023433-18d0-4d94-b2a4-84ce73a46ce0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap2fe37e18-bc')
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.163 183087 DEBUG nova.virt.libvirt.driver [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.163 183087 DEBUG nova.virt.libvirt.driver [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.164 183087 DEBUG nova.virt.libvirt.driver [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] No VIF found with MAC fa:16:3e:80:c5:e2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 26 08:47:25 compute-1 nova_compute[183083]: 2026-01-26 08:47:25.165 183087 INFO nova.virt.libvirt.driver [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Using config drive
Jan 26 08:47:25 compute-1 podman[214416]: 2026-01-26 08:47:25.238317939 +0000 UTC m=+0.094111491 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 08:47:25 compute-1 podman[214415]: 2026-01-26 08:47:25.274136205 +0000 UTC m=+0.136539784 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Jan 26 08:47:26 compute-1 nova_compute[183083]: 2026-01-26 08:47:26.144 183087 INFO nova.virt.libvirt.driver [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Creating config drive at /var/lib/nova/instances/5ea5b3e6-d6ee-4984-b938-32f34a5c3307/disk.config
Jan 26 08:47:26 compute-1 nova_compute[183083]: 2026-01-26 08:47:26.153 183087 DEBUG oslo_concurrency.processutils [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5ea5b3e6-d6ee-4984-b938-32f34a5c3307/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3_s13jxm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:47:26 compute-1 nova_compute[183083]: 2026-01-26 08:47:26.299 183087 DEBUG nova.network.neutron [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] Successfully created port: 5932b648-47f4-4559-9fda-67b22b6f1540 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 08:47:26 compute-1 nova_compute[183083]: 2026-01-26 08:47:26.305 183087 DEBUG oslo_concurrency.processutils [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5ea5b3e6-d6ee-4984-b938-32f34a5c3307/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3_s13jxm" returned: 0 in 0.152s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:47:26 compute-1 kernel: tap2fe37e18-bc: entered promiscuous mode
Jan 26 08:47:26 compute-1 nova_compute[183083]: 2026-01-26 08:47:26.394 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:26 compute-1 NetworkManager[55451]: <info>  [1769417246.3968] manager: (tap2fe37e18-bc): new Tun device (/org/freedesktop/NetworkManager/Devices/53)
Jan 26 08:47:26 compute-1 ovn_controller[95352]: 2026-01-26T08:47:26Z|00124|binding|INFO|Claiming lport 2fe37e18-bc67-418d-b6bb-db4ed40f6645 for this chassis.
Jan 26 08:47:26 compute-1 ovn_controller[95352]: 2026-01-26T08:47:26Z|00125|binding|INFO|2fe37e18-bc67-418d-b6bb-db4ed40f6645: Claiming fa:16:3e:80:c5:e2 10.100.0.3
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:26.408 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:c5:e2 10.100.0.3'], port_security=['fa:16:3e:80:c5:e2 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-internal-dns-test-port-1268661230', 'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '5ea5b3e6-d6ee-4984-b938-32f34a5c3307', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a023433-18d0-4d94-b2a4-84ce73a46ce0', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-internal-dns-test-port-1268661230', 'neutron:project_id': '5cb899ce02444af2a1e102e390417350', 'neutron:revision_number': '3', 'neutron:security_group_ids': 'af0d8f5a-e36f-4fde-9724-59b4ac44631c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ba9abea6-197c-4ec3-be8d-b9829bf5806a, chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=2fe37e18-bc67-418d-b6bb-db4ed40f6645) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:26.410 104632 INFO neutron.agent.ovn.metadata.agent [-] Port 2fe37e18-bc67-418d-b6bb-db4ed40f6645 in datapath 4a023433-18d0-4d94-b2a4-84ce73a46ce0 bound to our chassis
Jan 26 08:47:26 compute-1 nova_compute[183083]: 2026-01-26 08:47:26.413 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:26.415 104632 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4a023433-18d0-4d94-b2a4-84ce73a46ce0
Jan 26 08:47:26 compute-1 ovn_controller[95352]: 2026-01-26T08:47:26Z|00126|binding|INFO|Setting lport 2fe37e18-bc67-418d-b6bb-db4ed40f6645 ovn-installed in OVS
Jan 26 08:47:26 compute-1 ovn_controller[95352]: 2026-01-26T08:47:26Z|00127|binding|INFO|Setting lport 2fe37e18-bc67-418d-b6bb-db4ed40f6645 up in Southbound
Jan 26 08:47:26 compute-1 nova_compute[183083]: 2026-01-26 08:47:26.418 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:26 compute-1 nova_compute[183083]: 2026-01-26 08:47:26.423 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:26.439 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[fa10adf2-2047-45b1-abe0-c631be385a4c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:26.440 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4a023433-11 in ovnmeta-4a023433-18d0-4d94-b2a4-84ce73a46ce0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:26.443 212483 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4a023433-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:26.443 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[8bb05d8e-554f-46aa-92d2-b2fd22a859e5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:26.445 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[0fa67869-0493-41d3-bc93-5ba59245a681]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:26 compute-1 systemd-machined[154360]: New machine qemu-6-instance-00000017.
Jan 26 08:47:26 compute-1 systemd-udevd[214482]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:26.470 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[df1d20fd-6697-46bd-96d0-014f0e6087b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:26 compute-1 systemd[1]: Started Virtual Machine qemu-6-instance-00000017.
Jan 26 08:47:26 compute-1 NetworkManager[55451]: <info>  [1769417246.4843] device (tap2fe37e18-bc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 08:47:26 compute-1 NetworkManager[55451]: <info>  [1769417246.4857] device (tap2fe37e18-bc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:26.505 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[99898150-ef97-4a9a-9755-c6790bd3bfd4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:26.549 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[cf26d776-a7fe-4c54-973c-3a14df0e4e54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:26.556 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[66d1f3fb-da9a-4a39-8568-3a3f8326275b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:26 compute-1 NetworkManager[55451]: <info>  [1769417246.5578] manager: (tap4a023433-10): new Veth device (/org/freedesktop/NetworkManager/Devices/54)
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:26.606 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[ae7c7ebf-0feb-44fa-90e3-d52730e7e3a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:26.610 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[45a626d2-6d24-44f1-aa4c-6f39a434ae87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:26 compute-1 NetworkManager[55451]: <info>  [1769417246.6443] device (tap4a023433-10): carrier: link connected
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:26.651 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[a65e3d0c-e7eb-44ef-bd63-bca204e8c4d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:26.677 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[2a7b6da0-b6da-4614-9dc0-464a886f52f5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4a023433-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:f9:aa'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358724, 'reachable_time': 21881, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214514, 'error': None, 'target': 'ovnmeta-4a023433-18d0-4d94-b2a4-84ce73a46ce0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:26.691 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[c08397ec-c83e-405a-aac8-15bae7fbf260]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe09:f9aa'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 358724, 'tstamp': 358724}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214515, 'error': None, 'target': 'ovnmeta-4a023433-18d0-4d94-b2a4-84ce73a46ce0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:26.721 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[1cc94ccd-9657-48d6-b741-323cfd338ddd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4a023433-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:f9:aa'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358724, 'reachable_time': 21881, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214516, 'error': None, 'target': 'ovnmeta-4a023433-18d0-4d94-b2a4-84ce73a46ce0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:26.772 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[d63b41a5-5ab9-4006-b7ad-1cba1aaf16e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:26.853 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[8fcef986-65eb-4ec2-a04a-03e5e1cd5f9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:26.855 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a023433-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:26.855 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:26.855 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4a023433-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:47:26 compute-1 NetworkManager[55451]: <info>  [1769417246.8584] manager: (tap4a023433-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Jan 26 08:47:26 compute-1 kernel: tap4a023433-10: entered promiscuous mode
Jan 26 08:47:26 compute-1 nova_compute[183083]: 2026-01-26 08:47:26.863 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:26.864 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4a023433-10, col_values=(('external_ids', {'iface-id': '69612433-eca7-46c4-8771-8fe67d0df630'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:47:26 compute-1 ovn_controller[95352]: 2026-01-26T08:47:26Z|00128|binding|INFO|Releasing lport 69612433-eca7-46c4-8771-8fe67d0df630 from this chassis (sb_readonly=0)
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:26.874 104632 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4a023433-18d0-4d94-b2a4-84ce73a46ce0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4a023433-18d0-4d94-b2a4-84ce73a46ce0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:26.876 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[233f1150-40ee-4cb0-8e17-1bd04fdae94b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:26.877 104632 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]: global
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]:     log         /dev/log local0 debug
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]:     log-tag     haproxy-metadata-proxy-4a023433-18d0-4d94-b2a4-84ce73a46ce0
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]:     user        root
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]:     group       root
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]:     maxconn     1024
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]:     pidfile     /var/lib/neutron/external/pids/4a023433-18d0-4d94-b2a4-84ce73a46ce0.pid.haproxy
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]:     daemon
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]: defaults
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]:     log global
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]:     mode http
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]:     option httplog
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]:     option dontlognull
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]:     option http-server-close
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]:     option forwardfor
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]:     retries                 3
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]:     timeout http-request    30s
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]:     timeout connect         30s
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]:     timeout client          32s
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]:     timeout server          32s
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]:     timeout http-keep-alive 30s
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]: listen listener
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]:     bind 169.254.169.254:80
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]:     http-request add-header X-OVN-Network-ID 4a023433-18d0-4d94-b2a4-84ce73a46ce0
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 26 08:47:26 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:26.880 104632 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4a023433-18d0-4d94-b2a4-84ce73a46ce0', 'env', 'PROCESS_TAG=haproxy-4a023433-18d0-4d94-b2a4-84ce73a46ce0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4a023433-18d0-4d94-b2a4-84ce73a46ce0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 26 08:47:26 compute-1 nova_compute[183083]: 2026-01-26 08:47:26.882 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:26 compute-1 nova_compute[183083]: 2026-01-26 08:47:26.977 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417246.976309, 5ea5b3e6-d6ee-4984-b938-32f34a5c3307 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:47:26 compute-1 nova_compute[183083]: 2026-01-26 08:47:26.978 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] VM Started (Lifecycle Event)
Jan 26 08:47:27 compute-1 nova_compute[183083]: 2026-01-26 08:47:27.002 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:47:27 compute-1 nova_compute[183083]: 2026-01-26 08:47:27.007 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417246.9766629, 5ea5b3e6-d6ee-4984-b938-32f34a5c3307 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:47:27 compute-1 nova_compute[183083]: 2026-01-26 08:47:27.008 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] VM Paused (Lifecycle Event)
Jan 26 08:47:27 compute-1 nova_compute[183083]: 2026-01-26 08:47:27.010 183087 DEBUG nova.network.neutron [req-d197d813-84ec-4b88-891f-455d1d5902b5 req-4513d8d2-d120-4719-a30a-7e5e604f2662 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Updated VIF entry in instance network info cache for port 2fe37e18-bc67-418d-b6bb-db4ed40f6645. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:47:27 compute-1 nova_compute[183083]: 2026-01-26 08:47:27.011 183087 DEBUG nova.network.neutron [req-d197d813-84ec-4b88-891f-455d1d5902b5 req-4513d8d2-d120-4719-a30a-7e5e604f2662 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Updating instance_info_cache with network_info: [{"id": "2fe37e18-bc67-418d-b6bb-db4ed40f6645", "address": "fa:16:3e:80:c5:e2", "network": {"id": "4a023433-18d0-4d94-b2a4-84ce73a46ce0", "bridge": "br-int", "label": "tempest-internal-dns-test-shared-network-1072359363", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5cb899ce02444af2a1e102e390417350", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fe37e18-bc", "ovs_interfaceid": "2fe37e18-bc67-418d-b6bb-db4ed40f6645", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:47:27 compute-1 nova_compute[183083]: 2026-01-26 08:47:27.031 183087 DEBUG oslo_concurrency.lockutils [req-d197d813-84ec-4b88-891f-455d1d5902b5 req-4513d8d2-d120-4719-a30a-7e5e604f2662 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-5ea5b3e6-d6ee-4984-b938-32f34a5c3307" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:47:27 compute-1 nova_compute[183083]: 2026-01-26 08:47:27.034 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:47:27 compute-1 nova_compute[183083]: 2026-01-26 08:47:27.039 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 08:47:27 compute-1 nova_compute[183083]: 2026-01-26 08:47:27.058 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 08:47:27 compute-1 nova_compute[183083]: 2026-01-26 08:47:27.319 183087 DEBUG nova.network.neutron [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] Successfully updated port: 5932b648-47f4-4559-9fda-67b22b6f1540 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:47:27 compute-1 podman[214555]: 2026-01-26 08:47:27.3244719 +0000 UTC m=+0.077704576 container create 05e03eb6feac7543d278786e8fb3f24786f5f6a1f1d77223817078bc39271957 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a023433-18d0-4d94-b2a4-84ce73a46ce0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 26 08:47:27 compute-1 nova_compute[183083]: 2026-01-26 08:47:27.336 183087 DEBUG oslo_concurrency.lockutils [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Acquiring lock "refresh_cache-b5659d91-1ee7-4f5f-a30e-2aa14d866ceb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:47:27 compute-1 nova_compute[183083]: 2026-01-26 08:47:27.337 183087 DEBUG oslo_concurrency.lockutils [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Acquired lock "refresh_cache-b5659d91-1ee7-4f5f-a30e-2aa14d866ceb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:47:27 compute-1 nova_compute[183083]: 2026-01-26 08:47:27.337 183087 DEBUG nova.network.neutron [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:47:27 compute-1 systemd[1]: Started libpod-conmon-05e03eb6feac7543d278786e8fb3f24786f5f6a1f1d77223817078bc39271957.scope.
Jan 26 08:47:27 compute-1 podman[214555]: 2026-01-26 08:47:27.286991597 +0000 UTC m=+0.040224313 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 08:47:27 compute-1 systemd[1]: Started libcrun container.
Jan 26 08:47:27 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4691f3920eaae2c58c6535349688ac71bad6defb9eea29f6c76ef492147dd356/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 08:47:27 compute-1 podman[214555]: 2026-01-26 08:47:27.437695043 +0000 UTC m=+0.190927689 container init 05e03eb6feac7543d278786e8fb3f24786f5f6a1f1d77223817078bc39271957 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a023433-18d0-4d94-b2a4-84ce73a46ce0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 26 08:47:27 compute-1 podman[214555]: 2026-01-26 08:47:27.444101694 +0000 UTC m=+0.197334330 container start 05e03eb6feac7543d278786e8fb3f24786f5f6a1f1d77223817078bc39271957 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a023433-18d0-4d94-b2a4-84ce73a46ce0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 26 08:47:27 compute-1 neutron-haproxy-ovnmeta-4a023433-18d0-4d94-b2a4-84ce73a46ce0[214571]: [NOTICE]   (214588) : New worker (214592) forked
Jan 26 08:47:27 compute-1 neutron-haproxy-ovnmeta-4a023433-18d0-4d94-b2a4-84ce73a46ce0[214571]: [NOTICE]   (214588) : Loading success.
Jan 26 08:47:27 compute-1 nova_compute[183083]: 2026-01-26 08:47:27.470 183087 DEBUG nova.network.neutron [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:47:27 compute-1 podman[214568]: 2026-01-26 08:47:27.489701528 +0000 UTC m=+0.100351018 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 26 08:47:27 compute-1 nova_compute[183083]: 2026-01-26 08:47:27.592 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.267 183087 DEBUG nova.network.neutron [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] Updating instance_info_cache with network_info: [{"id": "5932b648-47f4-4559-9fda-67b22b6f1540", "address": "fa:16:3e:c7:4c:07", "network": {"id": "ada33edd-017f-4cf5-bc2e-db0c66214549", "bridge": "br-int", "label": "tempest-test-network--1775908954", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c33bd5e85114c868a4e91d997a5ceec", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5932b648-47", "ovs_interfaceid": "5932b648-47f4-4559-9fda-67b22b6f1540", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.292 183087 DEBUG oslo_concurrency.lockutils [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Releasing lock "refresh_cache-b5659d91-1ee7-4f5f-a30e-2aa14d866ceb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.293 183087 DEBUG nova.compute.manager [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] Instance network_info: |[{"id": "5932b648-47f4-4559-9fda-67b22b6f1540", "address": "fa:16:3e:c7:4c:07", "network": {"id": "ada33edd-017f-4cf5-bc2e-db0c66214549", "bridge": "br-int", "label": "tempest-test-network--1775908954", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c33bd5e85114c868a4e91d997a5ceec", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5932b648-47", "ovs_interfaceid": "5932b648-47f4-4559-9fda-67b22b6f1540", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.294 183087 INFO nova.compute.manager [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] Terminating instance
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.295 183087 DEBUG nova.compute.manager [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.301 183087 DEBUG nova.virt.libvirt.driver [-] [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.301 183087 INFO nova.virt.libvirt.driver [-] [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] Instance destroyed successfully.
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.302 183087 DEBUG nova.virt.libvirt.vif [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:47:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_idle_timeout_with_querier_enabled-1588371512',display_name='tempest-test_idle_timeout_with_querier_enabled-1588371512',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-idle-timeout-with-querier-enabled-1588371512',id=24,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJOJsa8zDc5tOBfBRLm0Qi812u7HVOO6E7MXAGpKZ4/7op/PkClYPLhHmkNgkuxZ09O67J1SPnUd6CxZdN+euoFRi16VYgHaqYzJwS9D1WUns9/BQk7M5SX/0drgiuby9w==',key_name='tempest-keypair-test-1151010750',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5c33bd5e85114c868a4e91d997a5ceec',ramdisk_id='',reservation_id='r-0n2sdizz',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestIPv4Ovn-635971062',owner_user_name='tempest-MulticastTestIPv4Ovn-635971062-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:47:23Z,user_data=None,user_id='e29b04a4a66d43aaa5e5c4f38eeb59c4',uuid=b5659d91-1ee7-4f5f-a30e-2aa14d866ceb,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5932b648-47f4-4559-9fda-67b22b6f1540", "address": "fa:16:3e:c7:4c:07", "network": {"id": "ada33edd-017f-4cf5-bc2e-db0c66214549", "bridge": "br-int", "label": "tempest-test-network--1775908954", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c33bd5e85114c868a4e91d997a5ceec", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5932b648-47", "ovs_interfaceid": "5932b648-47f4-4559-9fda-67b22b6f1540", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.303 183087 DEBUG nova.network.os_vif_util [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Converting VIF {"id": "5932b648-47f4-4559-9fda-67b22b6f1540", "address": "fa:16:3e:c7:4c:07", "network": {"id": "ada33edd-017f-4cf5-bc2e-db0c66214549", "bridge": "br-int", "label": "tempest-test-network--1775908954", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c33bd5e85114c868a4e91d997a5ceec", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5932b648-47", "ovs_interfaceid": "5932b648-47f4-4559-9fda-67b22b6f1540", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.304 183087 DEBUG nova.network.os_vif_util [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c7:4c:07,bridge_name='br-int',has_traffic_filtering=True,id=5932b648-47f4-4559-9fda-67b22b6f1540,network=Network(ada33edd-017f-4cf5-bc2e-db0c66214549),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5932b648-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.305 183087 DEBUG os_vif [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c7:4c:07,bridge_name='br-int',has_traffic_filtering=True,id=5932b648-47f4-4559-9fda-67b22b6f1540,network=Network(ada33edd-017f-4cf5-bc2e-db0c66214549),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5932b648-47') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.308 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.308 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5932b648-47, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.309 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.312 183087 INFO os_vif [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c7:4c:07,bridge_name='br-int',has_traffic_filtering=True,id=5932b648-47f4-4559-9fda-67b22b6f1540,network=Network(ada33edd-017f-4cf5-bc2e-db0c66214549),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5932b648-47')
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.313 183087 INFO nova.virt.libvirt.driver [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] Deleting instance files /var/lib/nova/instances/b5659d91-1ee7-4f5f-a30e-2aa14d866ceb_del
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.314 183087 INFO nova.virt.libvirt.driver [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] Deletion of /var/lib/nova/instances/b5659d91-1ee7-4f5f-a30e-2aa14d866ceb_del complete
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.389 183087 INFO nova.compute.manager [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] Took 0.09 seconds to destroy the instance on the hypervisor.
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.391 183087 DEBUG nova.compute.claims [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] Aborting claim: <nova.compute.claims.Claim object at 0x7f6c9840a790> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.392 183087 DEBUG oslo_concurrency.lockutils [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.392 183087 DEBUG oslo_concurrency.lockutils [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.529 183087 DEBUG nova.compute.provider_tree [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.554 183087 DEBUG nova.scheduler.client.report [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.581 183087 DEBUG oslo_concurrency.lockutils [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.583 183087 DEBUG nova.compute.utils [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.585 183087 ERROR nova.compute.manager [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] Build of instance b5659d91-1ee7-4f5f-a30e-2aa14d866ceb aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format: nova.exception.BuildAbortException: Build of instance b5659d91-1ee7-4f5f-a30e-2aa14d866ceb aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.586 183087 DEBUG nova.compute.manager [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.588 183087 DEBUG nova.virt.libvirt.vif [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-26T08:47:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_idle_timeout_with_querier_enabled-1588371512',display_name='tempest-test_idle_timeout_with_querier_enabled-1588371512',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host=None,hostname='tempest-test-idle-timeout-with-querier-enabled-1588371512',id=24,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJOJsa8zDc5tOBfBRLm0Qi812u7HVOO6E7MXAGpKZ4/7op/PkClYPLhHmkNgkuxZ09O67J1SPnUd6CxZdN+euoFRi16VYgHaqYzJwS9D1WUns9/BQk7M5SX/0drgiuby9w==',key_name='tempest-keypair-test-1151010750',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node=None,numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5c33bd5e85114c868a4e91d997a5ceec',ramdisk_id='',reservation_id='r-0n2sdizz',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestIPv4Ovn-635971062',owner_user_name='tempest-MulticastTestIPv4Ovn-635971062-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:47:28Z,user_data=None,user_id='e29b04a4a66d43aaa5e5c4f38eeb59c4',uuid=b5659d91-1ee7-4f5f-a30e-2aa14d866ceb,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5932b648-47f4-4559-9fda-67b22b6f1540", "address": "fa:16:3e:c7:4c:07", "network": {"id": "ada33edd-017f-4cf5-bc2e-db0c66214549", "bridge": "br-int", "label": "tempest-test-network--1775908954", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c33bd5e85114c868a4e91d997a5ceec", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5932b648-47", "ovs_interfaceid": "5932b648-47f4-4559-9fda-67b22b6f1540", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.588 183087 DEBUG nova.network.os_vif_util [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Converting VIF {"id": "5932b648-47f4-4559-9fda-67b22b6f1540", "address": "fa:16:3e:c7:4c:07", "network": {"id": "ada33edd-017f-4cf5-bc2e-db0c66214549", "bridge": "br-int", "label": "tempest-test-network--1775908954", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c33bd5e85114c868a4e91d997a5ceec", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5932b648-47", "ovs_interfaceid": "5932b648-47f4-4559-9fda-67b22b6f1540", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.590 183087 DEBUG nova.network.os_vif_util [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c7:4c:07,bridge_name='br-int',has_traffic_filtering=True,id=5932b648-47f4-4559-9fda-67b22b6f1540,network=Network(ada33edd-017f-4cf5-bc2e-db0c66214549),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5932b648-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.591 183087 DEBUG os_vif [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c7:4c:07,bridge_name='br-int',has_traffic_filtering=True,id=5932b648-47f4-4559-9fda-67b22b6f1540,network=Network(ada33edd-017f-4cf5-bc2e-db0c66214549),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5932b648-47') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.593 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.594 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5932b648-47, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.594 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.599 183087 INFO os_vif [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c7:4c:07,bridge_name='br-int',has_traffic_filtering=True,id=5932b648-47f4-4559-9fda-67b22b6f1540,network=Network(ada33edd-017f-4cf5-bc2e-db0c66214549),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5932b648-47')
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.600 183087 DEBUG nova.compute.manager [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.600 183087 DEBUG nova.compute.manager [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:47:28 compute-1 nova_compute[183083]: 2026-01-26 08:47:28.600 183087 DEBUG nova.network.neutron [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:47:29 compute-1 nova_compute[183083]: 2026-01-26 08:47:29.209 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:29 compute-1 nova_compute[183083]: 2026-01-26 08:47:29.376 183087 DEBUG nova.network.neutron [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:47:29 compute-1 nova_compute[183083]: 2026-01-26 08:47:29.426 183087 INFO nova.compute.manager [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: b5659d91-1ee7-4f5f-a30e-2aa14d866ceb] Took 0.83 seconds to deallocate network for instance.
Jan 26 08:47:29 compute-1 nova_compute[183083]: 2026-01-26 08:47:29.605 183087 INFO nova.scheduler.client.report [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Deleted allocations for instance b5659d91-1ee7-4f5f-a30e-2aa14d866ceb
Jan 26 08:47:29 compute-1 nova_compute[183083]: 2026-01-26 08:47:29.606 183087 DEBUG oslo_concurrency.lockutils [None req-dab20003-61d9-4c37-871b-5036b4568351 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "b5659d91-1ee7-4f5f-a30e-2aa14d866ceb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:29 compute-1 nova_compute[183083]: 2026-01-26 08:47:29.783 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:30 compute-1 nova_compute[183083]: 2026-01-26 08:47:30.082 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:30 compute-1 nova_compute[183083]: 2026-01-26 08:47:30.229 183087 DEBUG oslo_concurrency.lockutils [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "addae953-8eb4-46ed-959d-3c2bb6b31ee3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:30 compute-1 nova_compute[183083]: 2026-01-26 08:47:30.230 183087 DEBUG oslo_concurrency.lockutils [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "addae953-8eb4-46ed-959d-3c2bb6b31ee3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:30 compute-1 nova_compute[183083]: 2026-01-26 08:47:30.254 183087 DEBUG nova.compute.manager [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:47:30 compute-1 nova_compute[183083]: 2026-01-26 08:47:30.349 183087 DEBUG oslo_concurrency.lockutils [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:30 compute-1 nova_compute[183083]: 2026-01-26 08:47:30.350 183087 DEBUG oslo_concurrency.lockutils [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:30 compute-1 nova_compute[183083]: 2026-01-26 08:47:30.358 183087 DEBUG nova.virt.hardware [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:47:30 compute-1 nova_compute[183083]: 2026-01-26 08:47:30.359 183087 INFO nova.compute.claims [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:47:30 compute-1 nova_compute[183083]: 2026-01-26 08:47:30.510 183087 DEBUG nova.compute.provider_tree [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:47:30 compute-1 nova_compute[183083]: 2026-01-26 08:47:30.529 183087 DEBUG nova.scheduler.client.report [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:47:30 compute-1 nova_compute[183083]: 2026-01-26 08:47:30.553 183087 DEBUG oslo_concurrency.lockutils [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:30 compute-1 nova_compute[183083]: 2026-01-26 08:47:30.554 183087 DEBUG nova.compute.manager [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:47:30 compute-1 nova_compute[183083]: 2026-01-26 08:47:30.601 183087 DEBUG nova.compute.manager [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:47:30 compute-1 nova_compute[183083]: 2026-01-26 08:47:30.602 183087 DEBUG nova.network.neutron [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:47:30 compute-1 nova_compute[183083]: 2026-01-26 08:47:30.623 183087 INFO nova.virt.libvirt.driver [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:47:30 compute-1 nova_compute[183083]: 2026-01-26 08:47:30.647 183087 DEBUG nova.compute.manager [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:47:30 compute-1 nova_compute[183083]: 2026-01-26 08:47:30.745 183087 DEBUG nova.compute.manager [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:47:30 compute-1 nova_compute[183083]: 2026-01-26 08:47:30.747 183087 DEBUG nova.virt.libvirt.driver [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:47:30 compute-1 nova_compute[183083]: 2026-01-26 08:47:30.748 183087 INFO nova.virt.libvirt.driver [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Creating image(s)
Jan 26 08:47:30 compute-1 nova_compute[183083]: 2026-01-26 08:47:30.748 183087 DEBUG oslo_concurrency.lockutils [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "/var/lib/nova/instances/addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:30 compute-1 nova_compute[183083]: 2026-01-26 08:47:30.749 183087 DEBUG oslo_concurrency.lockutils [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "/var/lib/nova/instances/addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:30 compute-1 nova_compute[183083]: 2026-01-26 08:47:30.749 183087 DEBUG oslo_concurrency.lockutils [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "/var/lib/nova/instances/addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:30 compute-1 nova_compute[183083]: 2026-01-26 08:47:30.768 183087 DEBUG oslo_concurrency.processutils [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:47:30 compute-1 nova_compute[183083]: 2026-01-26 08:47:30.861 183087 DEBUG oslo_concurrency.processutils [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:47:30 compute-1 nova_compute[183083]: 2026-01-26 08:47:30.862 183087 DEBUG oslo_concurrency.lockutils [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:30 compute-1 nova_compute[183083]: 2026-01-26 08:47:30.862 183087 DEBUG oslo_concurrency.lockutils [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:30 compute-1 nova_compute[183083]: 2026-01-26 08:47:30.877 183087 DEBUG oslo_concurrency.processutils [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:47:30 compute-1 nova_compute[183083]: 2026-01-26 08:47:30.944 183087 DEBUG oslo_concurrency.processutils [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:47:30 compute-1 nova_compute[183083]: 2026-01-26 08:47:30.945 183087 DEBUG oslo_concurrency.processutils [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:47:30 compute-1 nova_compute[183083]: 2026-01-26 08:47:30.981 183087 DEBUG oslo_concurrency.processutils [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:47:30 compute-1 nova_compute[183083]: 2026-01-26 08:47:30.982 183087 DEBUG oslo_concurrency.lockutils [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:30 compute-1 nova_compute[183083]: 2026-01-26 08:47:30.982 183087 DEBUG oslo_concurrency.processutils [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:47:31 compute-1 nova_compute[183083]: 2026-01-26 08:47:31.002 183087 DEBUG nova.policy [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:47:31 compute-1 nova_compute[183083]: 2026-01-26 08:47:31.032 183087 DEBUG oslo_concurrency.processutils [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:47:31 compute-1 nova_compute[183083]: 2026-01-26 08:47:31.032 183087 DEBUG nova.virt.disk.api [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Checking if we can resize image /var/lib/nova/instances/addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 26 08:47:31 compute-1 nova_compute[183083]: 2026-01-26 08:47:31.033 183087 DEBUG oslo_concurrency.processutils [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:47:31 compute-1 nova_compute[183083]: 2026-01-26 08:47:31.088 183087 DEBUG oslo_concurrency.processutils [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:47:31 compute-1 nova_compute[183083]: 2026-01-26 08:47:31.089 183087 DEBUG nova.virt.disk.api [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Cannot resize image /var/lib/nova/instances/addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 26 08:47:31 compute-1 nova_compute[183083]: 2026-01-26 08:47:31.090 183087 DEBUG nova.objects.instance [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lazy-loading 'migration_context' on Instance uuid addae953-8eb4-46ed-959d-3c2bb6b31ee3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:47:31 compute-1 nova_compute[183083]: 2026-01-26 08:47:31.104 183087 DEBUG nova.virt.libvirt.driver [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 26 08:47:31 compute-1 nova_compute[183083]: 2026-01-26 08:47:31.105 183087 DEBUG nova.virt.libvirt.driver [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Ensure instance console log exists: /var/lib/nova/instances/addae953-8eb4-46ed-959d-3c2bb6b31ee3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 26 08:47:31 compute-1 nova_compute[183083]: 2026-01-26 08:47:31.105 183087 DEBUG oslo_concurrency.lockutils [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:31 compute-1 nova_compute[183083]: 2026-01-26 08:47:31.106 183087 DEBUG oslo_concurrency.lockutils [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:31 compute-1 nova_compute[183083]: 2026-01-26 08:47:31.106 183087 DEBUG oslo_concurrency.lockutils [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:32 compute-1 nova_compute[183083]: 2026-01-26 08:47:32.004 183087 DEBUG nova.network.neutron [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Successfully updated port: 09872a2c-368e-4dca-93d8-5fdd642b03b3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:47:32 compute-1 nova_compute[183083]: 2026-01-26 08:47:32.021 183087 DEBUG oslo_concurrency.lockutils [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "refresh_cache-addae953-8eb4-46ed-959d-3c2bb6b31ee3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:47:32 compute-1 nova_compute[183083]: 2026-01-26 08:47:32.021 183087 DEBUG oslo_concurrency.lockutils [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquired lock "refresh_cache-addae953-8eb4-46ed-959d-3c2bb6b31ee3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:47:32 compute-1 nova_compute[183083]: 2026-01-26 08:47:32.021 183087 DEBUG nova.network.neutron [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:47:32 compute-1 nova_compute[183083]: 2026-01-26 08:47:32.224 183087 DEBUG nova.network.neutron [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.103 183087 DEBUG nova.network.neutron [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Updating instance_info_cache with network_info: [{"id": "09872a2c-368e-4dca-93d8-5fdd642b03b3", "address": "fa:16:3e:90:46:e8", "network": {"id": "9006b908-3439-4b0e-b89f-6a6dbb60f4a7", "bridge": "br-int", "label": "tempest-test-network--519498246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09872a2c-36", "ovs_interfaceid": "09872a2c-368e-4dca-93d8-5fdd642b03b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.185 183087 DEBUG oslo_concurrency.lockutils [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Releasing lock "refresh_cache-addae953-8eb4-46ed-959d-3c2bb6b31ee3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.185 183087 DEBUG nova.compute.manager [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Instance network_info: |[{"id": "09872a2c-368e-4dca-93d8-5fdd642b03b3", "address": "fa:16:3e:90:46:e8", "network": {"id": "9006b908-3439-4b0e-b89f-6a6dbb60f4a7", "bridge": "br-int", "label": "tempest-test-network--519498246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09872a2c-36", "ovs_interfaceid": "09872a2c-368e-4dca-93d8-5fdd642b03b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.188 183087 DEBUG nova.virt.libvirt.driver [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Start _get_guest_xml network_info=[{"id": "09872a2c-368e-4dca-93d8-5fdd642b03b3", "address": "fa:16:3e:90:46:e8", "network": {"id": "9006b908-3439-4b0e-b89f-6a6dbb60f4a7", "bridge": "br-int", "label": "tempest-test-network--519498246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09872a2c-36", "ovs_interfaceid": "09872a2c-368e-4dca-93d8-5fdd642b03b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T08:43:05Z,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aa88264c487a43b2855a58eb0dd042c9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T08:43:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encryption_options': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.192 183087 WARNING nova.virt.libvirt.driver [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.199 183087 DEBUG nova.virt.libvirt.host [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.200 183087 DEBUG nova.virt.libvirt.host [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.207 183087 DEBUG nova.virt.libvirt.host [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.207 183087 DEBUG nova.virt.libvirt.host [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.207 183087 DEBUG nova.virt.libvirt.driver [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.208 183087 DEBUG nova.virt.hardware [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T08:43:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a7017323-cd35-470d-a718-faa6c6e97277',id=2,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T08:43:05Z,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aa88264c487a43b2855a58eb0dd042c9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T08:43:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.208 183087 DEBUG nova.virt.hardware [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.208 183087 DEBUG nova.virt.hardware [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.209 183087 DEBUG nova.virt.hardware [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.209 183087 DEBUG nova.virt.hardware [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.209 183087 DEBUG nova.virt.hardware [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.209 183087 DEBUG nova.virt.hardware [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.209 183087 DEBUG nova.virt.hardware [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.210 183087 DEBUG nova.virt.hardware [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.210 183087 DEBUG nova.virt.hardware [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.210 183087 DEBUG nova.virt.hardware [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.214 183087 DEBUG nova.virt.libvirt.vif [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:47:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1111872342',display_name='tempest-server-test-1111872342',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1111872342',id=25,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHR6EJ0M3L9T9j8F+dLuTeoyDdnWZCi8ic8NnDlm++GcyV15uFS8BCYLqsqkAatnDxNEdqPcffpohwDfIhW8BevW7ZqTxZGVxsZmagZkz7C/Tn9HOibVv/vjkHnqrvkfkA==',key_name='tempest-keypair-test-713690759',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4a559c36b13649d98b2995c099340eb9',ramdisk_id='',reservation_id='r-e1bdduph',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='
virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortSecurityTest-508365101',owner_user_name='tempest-PortSecurityTest-508365101-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:47:30Z,user_data=None,user_id='52d582094c584036ba3e04c9da69ee02',uuid=addae953-8eb4-46ed-959d-3c2bb6b31ee3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "09872a2c-368e-4dca-93d8-5fdd642b03b3", "address": "fa:16:3e:90:46:e8", "network": {"id": "9006b908-3439-4b0e-b89f-6a6dbb60f4a7", "bridge": "br-int", "label": "tempest-test-network--519498246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09872a2c-36", "ovs_interfaceid": "09872a2c-368e-4dca-93d8-5fdd642b03b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.214 183087 DEBUG nova.network.os_vif_util [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converting VIF {"id": "09872a2c-368e-4dca-93d8-5fdd642b03b3", "address": "fa:16:3e:90:46:e8", "network": {"id": "9006b908-3439-4b0e-b89f-6a6dbb60f4a7", "bridge": "br-int", "label": "tempest-test-network--519498246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09872a2c-36", "ovs_interfaceid": "09872a2c-368e-4dca-93d8-5fdd642b03b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.215 183087 DEBUG nova.network.os_vif_util [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:90:46:e8,bridge_name='br-int',has_traffic_filtering=True,id=09872a2c-368e-4dca-93d8-5fdd642b03b3,network=Network(9006b908-3439-4b0e-b89f-6a6dbb60f4a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap09872a2c-36') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.216 183087 DEBUG nova.objects.instance [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lazy-loading 'pci_devices' on Instance uuid addae953-8eb4-46ed-959d-3c2bb6b31ee3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.305 183087 DEBUG nova.virt.libvirt.driver [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] End _get_guest_xml xml=<domain type="kvm">
Jan 26 08:47:33 compute-1 nova_compute[183083]:   <uuid>addae953-8eb4-46ed-959d-3c2bb6b31ee3</uuid>
Jan 26 08:47:33 compute-1 nova_compute[183083]:   <name>instance-00000019</name>
Jan 26 08:47:33 compute-1 nova_compute[183083]:   <memory>131072</memory>
Jan 26 08:47:33 compute-1 nova_compute[183083]:   <vcpu>1</vcpu>
Jan 26 08:47:33 compute-1 nova_compute[183083]:   <metadata>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 08:47:33 compute-1 nova_compute[183083]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:       <nova:name>tempest-server-test-1111872342</nova:name>
Jan 26 08:47:33 compute-1 nova_compute[183083]:       <nova:creationTime>2026-01-26 08:47:33</nova:creationTime>
Jan 26 08:47:33 compute-1 nova_compute[183083]:       <nova:flavor name="m1.nano">
Jan 26 08:47:33 compute-1 nova_compute[183083]:         <nova:memory>128</nova:memory>
Jan 26 08:47:33 compute-1 nova_compute[183083]:         <nova:disk>1</nova:disk>
Jan 26 08:47:33 compute-1 nova_compute[183083]:         <nova:swap>0</nova:swap>
Jan 26 08:47:33 compute-1 nova_compute[183083]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 08:47:33 compute-1 nova_compute[183083]:         <nova:vcpus>1</nova:vcpus>
Jan 26 08:47:33 compute-1 nova_compute[183083]:       </nova:flavor>
Jan 26 08:47:33 compute-1 nova_compute[183083]:       <nova:owner>
Jan 26 08:47:33 compute-1 nova_compute[183083]:         <nova:user uuid="52d582094c584036ba3e04c9da69ee02">tempest-PortSecurityTest-508365101-project-member</nova:user>
Jan 26 08:47:33 compute-1 nova_compute[183083]:         <nova:project uuid="4a559c36b13649d98b2995c099340eb9">tempest-PortSecurityTest-508365101</nova:project>
Jan 26 08:47:33 compute-1 nova_compute[183083]:       </nova:owner>
Jan 26 08:47:33 compute-1 nova_compute[183083]:       <nova:root type="image" uuid="13d1a20a-8003-4f19-aba7-ccbd9eff9b82"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:       <nova:ports>
Jan 26 08:47:33 compute-1 nova_compute[183083]:         <nova:port uuid="09872a2c-368e-4dca-93d8-5fdd642b03b3">
Jan 26 08:47:33 compute-1 nova_compute[183083]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:         </nova:port>
Jan 26 08:47:33 compute-1 nova_compute[183083]:       </nova:ports>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     </nova:instance>
Jan 26 08:47:33 compute-1 nova_compute[183083]:   </metadata>
Jan 26 08:47:33 compute-1 nova_compute[183083]:   <sysinfo type="smbios">
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <system>
Jan 26 08:47:33 compute-1 nova_compute[183083]:       <entry name="manufacturer">RDO</entry>
Jan 26 08:47:33 compute-1 nova_compute[183083]:       <entry name="product">OpenStack Compute</entry>
Jan 26 08:47:33 compute-1 nova_compute[183083]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 08:47:33 compute-1 nova_compute[183083]:       <entry name="serial">addae953-8eb4-46ed-959d-3c2bb6b31ee3</entry>
Jan 26 08:47:33 compute-1 nova_compute[183083]:       <entry name="uuid">addae953-8eb4-46ed-959d-3c2bb6b31ee3</entry>
Jan 26 08:47:33 compute-1 nova_compute[183083]:       <entry name="family">Virtual Machine</entry>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     </system>
Jan 26 08:47:33 compute-1 nova_compute[183083]:   </sysinfo>
Jan 26 08:47:33 compute-1 nova_compute[183083]:   <os>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <boot dev="hd"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <smbios mode="sysinfo"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:   </os>
Jan 26 08:47:33 compute-1 nova_compute[183083]:   <features>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <acpi/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <apic/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <vmcoreinfo/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:   </features>
Jan 26 08:47:33 compute-1 nova_compute[183083]:   <clock offset="utc">
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <timer name="hpet" present="no"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:   </clock>
Jan 26 08:47:33 compute-1 nova_compute[183083]:   <cpu mode="host-model" match="exact">
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:   </cpu>
Jan 26 08:47:33 compute-1 nova_compute[183083]:   <devices>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <disk type="file" device="disk">
Jan 26 08:47:33 compute-1 nova_compute[183083]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:       <target dev="vda" bus="virtio"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     </disk>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <disk type="file" device="cdrom">
Jan 26 08:47:33 compute-1 nova_compute[183083]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk.config"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:       <target dev="sda" bus="sata"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     </disk>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <interface type="ethernet">
Jan 26 08:47:33 compute-1 nova_compute[183083]:       <mac address="fa:16:3e:90:46:e8"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:       <mtu size="1342"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:       <target dev="tap09872a2c-36"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     </interface>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <serial type="pty">
Jan 26 08:47:33 compute-1 nova_compute[183083]:       <log file="/var/lib/nova/instances/addae953-8eb4-46ed-959d-3c2bb6b31ee3/console.log" append="off"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     </serial>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <video>
Jan 26 08:47:33 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     </video>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <input type="tablet" bus="usb"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <rng model="virtio">
Jan 26 08:47:33 compute-1 nova_compute[183083]:       <backend model="random">/dev/urandom</backend>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     </rng>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <controller type="usb" index="0"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     <memballoon model="virtio">
Jan 26 08:47:33 compute-1 nova_compute[183083]:       <stats period="10"/>
Jan 26 08:47:33 compute-1 nova_compute[183083]:     </memballoon>
Jan 26 08:47:33 compute-1 nova_compute[183083]:   </devices>
Jan 26 08:47:33 compute-1 nova_compute[183083]: </domain>
Jan 26 08:47:33 compute-1 nova_compute[183083]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.307 183087 DEBUG nova.compute.manager [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Preparing to wait for external event network-vif-plugged-09872a2c-368e-4dca-93d8-5fdd642b03b3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.307 183087 DEBUG oslo_concurrency.lockutils [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "addae953-8eb4-46ed-959d-3c2bb6b31ee3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.308 183087 DEBUG oslo_concurrency.lockutils [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "addae953-8eb4-46ed-959d-3c2bb6b31ee3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.308 183087 DEBUG oslo_concurrency.lockutils [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "addae953-8eb4-46ed-959d-3c2bb6b31ee3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.309 183087 DEBUG nova.virt.libvirt.vif [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:47:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1111872342',display_name='tempest-server-test-1111872342',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1111872342',id=25,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHR6EJ0M3L9T9j8F+dLuTeoyDdnWZCi8ic8NnDlm++GcyV15uFS8BCYLqsqkAatnDxNEdqPcffpohwDfIhW8BevW7ZqTxZGVxsZmagZkz7C/Tn9HOibVv/vjkHnqrvkfkA==',key_name='tempest-keypair-test-713690759',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4a559c36b13649d98b2995c099340eb9',ramdisk_id='',reservation_id='r-e1bdduph',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_r
ng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortSecurityTest-508365101',owner_user_name='tempest-PortSecurityTest-508365101-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:47:30Z,user_data=None,user_id='52d582094c584036ba3e04c9da69ee02',uuid=addae953-8eb4-46ed-959d-3c2bb6b31ee3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "09872a2c-368e-4dca-93d8-5fdd642b03b3", "address": "fa:16:3e:90:46:e8", "network": {"id": "9006b908-3439-4b0e-b89f-6a6dbb60f4a7", "bridge": "br-int", "label": "tempest-test-network--519498246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09872a2c-36", "ovs_interfaceid": "09872a2c-368e-4dca-93d8-5fdd642b03b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.310 183087 DEBUG nova.network.os_vif_util [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converting VIF {"id": "09872a2c-368e-4dca-93d8-5fdd642b03b3", "address": "fa:16:3e:90:46:e8", "network": {"id": "9006b908-3439-4b0e-b89f-6a6dbb60f4a7", "bridge": "br-int", "label": "tempest-test-network--519498246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09872a2c-36", "ovs_interfaceid": "09872a2c-368e-4dca-93d8-5fdd642b03b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.311 183087 DEBUG nova.network.os_vif_util [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:90:46:e8,bridge_name='br-int',has_traffic_filtering=True,id=09872a2c-368e-4dca-93d8-5fdd642b03b3,network=Network(9006b908-3439-4b0e-b89f-6a6dbb60f4a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap09872a2c-36') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.311 183087 DEBUG os_vif [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:90:46:e8,bridge_name='br-int',has_traffic_filtering=True,id=09872a2c-368e-4dca-93d8-5fdd642b03b3,network=Network(9006b908-3439-4b0e-b89f-6a6dbb60f4a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap09872a2c-36') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.313 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.313 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.314 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.317 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.317 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09872a2c-36, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.318 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap09872a2c-36, col_values=(('external_ids', {'iface-id': '09872a2c-368e-4dca-93d8-5fdd642b03b3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:90:46:e8', 'vm-uuid': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.320 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:33 compute-1 NetworkManager[55451]: <info>  [1769417253.3218] manager: (tap09872a2c-36): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.325 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.327 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.328 183087 INFO os_vif [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:90:46:e8,bridge_name='br-int',has_traffic_filtering=True,id=09872a2c-368e-4dca-93d8-5fdd642b03b3,network=Network(9006b908-3439-4b0e-b89f-6a6dbb60f4a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap09872a2c-36')
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.397 183087 DEBUG nova.virt.libvirt.driver [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.399 183087 DEBUG nova.virt.libvirt.driver [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.399 183087 DEBUG nova.virt.libvirt.driver [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] No VIF found with MAC fa:16:3e:90:46:e8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.399 183087 INFO nova.virt.libvirt.driver [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Using config drive
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.500 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.755 183087 INFO nova.virt.libvirt.driver [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Creating config drive at /var/lib/nova/instances/addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk.config
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.765 183087 DEBUG oslo_concurrency.processutils [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpik5ao3yv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.898 183087 DEBUG oslo_concurrency.processutils [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpik5ao3yv" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:47:33 compute-1 kernel: tap09872a2c-36: entered promiscuous mode
Jan 26 08:47:33 compute-1 NetworkManager[55451]: <info>  [1769417253.9591] manager: (tap09872a2c-36): new Tun device (/org/freedesktop/NetworkManager/Devices/57)
Jan 26 08:47:33 compute-1 ovn_controller[95352]: 2026-01-26T08:47:33Z|00129|binding|INFO|Claiming lport 09872a2c-368e-4dca-93d8-5fdd642b03b3 for this chassis.
Jan 26 08:47:33 compute-1 ovn_controller[95352]: 2026-01-26T08:47:33Z|00130|binding|INFO|09872a2c-368e-4dca-93d8-5fdd642b03b3: Claiming fa:16:3e:90:46:e8 10.100.0.6
Jan 26 08:47:33 compute-1 ovn_controller[95352]: 2026-01-26T08:47:33Z|00131|binding|INFO|09872a2c-368e-4dca-93d8-5fdd642b03b3: Claiming unknown
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.961 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:33 compute-1 ovn_controller[95352]: 2026-01-26T08:47:33Z|00132|binding|INFO|Setting lport 09872a2c-368e-4dca-93d8-5fdd642b03b3 ovn-installed in OVS
Jan 26 08:47:33 compute-1 nova_compute[183083]: 2026-01-26 08:47:33.975 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:33 compute-1 ovn_controller[95352]: 2026-01-26T08:47:33Z|00133|binding|INFO|Setting lport 09872a2c-368e-4dca-93d8-5fdd642b03b3 up in Southbound
Jan 26 08:47:33 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:33.977 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:90:46:e8 10.100.0.6', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9006b908-3439-4b0e-b89f-6a6dbb60f4a7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4a559c36b13649d98b2995c099340eb9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5229e7d1-fe13-4532-bccb-d60478a3e25e, chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=09872a2c-368e-4dca-93d8-5fdd642b03b3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:47:33 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:33.978 104632 INFO neutron.agent.ovn.metadata.agent [-] Port 09872a2c-368e-4dca-93d8-5fdd642b03b3 in datapath 9006b908-3439-4b0e-b89f-6a6dbb60f4a7 bound to our chassis
Jan 26 08:47:33 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:33.980 104632 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9006b908-3439-4b0e-b89f-6a6dbb60f4a7
Jan 26 08:47:33 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:33.992 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[4446db5c-7eb1-4874-a402-1060a4a8325e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:33 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:33.993 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9006b908-31 in ovnmeta-9006b908-3439-4b0e-b89f-6a6dbb60f4a7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 26 08:47:33 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:33.994 212483 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9006b908-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 26 08:47:33 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:33.995 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[7f63e9f9-a035-4ada-9083-297416b455a8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:33 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:33.995 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[aed159c5-20b0-43eb-9d51-7eed8da9aa56]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:34 compute-1 systemd-machined[154360]: New machine qemu-7-instance-00000019.
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:34.013 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[03e18526-fa53-4405-8670-34ab444d490e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:34 compute-1 systemd[1]: Started Virtual Machine qemu-7-instance-00000019.
Jan 26 08:47:34 compute-1 systemd-udevd[214641]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:34.042 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[c736b1bd-7118-4470-ae68-4345b42f36ec]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:34 compute-1 NetworkManager[55451]: <info>  [1769417254.0529] device (tap09872a2c-36): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 08:47:34 compute-1 NetworkManager[55451]: <info>  [1769417254.0555] device (tap09872a2c-36): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:34.074 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[4dd43e3f-d805-4329-a23e-699a5acec9f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:34 compute-1 systemd-udevd[214644]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:34.084 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[caab5784-9ff2-4b26-870a-e3ba6a7e68cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:34 compute-1 NetworkManager[55451]: <info>  [1769417254.0860] manager: (tap9006b908-30): new Veth device (/org/freedesktop/NetworkManager/Devices/58)
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:34.128 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[146089fe-b168-4490-9d11-bbae117e8a46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:34.131 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[1195f683-4ce9-4caf-9d9d-6ee5ceffe5a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:34 compute-1 NetworkManager[55451]: <info>  [1769417254.1676] device (tap9006b908-30): carrier: link connected
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:34.179 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[3ab883ed-c004-4529-a7e5-ecb92ba39523]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:34.208 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[42a0ce78-ad95-490d-8041-8633cacba26f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9006b908-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6e:9a:a7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 359477, 'reachable_time': 19080, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214673, 'error': None, 'target': 'ovnmeta-9006b908-3439-4b0e-b89f-6a6dbb60f4a7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:34.230 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[882e2b91-f7bd-4b39-b4f5-2a295b9e561b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6e:9aa7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 359477, 'tstamp': 359477}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214678, 'error': None, 'target': 'ovnmeta-9006b908-3439-4b0e-b89f-6a6dbb60f4a7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:34.253 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[030b633f-b791-4a21-9554-6f0648a69c74]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9006b908-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6e:9a:a7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 359477, 'reachable_time': 19080, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214679, 'error': None, 'target': 'ovnmeta-9006b908-3439-4b0e-b89f-6a6dbb60f4a7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:34.301 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[7e86e1e7-8b71-435a-8953-1fabe5935b5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:34 compute-1 nova_compute[183083]: 2026-01-26 08:47:34.327 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417254.327149, addae953-8eb4-46ed-959d-3c2bb6b31ee3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:47:34 compute-1 nova_compute[183083]: 2026-01-26 08:47:34.328 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] VM Started (Lifecycle Event)
Jan 26 08:47:34 compute-1 nova_compute[183083]: 2026-01-26 08:47:34.372 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:34.374 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[1b692a29-a14b-4cb4-aabd-fa0ebfbd20e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:34.376 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9006b908-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:34.377 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:47:34 compute-1 nova_compute[183083]: 2026-01-26 08:47:34.377 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417254.3274016, addae953-8eb4-46ed-959d-3c2bb6b31ee3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:47:34 compute-1 nova_compute[183083]: 2026-01-26 08:47:34.377 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] VM Paused (Lifecycle Event)
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:34.378 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9006b908-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:47:34 compute-1 nova_compute[183083]: 2026-01-26 08:47:34.381 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:34 compute-1 kernel: tap9006b908-30: entered promiscuous mode
Jan 26 08:47:34 compute-1 NetworkManager[55451]: <info>  [1769417254.3821] manager: (tap9006b908-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:34.385 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9006b908-30, col_values=(('external_ids', {'iface-id': '2e241887-a928-4a33-98c2-6228ee06108e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:47:34 compute-1 nova_compute[183083]: 2026-01-26 08:47:34.387 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:34 compute-1 ovn_controller[95352]: 2026-01-26T08:47:34Z|00134|binding|INFO|Releasing lport 2e241887-a928-4a33-98c2-6228ee06108e from this chassis (sb_readonly=0)
Jan 26 08:47:34 compute-1 nova_compute[183083]: 2026-01-26 08:47:34.410 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:34.411 104632 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9006b908-3439-4b0e-b89f-6a6dbb60f4a7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9006b908-3439-4b0e-b89f-6a6dbb60f4a7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:34.412 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[179bed83-60a4-4a54-bd44-5f685731e9d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:34.413 104632 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]: global
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]:     log         /dev/log local0 debug
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]:     log-tag     haproxy-metadata-proxy-9006b908-3439-4b0e-b89f-6a6dbb60f4a7
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]:     user        root
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]:     group       root
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]:     maxconn     1024
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]:     pidfile     /var/lib/neutron/external/pids/9006b908-3439-4b0e-b89f-6a6dbb60f4a7.pid.haproxy
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]:     daemon
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]: defaults
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]:     log global
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]:     mode http
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]:     option httplog
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]:     option dontlognull
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]:     option http-server-close
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]:     option forwardfor
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]:     retries                 3
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]:     timeout http-request    30s
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]:     timeout connect         30s
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]:     timeout client          32s
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]:     timeout server          32s
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]:     timeout http-keep-alive 30s
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]: listen listener
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]:     bind 169.254.169.254:80
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]:     http-request add-header X-OVN-Network-ID 9006b908-3439-4b0e-b89f-6a6dbb60f4a7
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 26 08:47:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:34.414 104632 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9006b908-3439-4b0e-b89f-6a6dbb60f4a7', 'env', 'PROCESS_TAG=haproxy-9006b908-3439-4b0e-b89f-6a6dbb60f4a7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9006b908-3439-4b0e-b89f-6a6dbb60f4a7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 26 08:47:34 compute-1 nova_compute[183083]: 2026-01-26 08:47:34.428 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:47:34 compute-1 nova_compute[183083]: 2026-01-26 08:47:34.432 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 08:47:34 compute-1 nova_compute[183083]: 2026-01-26 08:47:34.485 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 08:47:34 compute-1 nova_compute[183083]: 2026-01-26 08:47:34.830 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:34 compute-1 podman[214711]: 2026-01-26 08:47:34.923976545 +0000 UTC m=+0.085594630 container create f2164f63ec818bde1faf58d87145d12dbde02e77399200edf955d590f4a15941 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9006b908-3439-4b0e-b89f-6a6dbb60f4a7, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 26 08:47:34 compute-1 podman[214711]: 2026-01-26 08:47:34.867817181 +0000 UTC m=+0.029435246 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 08:47:34 compute-1 systemd[1]: Started libpod-conmon-f2164f63ec818bde1faf58d87145d12dbde02e77399200edf955d590f4a15941.scope.
Jan 26 08:47:35 compute-1 systemd[1]: Started libcrun container.
Jan 26 08:47:35 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/772ff4d0575ed08b431aa3de405bf114bfe559eaebd4b0cd36d0dbc19b43a528/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 08:47:35 compute-1 podman[214711]: 2026-01-26 08:47:35.02742387 +0000 UTC m=+0.189042005 container init f2164f63ec818bde1faf58d87145d12dbde02e77399200edf955d590f4a15941 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9006b908-3439-4b0e-b89f-6a6dbb60f4a7, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 26 08:47:35 compute-1 podman[214711]: 2026-01-26 08:47:35.036737244 +0000 UTC m=+0.198355329 container start f2164f63ec818bde1faf58d87145d12dbde02e77399200edf955d590f4a15941 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9006b908-3439-4b0e-b89f-6a6dbb60f4a7, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 26 08:47:35 compute-1 neutron-haproxy-ovnmeta-9006b908-3439-4b0e-b89f-6a6dbb60f4a7[214727]: [NOTICE]   (214731) : New worker (214733) forked
Jan 26 08:47:35 compute-1 neutron-haproxy-ovnmeta-9006b908-3439-4b0e-b89f-6a6dbb60f4a7[214727]: [NOTICE]   (214731) : Loading success.
Jan 26 08:47:35 compute-1 nova_compute[183083]: 2026-01-26 08:47:35.488 183087 DEBUG nova.compute.manager [req-d78bbf7a-e8a1-4e12-b2d0-31535cf57c17 req-cbeb653c-a603-4e14-8fef-de76f1f3b349 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Received event network-changed-09872a2c-368e-4dca-93d8-5fdd642b03b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:47:35 compute-1 nova_compute[183083]: 2026-01-26 08:47:35.489 183087 DEBUG nova.compute.manager [req-d78bbf7a-e8a1-4e12-b2d0-31535cf57c17 req-cbeb653c-a603-4e14-8fef-de76f1f3b349 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Refreshing instance network info cache due to event network-changed-09872a2c-368e-4dca-93d8-5fdd642b03b3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:47:35 compute-1 nova_compute[183083]: 2026-01-26 08:47:35.489 183087 DEBUG oslo_concurrency.lockutils [req-d78bbf7a-e8a1-4e12-b2d0-31535cf57c17 req-cbeb653c-a603-4e14-8fef-de76f1f3b349 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-addae953-8eb4-46ed-959d-3c2bb6b31ee3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:47:35 compute-1 nova_compute[183083]: 2026-01-26 08:47:35.489 183087 DEBUG oslo_concurrency.lockutils [req-d78bbf7a-e8a1-4e12-b2d0-31535cf57c17 req-cbeb653c-a603-4e14-8fef-de76f1f3b349 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-addae953-8eb4-46ed-959d-3c2bb6b31ee3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:47:35 compute-1 nova_compute[183083]: 2026-01-26 08:47:35.489 183087 DEBUG nova.network.neutron [req-d78bbf7a-e8a1-4e12-b2d0-31535cf57c17 req-cbeb653c-a603-4e14-8fef-de76f1f3b349 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Refreshing network info cache for port 09872a2c-368e-4dca-93d8-5fdd642b03b3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:47:36 compute-1 nova_compute[183083]: 2026-01-26 08:47:36.330 183087 DEBUG oslo_concurrency.lockutils [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Acquiring lock "a4c5d303-460c-4195-875b-4ca6d03f004d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:36 compute-1 nova_compute[183083]: 2026-01-26 08:47:36.331 183087 DEBUG oslo_concurrency.lockutils [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "a4c5d303-460c-4195-875b-4ca6d03f004d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:36 compute-1 nova_compute[183083]: 2026-01-26 08:47:36.348 183087 DEBUG nova.compute.manager [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:47:36 compute-1 nova_compute[183083]: 2026-01-26 08:47:36.446 183087 DEBUG oslo_concurrency.lockutils [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:36 compute-1 nova_compute[183083]: 2026-01-26 08:47:36.447 183087 DEBUG oslo_concurrency.lockutils [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:36 compute-1 nova_compute[183083]: 2026-01-26 08:47:36.457 183087 DEBUG nova.virt.hardware [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:47:36 compute-1 nova_compute[183083]: 2026-01-26 08:47:36.458 183087 INFO nova.compute.claims [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:47:36 compute-1 nova_compute[183083]: 2026-01-26 08:47:36.661 183087 DEBUG nova.compute.provider_tree [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:47:36 compute-1 nova_compute[183083]: 2026-01-26 08:47:36.686 183087 DEBUG nova.scheduler.client.report [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:47:36 compute-1 nova_compute[183083]: 2026-01-26 08:47:36.712 183087 DEBUG oslo_concurrency.lockutils [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.265s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:36 compute-1 nova_compute[183083]: 2026-01-26 08:47:36.713 183087 DEBUG nova.compute.manager [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:47:36 compute-1 nova_compute[183083]: 2026-01-26 08:47:36.788 183087 DEBUG nova.compute.manager [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:47:36 compute-1 nova_compute[183083]: 2026-01-26 08:47:36.789 183087 DEBUG nova.network.neutron [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:47:36 compute-1 nova_compute[183083]: 2026-01-26 08:47:36.811 183087 INFO nova.virt.libvirt.driver [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:47:36 compute-1 nova_compute[183083]: 2026-01-26 08:47:36.829 183087 DEBUG nova.compute.manager [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:47:36 compute-1 podman[214742]: 2026-01-26 08:47:36.83224272 +0000 UTC m=+0.087429381 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 26 08:47:36 compute-1 nova_compute[183083]: 2026-01-26 08:47:36.921 183087 DEBUG nova.compute.manager [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:47:36 compute-1 nova_compute[183083]: 2026-01-26 08:47:36.924 183087 DEBUG nova.virt.libvirt.driver [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:47:36 compute-1 nova_compute[183083]: 2026-01-26 08:47:36.925 183087 INFO nova.virt.libvirt.driver [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Creating image(s)
Jan 26 08:47:36 compute-1 nova_compute[183083]: 2026-01-26 08:47:36.926 183087 DEBUG oslo_concurrency.lockutils [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Acquiring lock "/var/lib/nova/instances/a4c5d303-460c-4195-875b-4ca6d03f004d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:36 compute-1 nova_compute[183083]: 2026-01-26 08:47:36.927 183087 DEBUG oslo_concurrency.lockutils [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "/var/lib/nova/instances/a4c5d303-460c-4195-875b-4ca6d03f004d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:36 compute-1 nova_compute[183083]: 2026-01-26 08:47:36.928 183087 DEBUG oslo_concurrency.lockutils [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "/var/lib/nova/instances/a4c5d303-460c-4195-875b-4ca6d03f004d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:36 compute-1 nova_compute[183083]: 2026-01-26 08:47:36.929 183087 DEBUG oslo_concurrency.lockutils [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:36 compute-1 nova_compute[183083]: 2026-01-26 08:47:36.929 183087 DEBUG oslo_concurrency.lockutils [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.009 183087 DEBUG nova.policy [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e29b04a4a66d43aaa5e5c4f38eeb59c4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5c33bd5e85114c868a4e91d997a5ceec', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.784 183087 DEBUG oslo_concurrency.lockutils [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.855s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Instance failed to spawn: nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Traceback (most recent call last):
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 170, in do_image_deep_inspection
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d]     raise exception.ImageUnacceptable(
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image content does not match disk_format
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] 
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] During handling of the above exception, another exception occurred:
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] 
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Traceback (most recent call last):
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d]     yield resources
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d]     self.driver.spawn(context, instance, image_meta,
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4384, in spawn
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d]     created_instance_dir, created_disks = self._create_image(
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4786, in _create_image
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d]     created_disks = self._create_and_inject_local_root(
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4917, in _create_and_inject_local_root
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d]     self._try_fetch_image_cache(backend, fetch_func, context,
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 10963, in _try_fetch_image_cache
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d]     image.cache(fetch_func=fetch_func,
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 288, in cache
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d]     self.create_image(fetch_func_sync, base, size,
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 663, in create_image
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d]     prepare_template(target=base, *args, **kwargs)
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d]   File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d]     return f(*args, **kwargs)
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 285, in fetch_func_sync
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d]     fetch_func(target=target, *args, **kwargs)
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/utils.py", line 482, in fetch_image
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d]     images.fetch_to_raw(context, image_id, target, trusted_certs)
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 199, in fetch_to_raw
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d]     force_format = do_image_deep_inspection(img, image_href, path_tmp)
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 180, in do_image_deep_inspection
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d]     raise exception.ImageUnacceptable(
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:47:37 compute-1 nova_compute[183083]: 2026-01-26 08:47:37.785 183087 ERROR nova.compute.manager [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] 
Jan 26 08:47:38 compute-1 nova_compute[183083]: 2026-01-26 08:47:38.322 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:39 compute-1 nova_compute[183083]: 2026-01-26 08:47:39.018 183087 DEBUG nova.network.neutron [req-d78bbf7a-e8a1-4e12-b2d0-31535cf57c17 req-cbeb653c-a603-4e14-8fef-de76f1f3b349 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Updated VIF entry in instance network info cache for port 09872a2c-368e-4dca-93d8-5fdd642b03b3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:47:39 compute-1 nova_compute[183083]: 2026-01-26 08:47:39.019 183087 DEBUG nova.network.neutron [req-d78bbf7a-e8a1-4e12-b2d0-31535cf57c17 req-cbeb653c-a603-4e14-8fef-de76f1f3b349 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Updating instance_info_cache with network_info: [{"id": "09872a2c-368e-4dca-93d8-5fdd642b03b3", "address": "fa:16:3e:90:46:e8", "network": {"id": "9006b908-3439-4b0e-b89f-6a6dbb60f4a7", "bridge": "br-int", "label": "tempest-test-network--519498246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09872a2c-36", "ovs_interfaceid": "09872a2c-368e-4dca-93d8-5fdd642b03b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:47:39 compute-1 nova_compute[183083]: 2026-01-26 08:47:39.072 183087 DEBUG oslo_concurrency.lockutils [req-d78bbf7a-e8a1-4e12-b2d0-31535cf57c17 req-cbeb653c-a603-4e14-8fef-de76f1f3b349 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-addae953-8eb4-46ed-959d-3c2bb6b31ee3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:47:39 compute-1 nova_compute[183083]: 2026-01-26 08:47:39.073 183087 DEBUG nova.compute.manager [req-d78bbf7a-e8a1-4e12-b2d0-31535cf57c17 req-cbeb653c-a603-4e14-8fef-de76f1f3b349 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Received event network-vif-plugged-2fe37e18-bc67-418d-b6bb-db4ed40f6645 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:47:39 compute-1 nova_compute[183083]: 2026-01-26 08:47:39.074 183087 DEBUG oslo_concurrency.lockutils [req-d78bbf7a-e8a1-4e12-b2d0-31535cf57c17 req-cbeb653c-a603-4e14-8fef-de76f1f3b349 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "5ea5b3e6-d6ee-4984-b938-32f34a5c3307-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:39 compute-1 nova_compute[183083]: 2026-01-26 08:47:39.074 183087 DEBUG oslo_concurrency.lockutils [req-d78bbf7a-e8a1-4e12-b2d0-31535cf57c17 req-cbeb653c-a603-4e14-8fef-de76f1f3b349 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "5ea5b3e6-d6ee-4984-b938-32f34a5c3307-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:39 compute-1 nova_compute[183083]: 2026-01-26 08:47:39.075 183087 DEBUG oslo_concurrency.lockutils [req-d78bbf7a-e8a1-4e12-b2d0-31535cf57c17 req-cbeb653c-a603-4e14-8fef-de76f1f3b349 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "5ea5b3e6-d6ee-4984-b938-32f34a5c3307-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:39 compute-1 nova_compute[183083]: 2026-01-26 08:47:39.075 183087 DEBUG nova.compute.manager [req-d78bbf7a-e8a1-4e12-b2d0-31535cf57c17 req-cbeb653c-a603-4e14-8fef-de76f1f3b349 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Processing event network-vif-plugged-2fe37e18-bc67-418d-b6bb-db4ed40f6645 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 26 08:47:39 compute-1 nova_compute[183083]: 2026-01-26 08:47:39.075 183087 DEBUG nova.compute.manager [req-d78bbf7a-e8a1-4e12-b2d0-31535cf57c17 req-cbeb653c-a603-4e14-8fef-de76f1f3b349 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Received event network-vif-plugged-2fe37e18-bc67-418d-b6bb-db4ed40f6645 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:47:39 compute-1 nova_compute[183083]: 2026-01-26 08:47:39.076 183087 DEBUG oslo_concurrency.lockutils [req-d78bbf7a-e8a1-4e12-b2d0-31535cf57c17 req-cbeb653c-a603-4e14-8fef-de76f1f3b349 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "5ea5b3e6-d6ee-4984-b938-32f34a5c3307-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:39 compute-1 nova_compute[183083]: 2026-01-26 08:47:39.076 183087 DEBUG oslo_concurrency.lockutils [req-d78bbf7a-e8a1-4e12-b2d0-31535cf57c17 req-cbeb653c-a603-4e14-8fef-de76f1f3b349 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "5ea5b3e6-d6ee-4984-b938-32f34a5c3307-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:39 compute-1 nova_compute[183083]: 2026-01-26 08:47:39.077 183087 DEBUG oslo_concurrency.lockutils [req-d78bbf7a-e8a1-4e12-b2d0-31535cf57c17 req-cbeb653c-a603-4e14-8fef-de76f1f3b349 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "5ea5b3e6-d6ee-4984-b938-32f34a5c3307-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:39 compute-1 nova_compute[183083]: 2026-01-26 08:47:39.077 183087 DEBUG nova.compute.manager [req-d78bbf7a-e8a1-4e12-b2d0-31535cf57c17 req-cbeb653c-a603-4e14-8fef-de76f1f3b349 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] No waiting events found dispatching network-vif-plugged-2fe37e18-bc67-418d-b6bb-db4ed40f6645 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:47:39 compute-1 nova_compute[183083]: 2026-01-26 08:47:39.078 183087 WARNING nova.compute.manager [req-d78bbf7a-e8a1-4e12-b2d0-31535cf57c17 req-cbeb653c-a603-4e14-8fef-de76f1f3b349 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Received unexpected event network-vif-plugged-2fe37e18-bc67-418d-b6bb-db4ed40f6645 for instance with vm_state building and task_state spawning.
Jan 26 08:47:39 compute-1 nova_compute[183083]: 2026-01-26 08:47:39.079 183087 DEBUG nova.compute.manager [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Instance event wait completed in 12 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 26 08:47:39 compute-1 nova_compute[183083]: 2026-01-26 08:47:39.086 183087 DEBUG nova.virt.libvirt.driver [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 26 08:47:39 compute-1 nova_compute[183083]: 2026-01-26 08:47:39.087 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417259.0850358, 5ea5b3e6-d6ee-4984-b938-32f34a5c3307 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:47:39 compute-1 nova_compute[183083]: 2026-01-26 08:47:39.087 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] VM Resumed (Lifecycle Event)
Jan 26 08:47:39 compute-1 nova_compute[183083]: 2026-01-26 08:47:39.094 183087 INFO nova.virt.libvirt.driver [-] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Instance spawned successfully.
Jan 26 08:47:39 compute-1 nova_compute[183083]: 2026-01-26 08:47:39.095 183087 DEBUG nova.virt.libvirt.driver [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 26 08:47:39 compute-1 nova_compute[183083]: 2026-01-26 08:47:39.127 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:47:39 compute-1 nova_compute[183083]: 2026-01-26 08:47:39.136 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 08:47:39 compute-1 nova_compute[183083]: 2026-01-26 08:47:39.141 183087 DEBUG nova.virt.libvirt.driver [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:47:39 compute-1 nova_compute[183083]: 2026-01-26 08:47:39.141 183087 DEBUG nova.virt.libvirt.driver [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:47:39 compute-1 nova_compute[183083]: 2026-01-26 08:47:39.142 183087 DEBUG nova.virt.libvirt.driver [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:47:39 compute-1 nova_compute[183083]: 2026-01-26 08:47:39.143 183087 DEBUG nova.virt.libvirt.driver [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:47:39 compute-1 nova_compute[183083]: 2026-01-26 08:47:39.144 183087 DEBUG nova.virt.libvirt.driver [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:47:39 compute-1 nova_compute[183083]: 2026-01-26 08:47:39.144 183087 DEBUG nova.virt.libvirt.driver [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:47:39 compute-1 nova_compute[183083]: 2026-01-26 08:47:39.243 183087 DEBUG nova.network.neutron [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Successfully created port: f3dfea44-5bb8-445b-9451-6a15a4b781ad _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 08:47:39 compute-1 nova_compute[183083]: 2026-01-26 08:47:39.269 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 08:47:39 compute-1 nova_compute[183083]: 2026-01-26 08:47:39.309 183087 INFO nova.compute.manager [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Took 17.02 seconds to spawn the instance on the hypervisor.
Jan 26 08:47:39 compute-1 nova_compute[183083]: 2026-01-26 08:47:39.309 183087 DEBUG nova.compute.manager [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:47:39 compute-1 nova_compute[183083]: 2026-01-26 08:47:39.450 183087 INFO nova.compute.manager [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Took 17.59 seconds to build instance.
Jan 26 08:47:39 compute-1 nova_compute[183083]: 2026-01-26 08:47:39.482 183087 DEBUG oslo_concurrency.lockutils [None req-69469f2a-f2a9-439c-a40c-827a72e6ee5a 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Lock "5ea5b3e6-d6ee-4984-b938-32f34a5c3307" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.703s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:39 compute-1 nova_compute[183083]: 2026-01-26 08:47:39.833 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:40 compute-1 nova_compute[183083]: 2026-01-26 08:47:40.058 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:40 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:40.059 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:47:40 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:40.061 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 08:47:40 compute-1 nova_compute[183083]: 2026-01-26 08:47:40.715 183087 INFO nova.compute.manager [None req-7b5e0a8e-5bdf-42ad-86fa-5205e13c061c 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Get console output
Jan 26 08:47:40 compute-1 nova_compute[183083]: 2026-01-26 08:47:40.844 183087 DEBUG nova.network.neutron [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Successfully updated port: f3dfea44-5bb8-445b-9451-6a15a4b781ad _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:47:40 compute-1 nova_compute[183083]: 2026-01-26 08:47:40.866 183087 DEBUG oslo_concurrency.lockutils [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Acquiring lock "refresh_cache-a4c5d303-460c-4195-875b-4ca6d03f004d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:47:40 compute-1 nova_compute[183083]: 2026-01-26 08:47:40.866 183087 DEBUG oslo_concurrency.lockutils [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Acquired lock "refresh_cache-a4c5d303-460c-4195-875b-4ca6d03f004d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:47:40 compute-1 nova_compute[183083]: 2026-01-26 08:47:40.867 183087 DEBUG nova.network.neutron [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:47:41 compute-1 nova_compute[183083]: 2026-01-26 08:47:41.124 183087 DEBUG nova.network.neutron [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:47:41 compute-1 nova_compute[183083]: 2026-01-26 08:47:41.330 183087 DEBUG nova.compute.manager [req-860ee6b3-eba3-4656-96ee-495c6fbbedc8 req-858b9aaa-9772-4e7d-a318-a552a6153f5e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Received event network-changed-f3dfea44-5bb8-445b-9451-6a15a4b781ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:47:41 compute-1 nova_compute[183083]: 2026-01-26 08:47:41.331 183087 DEBUG nova.compute.manager [req-860ee6b3-eba3-4656-96ee-495c6fbbedc8 req-858b9aaa-9772-4e7d-a318-a552a6153f5e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Refreshing instance network info cache due to event network-changed-f3dfea44-5bb8-445b-9451-6a15a4b781ad. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:47:41 compute-1 nova_compute[183083]: 2026-01-26 08:47:41.331 183087 DEBUG oslo_concurrency.lockutils [req-860ee6b3-eba3-4656-96ee-495c6fbbedc8 req-858b9aaa-9772-4e7d-a318-a552a6153f5e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-a4c5d303-460c-4195-875b-4ca6d03f004d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:47:41 compute-1 nova_compute[183083]: 2026-01-26 08:47:41.981 183087 DEBUG nova.network.neutron [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Updating instance_info_cache with network_info: [{"id": "f3dfea44-5bb8-445b-9451-6a15a4b781ad", "address": "fa:16:3e:42:ad:76", "network": {"id": "ada33edd-017f-4cf5-bc2e-db0c66214549", "bridge": "br-int", "label": "tempest-test-network--1775908954", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c33bd5e85114c868a4e91d997a5ceec", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3dfea44-5b", "ovs_interfaceid": "f3dfea44-5bb8-445b-9451-6a15a4b781ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.002 183087 DEBUG oslo_concurrency.lockutils [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Releasing lock "refresh_cache-a4c5d303-460c-4195-875b-4ca6d03f004d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.003 183087 DEBUG nova.compute.manager [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Instance network_info: |[{"id": "f3dfea44-5bb8-445b-9451-6a15a4b781ad", "address": "fa:16:3e:42:ad:76", "network": {"id": "ada33edd-017f-4cf5-bc2e-db0c66214549", "bridge": "br-int", "label": "tempest-test-network--1775908954", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c33bd5e85114c868a4e91d997a5ceec", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3dfea44-5b", "ovs_interfaceid": "f3dfea44-5bb8-445b-9451-6a15a4b781ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.003 183087 DEBUG oslo_concurrency.lockutils [req-860ee6b3-eba3-4656-96ee-495c6fbbedc8 req-858b9aaa-9772-4e7d-a318-a552a6153f5e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-a4c5d303-460c-4195-875b-4ca6d03f004d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.004 183087 DEBUG nova.network.neutron [req-860ee6b3-eba3-4656-96ee-495c6fbbedc8 req-858b9aaa-9772-4e7d-a318-a552a6153f5e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Refreshing network info cache for port f3dfea44-5bb8-445b-9451-6a15a4b781ad _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.005 183087 INFO nova.compute.manager [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Terminating instance
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.007 183087 DEBUG nova.compute.manager [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.012 183087 DEBUG nova.virt.libvirt.driver [-] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.012 183087 INFO nova.virt.libvirt.driver [-] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Instance destroyed successfully.
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.013 183087 DEBUG nova.virt.libvirt.vif [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:47:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_multicast_after_idle_timeout-1540838637',display_name='tempest-test_multicast_after_idle_timeout-1540838637',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-multicast-after-idle-timeout-1540838637',id=28,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJOJsa8zDc5tOBfBRLm0Qi812u7HVOO6E7MXAGpKZ4/7op/PkClYPLhHmkNgkuxZ09O67J1SPnUd6CxZdN+euoFRi16VYgHaqYzJwS9D1WUns9/BQk7M5SX/0drgiuby9w==',key_name='tempest-keypair-test-1151010750',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5c33bd5e85114c868a4e91d997a5ceec',ramdisk_id='',reservation_id='r-yubjktxt',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestIPv4Ovn-635971062',owner_user_name='tempest-MulticastTestIPv4Ovn-635971062-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:47:36Z,user_data=None,user_id='e29b04a4a66d43aaa5e5c4f38eeb59c4',uuid=a4c5d303-460c-4195-875b-4ca6d03f004d,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f3dfea44-5bb8-445b-9451-6a15a4b781ad", "address": "fa:16:3e:42:ad:76", "network": {"id": "ada33edd-017f-4cf5-bc2e-db0c66214549", "bridge": "br-int", "label": "tempest-test-network--1775908954", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c33bd5e85114c868a4e91d997a5ceec", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3dfea44-5b", "ovs_interfaceid": "f3dfea44-5bb8-445b-9451-6a15a4b781ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.014 183087 DEBUG nova.network.os_vif_util [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Converting VIF {"id": "f3dfea44-5bb8-445b-9451-6a15a4b781ad", "address": "fa:16:3e:42:ad:76", "network": {"id": "ada33edd-017f-4cf5-bc2e-db0c66214549", "bridge": "br-int", "label": "tempest-test-network--1775908954", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c33bd5e85114c868a4e91d997a5ceec", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3dfea44-5b", "ovs_interfaceid": "f3dfea44-5bb8-445b-9451-6a15a4b781ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.015 183087 DEBUG nova.network.os_vif_util [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:42:ad:76,bridge_name='br-int',has_traffic_filtering=True,id=f3dfea44-5bb8-445b-9451-6a15a4b781ad,network=Network(ada33edd-017f-4cf5-bc2e-db0c66214549),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3dfea44-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.016 183087 DEBUG os_vif [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:42:ad:76,bridge_name='br-int',has_traffic_filtering=True,id=f3dfea44-5bb8-445b-9451-6a15a4b781ad,network=Network(ada33edd-017f-4cf5-bc2e-db0c66214549),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3dfea44-5b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.018 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.018 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf3dfea44-5b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.019 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.027 183087 INFO os_vif [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:42:ad:76,bridge_name='br-int',has_traffic_filtering=True,id=f3dfea44-5bb8-445b-9451-6a15a4b781ad,network=Network(ada33edd-017f-4cf5-bc2e-db0c66214549),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3dfea44-5b')
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.028 183087 INFO nova.virt.libvirt.driver [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Deleting instance files /var/lib/nova/instances/a4c5d303-460c-4195-875b-4ca6d03f004d_del
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.029 183087 INFO nova.virt.libvirt.driver [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Deletion of /var/lib/nova/instances/a4c5d303-460c-4195-875b-4ca6d03f004d_del complete
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.106 183087 INFO nova.compute.manager [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Took 0.10 seconds to destroy the instance on the hypervisor.
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.108 183087 DEBUG nova.compute.claims [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Aborting claim: <nova.compute.claims.Claim object at 0x7f6c98319dc0> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.108 183087 DEBUG oslo_concurrency.lockutils [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.109 183087 DEBUG oslo_concurrency.lockutils [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.245 183087 DEBUG nova.compute.provider_tree [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.262 183087 DEBUG nova.scheduler.client.report [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.285 183087 DEBUG oslo_concurrency.lockutils [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.286 183087 DEBUG nova.compute.utils [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.287 183087 ERROR nova.compute.manager [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Build of instance a4c5d303-460c-4195-875b-4ca6d03f004d aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format: nova.exception.BuildAbortException: Build of instance a4c5d303-460c-4195-875b-4ca6d03f004d aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.287 183087 DEBUG nova.compute.manager [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.288 183087 DEBUG nova.virt.libvirt.vif [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-26T08:47:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_multicast_after_idle_timeout-1540838637',display_name='tempest-test_multicast_after_idle_timeout-1540838637',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host=None,hostname='tempest-test-multicast-after-idle-timeout-1540838637',id=28,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJOJsa8zDc5tOBfBRLm0Qi812u7HVOO6E7MXAGpKZ4/7op/PkClYPLhHmkNgkuxZ09O67J1SPnUd6CxZdN+euoFRi16VYgHaqYzJwS9D1WUns9/BQk7M5SX/0drgiuby9w==',key_name='tempest-keypair-test-1151010750',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node=None,numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5c33bd5e85114c868a4e91d997a5ceec',ramdisk_id='',reservation_id='r-yubjktxt',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestIPv4Ovn-635971062',owner_user_name='tempest-MulticastTestIPv4Ovn-635971062-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:47:42Z,user_data=None,user_id='e29b04a4a66d43aaa5e5c4f38eeb59c4',uuid=a4c5d303-460c-4195-875b-4ca6d03f004d,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f3dfea44-5bb8-445b-9451-6a15a4b781ad", "address": "fa:16:3e:42:ad:76", "network": {"id": "ada33edd-017f-4cf5-bc2e-db0c66214549", "bridge": "br-int", "label": "tempest-test-network--1775908954", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c33bd5e85114c868a4e91d997a5ceec", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3dfea44-5b", "ovs_interfaceid": "f3dfea44-5bb8-445b-9451-6a15a4b781ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.289 183087 DEBUG nova.network.os_vif_util [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Converting VIF {"id": "f3dfea44-5bb8-445b-9451-6a15a4b781ad", "address": "fa:16:3e:42:ad:76", "network": {"id": "ada33edd-017f-4cf5-bc2e-db0c66214549", "bridge": "br-int", "label": "tempest-test-network--1775908954", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c33bd5e85114c868a4e91d997a5ceec", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3dfea44-5b", "ovs_interfaceid": "f3dfea44-5bb8-445b-9451-6a15a4b781ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.289 183087 DEBUG nova.network.os_vif_util [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:42:ad:76,bridge_name='br-int',has_traffic_filtering=True,id=f3dfea44-5bb8-445b-9451-6a15a4b781ad,network=Network(ada33edd-017f-4cf5-bc2e-db0c66214549),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3dfea44-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.290 183087 DEBUG os_vif [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:42:ad:76,bridge_name='br-int',has_traffic_filtering=True,id=f3dfea44-5bb8-445b-9451-6a15a4b781ad,network=Network(ada33edd-017f-4cf5-bc2e-db0c66214549),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3dfea44-5b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.291 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.291 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf3dfea44-5b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.292 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.294 183087 INFO os_vif [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:42:ad:76,bridge_name='br-int',has_traffic_filtering=True,id=f3dfea44-5bb8-445b-9451-6a15a4b781ad,network=Network(ada33edd-017f-4cf5-bc2e-db0c66214549),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3dfea44-5b')
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.295 183087 DEBUG nova.compute.manager [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.295 183087 DEBUG nova.compute.manager [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:47:42 compute-1 nova_compute[183083]: 2026-01-26 08:47:42.295 183087 DEBUG nova.network.neutron [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:47:43 compute-1 sshd-session[214766]: Connection closed by authenticating user root 159.223.236.81 port 36196 [preauth]
Jan 26 08:47:43 compute-1 nova_compute[183083]: 2026-01-26 08:47:43.325 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:43 compute-1 nova_compute[183083]: 2026-01-26 08:47:43.863 183087 DEBUG nova.network.neutron [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:47:43 compute-1 nova_compute[183083]: 2026-01-26 08:47:43.899 183087 DEBUG nova.network.neutron [req-860ee6b3-eba3-4656-96ee-495c6fbbedc8 req-858b9aaa-9772-4e7d-a318-a552a6153f5e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Updated VIF entry in instance network info cache for port f3dfea44-5bb8-445b-9451-6a15a4b781ad. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:47:43 compute-1 nova_compute[183083]: 2026-01-26 08:47:43.899 183087 DEBUG nova.network.neutron [req-860ee6b3-eba3-4656-96ee-495c6fbbedc8 req-858b9aaa-9772-4e7d-a318-a552a6153f5e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Updating instance_info_cache with network_info: [{"id": "f3dfea44-5bb8-445b-9451-6a15a4b781ad", "address": "fa:16:3e:42:ad:76", "network": {"id": "ada33edd-017f-4cf5-bc2e-db0c66214549", "bridge": "br-int", "label": "tempest-test-network--1775908954", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c33bd5e85114c868a4e91d997a5ceec", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3dfea44-5b", "ovs_interfaceid": "f3dfea44-5bb8-445b-9451-6a15a4b781ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:47:43 compute-1 nova_compute[183083]: 2026-01-26 08:47:43.938 183087 INFO nova.compute.manager [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: a4c5d303-460c-4195-875b-4ca6d03f004d] Took 1.64 seconds to deallocate network for instance.
Jan 26 08:47:43 compute-1 nova_compute[183083]: 2026-01-26 08:47:43.961 183087 DEBUG oslo_concurrency.lockutils [req-860ee6b3-eba3-4656-96ee-495c6fbbedc8 req-858b9aaa-9772-4e7d-a318-a552a6153f5e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-a4c5d303-460c-4195-875b-4ca6d03f004d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:47:44 compute-1 nova_compute[183083]: 2026-01-26 08:47:44.089 183087 INFO nova.scheduler.client.report [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Deleted allocations for instance a4c5d303-460c-4195-875b-4ca6d03f004d
Jan 26 08:47:44 compute-1 nova_compute[183083]: 2026-01-26 08:47:44.090 183087 DEBUG oslo_concurrency.lockutils [None req-453b8d02-5517-4dc2-b318-72bd77be1106 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "a4c5d303-460c-4195-875b-4ca6d03f004d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.759s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:44 compute-1 nova_compute[183083]: 2026-01-26 08:47:44.869 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:45 compute-1 nova_compute[183083]: 2026-01-26 08:47:45.816 183087 INFO nova.compute.manager [None req-af159653-11ca-4404-968f-6bf81a744c53 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Get console output
Jan 26 08:47:45 compute-1 nova_compute[183083]: 2026-01-26 08:47:45.822 212375 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 26 08:47:47 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:47.062 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:47:47 compute-1 nova_compute[183083]: 2026-01-26 08:47:47.098 183087 DEBUG nova.compute.manager [req-7bfec59c-86a8-431d-836d-5fbe660ed37a req-0a74153b-b8ae-4019-a963-00c4d18099d5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Received event network-vif-plugged-09872a2c-368e-4dca-93d8-5fdd642b03b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:47:47 compute-1 nova_compute[183083]: 2026-01-26 08:47:47.098 183087 DEBUG oslo_concurrency.lockutils [req-7bfec59c-86a8-431d-836d-5fbe660ed37a req-0a74153b-b8ae-4019-a963-00c4d18099d5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "addae953-8eb4-46ed-959d-3c2bb6b31ee3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:47 compute-1 nova_compute[183083]: 2026-01-26 08:47:47.099 183087 DEBUG oslo_concurrency.lockutils [req-7bfec59c-86a8-431d-836d-5fbe660ed37a req-0a74153b-b8ae-4019-a963-00c4d18099d5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "addae953-8eb4-46ed-959d-3c2bb6b31ee3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:47 compute-1 nova_compute[183083]: 2026-01-26 08:47:47.100 183087 DEBUG oslo_concurrency.lockutils [req-7bfec59c-86a8-431d-836d-5fbe660ed37a req-0a74153b-b8ae-4019-a963-00c4d18099d5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "addae953-8eb4-46ed-959d-3c2bb6b31ee3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:47 compute-1 nova_compute[183083]: 2026-01-26 08:47:47.100 183087 DEBUG nova.compute.manager [req-7bfec59c-86a8-431d-836d-5fbe660ed37a req-0a74153b-b8ae-4019-a963-00c4d18099d5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Processing event network-vif-plugged-09872a2c-368e-4dca-93d8-5fdd642b03b3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 26 08:47:47 compute-1 nova_compute[183083]: 2026-01-26 08:47:47.101 183087 DEBUG nova.compute.manager [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Instance event wait completed in 12 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 26 08:47:47 compute-1 nova_compute[183083]: 2026-01-26 08:47:47.107 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417267.106878, addae953-8eb4-46ed-959d-3c2bb6b31ee3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:47:47 compute-1 nova_compute[183083]: 2026-01-26 08:47:47.107 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] VM Resumed (Lifecycle Event)
Jan 26 08:47:47 compute-1 nova_compute[183083]: 2026-01-26 08:47:47.125 183087 DEBUG nova.virt.libvirt.driver [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 26 08:47:47 compute-1 nova_compute[183083]: 2026-01-26 08:47:47.132 183087 INFO nova.virt.libvirt.driver [-] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Instance spawned successfully.
Jan 26 08:47:47 compute-1 nova_compute[183083]: 2026-01-26 08:47:47.132 183087 DEBUG nova.virt.libvirt.driver [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 26 08:47:47 compute-1 nova_compute[183083]: 2026-01-26 08:47:47.144 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:47:47 compute-1 nova_compute[183083]: 2026-01-26 08:47:47.148 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 08:47:47 compute-1 nova_compute[183083]: 2026-01-26 08:47:47.179 183087 DEBUG nova.virt.libvirt.driver [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:47:47 compute-1 nova_compute[183083]: 2026-01-26 08:47:47.179 183087 DEBUG nova.virt.libvirt.driver [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:47:47 compute-1 nova_compute[183083]: 2026-01-26 08:47:47.180 183087 DEBUG nova.virt.libvirt.driver [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:47:47 compute-1 nova_compute[183083]: 2026-01-26 08:47:47.180 183087 DEBUG nova.virt.libvirt.driver [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:47:47 compute-1 nova_compute[183083]: 2026-01-26 08:47:47.180 183087 DEBUG nova.virt.libvirt.driver [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:47:47 compute-1 nova_compute[183083]: 2026-01-26 08:47:47.181 183087 DEBUG nova.virt.libvirt.driver [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:47:47 compute-1 nova_compute[183083]: 2026-01-26 08:47:47.184 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 08:47:47 compute-1 nova_compute[183083]: 2026-01-26 08:47:47.321 183087 INFO nova.compute.manager [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Took 16.57 seconds to spawn the instance on the hypervisor.
Jan 26 08:47:47 compute-1 nova_compute[183083]: 2026-01-26 08:47:47.321 183087 DEBUG nova.compute.manager [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:47:47 compute-1 nova_compute[183083]: 2026-01-26 08:47:47.421 183087 INFO nova.compute.manager [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Took 17.10 seconds to build instance.
Jan 26 08:47:47 compute-1 nova_compute[183083]: 2026-01-26 08:47:47.436 183087 DEBUG oslo_concurrency.lockutils [None req-5f29fb36-3596-4aad-bd46-bee88741634b 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "addae953-8eb4-46ed-959d-3c2bb6b31ee3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:47 compute-1 nova_compute[183083]: 2026-01-26 08:47:47.997 183087 DEBUG oslo_concurrency.lockutils [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Acquiring lock "c77e8014-2e4b-48db-9cc2-dd37ccd4d528" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:47 compute-1 nova_compute[183083]: 2026-01-26 08:47:47.997 183087 DEBUG oslo_concurrency.lockutils [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Lock "c77e8014-2e4b-48db-9cc2-dd37ccd4d528" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:48 compute-1 nova_compute[183083]: 2026-01-26 08:47:48.027 183087 DEBUG nova.compute.manager [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:47:48 compute-1 nova_compute[183083]: 2026-01-26 08:47:48.114 183087 DEBUG oslo_concurrency.lockutils [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:48 compute-1 nova_compute[183083]: 2026-01-26 08:47:48.115 183087 DEBUG oslo_concurrency.lockutils [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:48 compute-1 nova_compute[183083]: 2026-01-26 08:47:48.121 183087 DEBUG nova.virt.hardware [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:47:48 compute-1 nova_compute[183083]: 2026-01-26 08:47:48.121 183087 INFO nova.compute.claims [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:47:48 compute-1 nova_compute[183083]: 2026-01-26 08:47:48.256 183087 DEBUG nova.compute.provider_tree [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:47:48 compute-1 nova_compute[183083]: 2026-01-26 08:47:48.270 183087 DEBUG nova.scheduler.client.report [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:47:48 compute-1 nova_compute[183083]: 2026-01-26 08:47:48.299 183087 DEBUG oslo_concurrency.lockutils [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.184s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:48 compute-1 nova_compute[183083]: 2026-01-26 08:47:48.300 183087 DEBUG nova.compute.manager [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:47:48 compute-1 nova_compute[183083]: 2026-01-26 08:47:48.327 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:48 compute-1 nova_compute[183083]: 2026-01-26 08:47:48.362 183087 DEBUG nova.compute.manager [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:47:48 compute-1 nova_compute[183083]: 2026-01-26 08:47:48.362 183087 DEBUG nova.network.neutron [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:47:48 compute-1 nova_compute[183083]: 2026-01-26 08:47:48.414 183087 INFO nova.virt.libvirt.driver [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:47:48 compute-1 nova_compute[183083]: 2026-01-26 08:47:48.496 183087 DEBUG nova.compute.manager [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:47:48 compute-1 nova_compute[183083]: 2026-01-26 08:47:48.662 183087 DEBUG nova.compute.manager [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:47:48 compute-1 nova_compute[183083]: 2026-01-26 08:47:48.668 183087 DEBUG nova.virt.libvirt.driver [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:47:48 compute-1 nova_compute[183083]: 2026-01-26 08:47:48.668 183087 INFO nova.virt.libvirt.driver [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Creating image(s)
Jan 26 08:47:48 compute-1 nova_compute[183083]: 2026-01-26 08:47:48.669 183087 DEBUG oslo_concurrency.lockutils [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Acquiring lock "/var/lib/nova/instances/c77e8014-2e4b-48db-9cc2-dd37ccd4d528/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:48 compute-1 nova_compute[183083]: 2026-01-26 08:47:48.669 183087 DEBUG oslo_concurrency.lockutils [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Lock "/var/lib/nova/instances/c77e8014-2e4b-48db-9cc2-dd37ccd4d528/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:48 compute-1 nova_compute[183083]: 2026-01-26 08:47:48.670 183087 DEBUG oslo_concurrency.lockutils [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Lock "/var/lib/nova/instances/c77e8014-2e4b-48db-9cc2-dd37ccd4d528/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:48 compute-1 nova_compute[183083]: 2026-01-26 08:47:48.670 183087 DEBUG oslo_concurrency.lockutils [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:48 compute-1 nova_compute[183083]: 2026-01-26 08:47:48.671 183087 DEBUG oslo_concurrency.lockutils [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.318 183087 DEBUG nova.compute.manager [req-85090252-a5a5-4b4c-a7ab-a6ddb3b46726 req-ed42fe9b-224d-48cf-870a-5b33eff741a5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Received event network-vif-plugged-09872a2c-368e-4dca-93d8-5fdd642b03b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.318 183087 DEBUG oslo_concurrency.lockutils [req-85090252-a5a5-4b4c-a7ab-a6ddb3b46726 req-ed42fe9b-224d-48cf-870a-5b33eff741a5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "addae953-8eb4-46ed-959d-3c2bb6b31ee3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.319 183087 DEBUG oslo_concurrency.lockutils [req-85090252-a5a5-4b4c-a7ab-a6ddb3b46726 req-ed42fe9b-224d-48cf-870a-5b33eff741a5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "addae953-8eb4-46ed-959d-3c2bb6b31ee3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.319 183087 DEBUG oslo_concurrency.lockutils [req-85090252-a5a5-4b4c-a7ab-a6ddb3b46726 req-ed42fe9b-224d-48cf-870a-5b33eff741a5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "addae953-8eb4-46ed-959d-3c2bb6b31ee3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.320 183087 DEBUG nova.compute.manager [req-85090252-a5a5-4b4c-a7ab-a6ddb3b46726 req-ed42fe9b-224d-48cf-870a-5b33eff741a5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] No waiting events found dispatching network-vif-plugged-09872a2c-368e-4dca-93d8-5fdd642b03b3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.320 183087 WARNING nova.compute.manager [req-85090252-a5a5-4b4c-a7ab-a6ddb3b46726 req-ed42fe9b-224d-48cf-870a-5b33eff741a5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Received unexpected event network-vif-plugged-09872a2c-368e-4dca-93d8-5fdd642b03b3 for instance with vm_state active and task_state None.
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.427 183087 INFO nova.compute.manager [None req-1842cd70-6b00-4232-8c14-1d47d67e2078 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Get console output
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.474 212375 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 26 08:47:49 compute-1 podman[214780]: 2026-01-26 08:47:49.808432118 +0000 UTC m=+0.075388860 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 26 08:47:49 compute-1 podman[214781]: 2026-01-26 08:47:49.833125799 +0000 UTC m=+0.088346978 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, name=ubi9-minimal)
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.871 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.955 183087 DEBUG oslo_concurrency.lockutils [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.285s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Instance failed to spawn: nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Traceback (most recent call last):
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 170, in do_image_deep_inspection
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528]     raise exception.ImageUnacceptable(
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image content does not match disk_format
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] 
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] During handling of the above exception, another exception occurred:
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] 
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Traceback (most recent call last):
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528]     yield resources
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528]     self.driver.spawn(context, instance, image_meta,
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4384, in spawn
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528]     created_instance_dir, created_disks = self._create_image(
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4786, in _create_image
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528]     created_disks = self._create_and_inject_local_root(
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4917, in _create_and_inject_local_root
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528]     self._try_fetch_image_cache(backend, fetch_func, context,
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 10963, in _try_fetch_image_cache
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528]     image.cache(fetch_func=fetch_func,
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 288, in cache
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528]     self.create_image(fetch_func_sync, base, size,
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 663, in create_image
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528]     prepare_template(target=base, *args, **kwargs)
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528]   File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528]     return f(*args, **kwargs)
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 285, in fetch_func_sync
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528]     fetch_func(target=target, *args, **kwargs)
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/utils.py", line 482, in fetch_image
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528]     images.fetch_to_raw(context, image_id, target, trusted_certs)
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 199, in fetch_to_raw
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528]     force_format = do_image_deep_inspection(img, image_href, path_tmp)
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 180, in do_image_deep_inspection
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528]     raise exception.ImageUnacceptable(
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:47:49 compute-1 nova_compute[183083]: 2026-01-26 08:47:49.956 183087 ERROR nova.compute.manager [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] 
Jan 26 08:47:50 compute-1 nova_compute[183083]: 2026-01-26 08:47:50.182 183087 DEBUG nova.policy [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '10286fb05759476d8b51274239727a28', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9680eb9addb04171b834f7dff8ec602d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:47:50 compute-1 ovn_controller[95352]: 2026-01-26T08:47:50Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:80:c5:e2 10.100.0.3
Jan 26 08:47:50 compute-1 ovn_controller[95352]: 2026-01-26T08:47:50Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:80:c5:e2 10.100.0.3
Jan 26 08:47:51 compute-1 nova_compute[183083]: 2026-01-26 08:47:51.016 183087 INFO nova.compute.manager [None req-6eaac47a-fec3-4a0b-ad3e-90d327270466 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Get console output
Jan 26 08:47:51 compute-1 nova_compute[183083]: 2026-01-26 08:47:51.022 212375 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 26 08:47:52 compute-1 nova_compute[183083]: 2026-01-26 08:47:52.318 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:52 compute-1 nova_compute[183083]: 2026-01-26 08:47:52.818 183087 DEBUG nova.network.neutron [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Successfully updated port: c49ab3d2-d1b7-4c99-b519-5b353e23bed5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:47:53 compute-1 nova_compute[183083]: 2026-01-26 08:47:53.102 183087 DEBUG nova.compute.manager [req-18943c69-ba0d-4151-aa1d-a3bdff7f798b req-bc4210c3-4326-4adb-89c2-f427efad0544 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Received event network-changed-c49ab3d2-d1b7-4c99-b519-5b353e23bed5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:47:53 compute-1 nova_compute[183083]: 2026-01-26 08:47:53.103 183087 DEBUG nova.compute.manager [req-18943c69-ba0d-4151-aa1d-a3bdff7f798b req-bc4210c3-4326-4adb-89c2-f427efad0544 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Refreshing instance network info cache due to event network-changed-c49ab3d2-d1b7-4c99-b519-5b353e23bed5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:47:53 compute-1 nova_compute[183083]: 2026-01-26 08:47:53.103 183087 DEBUG oslo_concurrency.lockutils [req-18943c69-ba0d-4151-aa1d-a3bdff7f798b req-bc4210c3-4326-4adb-89c2-f427efad0544 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-c77e8014-2e4b-48db-9cc2-dd37ccd4d528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:47:53 compute-1 nova_compute[183083]: 2026-01-26 08:47:53.103 183087 DEBUG oslo_concurrency.lockutils [req-18943c69-ba0d-4151-aa1d-a3bdff7f798b req-bc4210c3-4326-4adb-89c2-f427efad0544 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-c77e8014-2e4b-48db-9cc2-dd37ccd4d528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:47:53 compute-1 nova_compute[183083]: 2026-01-26 08:47:53.104 183087 DEBUG nova.network.neutron [req-18943c69-ba0d-4151-aa1d-a3bdff7f798b req-bc4210c3-4326-4adb-89c2-f427efad0544 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Refreshing network info cache for port c49ab3d2-d1b7-4c99-b519-5b353e23bed5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:47:53 compute-1 nova_compute[183083]: 2026-01-26 08:47:53.339 183087 DEBUG nova.network.neutron [req-18943c69-ba0d-4151-aa1d-a3bdff7f798b req-bc4210c3-4326-4adb-89c2-f427efad0544 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:47:53 compute-1 nova_compute[183083]: 2026-01-26 08:47:53.360 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:53 compute-1 nova_compute[183083]: 2026-01-26 08:47:53.639 183087 DEBUG nova.network.neutron [req-18943c69-ba0d-4151-aa1d-a3bdff7f798b req-bc4210c3-4326-4adb-89c2-f427efad0544 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:47:53 compute-1 nova_compute[183083]: 2026-01-26 08:47:53.658 183087 DEBUG oslo_concurrency.lockutils [req-18943c69-ba0d-4151-aa1d-a3bdff7f798b req-bc4210c3-4326-4adb-89c2-f427efad0544 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-c77e8014-2e4b-48db-9cc2-dd37ccd4d528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:47:53 compute-1 ovn_controller[95352]: 2026-01-26T08:47:53Z|00135|binding|INFO|Releasing lport 69612433-eca7-46c4-8771-8fe67d0df630 from this chassis (sb_readonly=0)
Jan 26 08:47:53 compute-1 ovn_controller[95352]: 2026-01-26T08:47:53Z|00136|binding|INFO|Releasing lport 2e241887-a928-4a33-98c2-6228ee06108e from this chassis (sb_readonly=0)
Jan 26 08:47:53 compute-1 nova_compute[183083]: 2026-01-26 08:47:53.910 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:54 compute-1 nova_compute[183083]: 2026-01-26 08:47:54.872 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:54 compute-1 nova_compute[183083]: 2026-01-26 08:47:54.933 183087 INFO nova.compute.manager [None req-a7cd7344-7e91-4e83-bab4-123190b7c820 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Get console output
Jan 26 08:47:55 compute-1 nova_compute[183083]: 2026-01-26 08:47:55.009 183087 DEBUG nova.network.neutron [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Successfully updated port: 0e2e51ba-3b1b-49c0-b24c-dd22c2cd2f07 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:47:55 compute-1 nova_compute[183083]: 2026-01-26 08:47:55.021 183087 DEBUG oslo_concurrency.lockutils [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Acquiring lock "refresh_cache-c77e8014-2e4b-48db-9cc2-dd37ccd4d528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:47:55 compute-1 nova_compute[183083]: 2026-01-26 08:47:55.021 183087 DEBUG oslo_concurrency.lockutils [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Acquired lock "refresh_cache-c77e8014-2e4b-48db-9cc2-dd37ccd4d528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:47:55 compute-1 nova_compute[183083]: 2026-01-26 08:47:55.021 183087 DEBUG nova.network.neutron [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:47:55 compute-1 nova_compute[183083]: 2026-01-26 08:47:55.259 183087 DEBUG nova.network.neutron [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:47:55 compute-1 nova_compute[183083]: 2026-01-26 08:47:55.286 183087 DEBUG nova.compute.manager [req-3a315de6-0cce-45a3-9c3d-6e6ebee5d5e9 req-7ce2600c-e20c-4787-ac2a-972f64a15662 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Received event network-changed-2fe37e18-bc67-418d-b6bb-db4ed40f6645 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:47:55 compute-1 nova_compute[183083]: 2026-01-26 08:47:55.286 183087 DEBUG nova.compute.manager [req-3a315de6-0cce-45a3-9c3d-6e6ebee5d5e9 req-7ce2600c-e20c-4787-ac2a-972f64a15662 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Refreshing instance network info cache due to event network-changed-2fe37e18-bc67-418d-b6bb-db4ed40f6645. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:47:55 compute-1 nova_compute[183083]: 2026-01-26 08:47:55.287 183087 DEBUG oslo_concurrency.lockutils [req-3a315de6-0cce-45a3-9c3d-6e6ebee5d5e9 req-7ce2600c-e20c-4787-ac2a-972f64a15662 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-5ea5b3e6-d6ee-4984-b938-32f34a5c3307" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:47:55 compute-1 nova_compute[183083]: 2026-01-26 08:47:55.287 183087 DEBUG oslo_concurrency.lockutils [req-3a315de6-0cce-45a3-9c3d-6e6ebee5d5e9 req-7ce2600c-e20c-4787-ac2a-972f64a15662 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-5ea5b3e6-d6ee-4984-b938-32f34a5c3307" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:47:55 compute-1 nova_compute[183083]: 2026-01-26 08:47:55.288 183087 DEBUG nova.network.neutron [req-3a315de6-0cce-45a3-9c3d-6e6ebee5d5e9 req-7ce2600c-e20c-4787-ac2a-972f64a15662 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Refreshing network info cache for port 2fe37e18-bc67-418d-b6bb-db4ed40f6645 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:47:55 compute-1 nova_compute[183083]: 2026-01-26 08:47:55.675 183087 DEBUG oslo_concurrency.lockutils [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Acquiring lock "162ac758-98af-43d0-ac84-2396124f1469" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:55 compute-1 nova_compute[183083]: 2026-01-26 08:47:55.676 183087 DEBUG oslo_concurrency.lockutils [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "162ac758-98af-43d0-ac84-2396124f1469" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:55 compute-1 nova_compute[183083]: 2026-01-26 08:47:55.703 183087 DEBUG nova.compute.manager [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:47:55 compute-1 nova_compute[183083]: 2026-01-26 08:47:55.793 183087 DEBUG oslo_concurrency.lockutils [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:55 compute-1 nova_compute[183083]: 2026-01-26 08:47:55.794 183087 DEBUG oslo_concurrency.lockutils [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:55 compute-1 nova_compute[183083]: 2026-01-26 08:47:55.798 183087 DEBUG nova.virt.hardware [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:47:55 compute-1 nova_compute[183083]: 2026-01-26 08:47:55.799 183087 INFO nova.compute.claims [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:47:55 compute-1 podman[214820]: 2026-01-26 08:47:55.815021426 +0000 UTC m=+0.075648518 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 08:47:55 compute-1 podman[214819]: 2026-01-26 08:47:55.857134481 +0000 UTC m=+0.113826871 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 26 08:47:55 compute-1 nova_compute[183083]: 2026-01-26 08:47:55.970 183087 DEBUG nova.compute.provider_tree [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:47:55 compute-1 nova_compute[183083]: 2026-01-26 08:47:55.987 183087 DEBUG nova.scheduler.client.report [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:47:56 compute-1 nova_compute[183083]: 2026-01-26 08:47:56.008 183087 DEBUG oslo_concurrency.lockutils [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:56 compute-1 nova_compute[183083]: 2026-01-26 08:47:56.008 183087 DEBUG nova.compute.manager [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:47:56 compute-1 nova_compute[183083]: 2026-01-26 08:47:56.077 183087 DEBUG nova.compute.manager [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:47:56 compute-1 nova_compute[183083]: 2026-01-26 08:47:56.078 183087 DEBUG nova.network.neutron [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:47:56 compute-1 nova_compute[183083]: 2026-01-26 08:47:56.102 183087 INFO nova.virt.libvirt.driver [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:47:56 compute-1 nova_compute[183083]: 2026-01-26 08:47:56.127 183087 DEBUG nova.compute.manager [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:47:56 compute-1 nova_compute[183083]: 2026-01-26 08:47:56.274 183087 DEBUG nova.compute.manager [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:47:56 compute-1 nova_compute[183083]: 2026-01-26 08:47:56.276 183087 DEBUG nova.virt.libvirt.driver [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:47:56 compute-1 nova_compute[183083]: 2026-01-26 08:47:56.277 183087 INFO nova.virt.libvirt.driver [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Creating image(s)
Jan 26 08:47:56 compute-1 nova_compute[183083]: 2026-01-26 08:47:56.278 183087 DEBUG oslo_concurrency.lockutils [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Acquiring lock "/var/lib/nova/instances/162ac758-98af-43d0-ac84-2396124f1469/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:56 compute-1 nova_compute[183083]: 2026-01-26 08:47:56.278 183087 DEBUG oslo_concurrency.lockutils [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "/var/lib/nova/instances/162ac758-98af-43d0-ac84-2396124f1469/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:56 compute-1 nova_compute[183083]: 2026-01-26 08:47:56.280 183087 DEBUG oslo_concurrency.lockutils [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "/var/lib/nova/instances/162ac758-98af-43d0-ac84-2396124f1469/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:56 compute-1 nova_compute[183083]: 2026-01-26 08:47:56.280 183087 DEBUG oslo_concurrency.lockutils [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:56 compute-1 nova_compute[183083]: 2026-01-26 08:47:56.281 183087 DEBUG oslo_concurrency.lockutils [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:56 compute-1 nova_compute[183083]: 2026-01-26 08:47:56.436 183087 DEBUG nova.policy [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a7abeebb4e4d469c91e6cee77f6be1c3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b71ae2b9d2fd454b8b3b9aa1a0e5c7e4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:47:56 compute-1 nova_compute[183083]: 2026-01-26 08:47:56.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.069 183087 DEBUG oslo_concurrency.lockutils [None req-a52d3d78-4798-4118-8c6f-a2689f73bb02 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Acquiring lock "5ea5b3e6-d6ee-4984-b938-32f34a5c3307" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.069 183087 DEBUG oslo_concurrency.lockutils [None req-a52d3d78-4798-4118-8c6f-a2689f73bb02 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Lock "5ea5b3e6-d6ee-4984-b938-32f34a5c3307" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.070 183087 DEBUG oslo_concurrency.lockutils [None req-a52d3d78-4798-4118-8c6f-a2689f73bb02 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Acquiring lock "5ea5b3e6-d6ee-4984-b938-32f34a5c3307-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.070 183087 DEBUG oslo_concurrency.lockutils [None req-a52d3d78-4798-4118-8c6f-a2689f73bb02 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Lock "5ea5b3e6-d6ee-4984-b938-32f34a5c3307-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.071 183087 DEBUG oslo_concurrency.lockutils [None req-a52d3d78-4798-4118-8c6f-a2689f73bb02 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Lock "5ea5b3e6-d6ee-4984-b938-32f34a5c3307-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.072 183087 INFO nova.compute.manager [None req-a52d3d78-4798-4118-8c6f-a2689f73bb02 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Terminating instance
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.074 183087 DEBUG nova.compute.manager [None req-a52d3d78-4798-4118-8c6f-a2689f73bb02 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:47:57 compute-1 kernel: tap2fe37e18-bc (unregistering): left promiscuous mode
Jan 26 08:47:57 compute-1 NetworkManager[55451]: <info>  [1769417277.1019] device (tap2fe37e18-bc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 08:47:57 compute-1 ovn_controller[95352]: 2026-01-26T08:47:57Z|00137|binding|INFO|Releasing lport 2fe37e18-bc67-418d-b6bb-db4ed40f6645 from this chassis (sb_readonly=0)
Jan 26 08:47:57 compute-1 ovn_controller[95352]: 2026-01-26T08:47:57Z|00138|binding|INFO|Setting lport 2fe37e18-bc67-418d-b6bb-db4ed40f6645 down in Southbound
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.119 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:57 compute-1 ovn_controller[95352]: 2026-01-26T08:47:57Z|00139|binding|INFO|Removing iface tap2fe37e18-bc ovn-installed in OVS
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.125 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:57 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:57.138 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:c5:e2 10.100.0.3'], port_security=['fa:16:3e:80:c5:e2 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-internal-dns-test-port-1268661230', 'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '5ea5b3e6-d6ee-4984-b938-32f34a5c3307', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a023433-18d0-4d94-b2a4-84ce73a46ce0', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-internal-dns-test-port-1268661230', 'neutron:project_id': '5cb899ce02444af2a1e102e390417350', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'af0d8f5a-e36f-4fde-9724-59b4ac44631c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.234'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ba9abea6-197c-4ec3-be8d-b9829bf5806a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=2fe37e18-bc67-418d-b6bb-db4ed40f6645) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:47:57 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:57.140 104632 INFO neutron.agent.ovn.metadata.agent [-] Port 2fe37e18-bc67-418d-b6bb-db4ed40f6645 in datapath 4a023433-18d0-4d94-b2a4-84ce73a46ce0 unbound from our chassis
Jan 26 08:47:57 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:57.146 104632 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4a023433-18d0-4d94-b2a4-84ce73a46ce0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 26 08:47:57 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:57.148 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[b3ce419e-0265-4e62-9c9c-ee3e66b181a6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:57 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:57.149 104632 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4a023433-18d0-4d94-b2a4-84ce73a46ce0 namespace which is not needed anymore
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.150 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:57 compute-1 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000017.scope: Deactivated successfully.
Jan 26 08:47:57 compute-1 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000017.scope: Consumed 11.878s CPU time.
Jan 26 08:47:57 compute-1 systemd-machined[154360]: Machine qemu-6-instance-00000017 terminated.
Jan 26 08:47:57 compute-1 neutron-haproxy-ovnmeta-4a023433-18d0-4d94-b2a4-84ce73a46ce0[214571]: [NOTICE]   (214588) : haproxy version is 2.8.14-c23fe91
Jan 26 08:47:57 compute-1 neutron-haproxy-ovnmeta-4a023433-18d0-4d94-b2a4-84ce73a46ce0[214571]: [NOTICE]   (214588) : path to executable is /usr/sbin/haproxy
Jan 26 08:47:57 compute-1 neutron-haproxy-ovnmeta-4a023433-18d0-4d94-b2a4-84ce73a46ce0[214571]: [WARNING]  (214588) : Exiting Master process...
Jan 26 08:47:57 compute-1 neutron-haproxy-ovnmeta-4a023433-18d0-4d94-b2a4-84ce73a46ce0[214571]: [ALERT]    (214588) : Current worker (214592) exited with code 143 (Terminated)
Jan 26 08:47:57 compute-1 neutron-haproxy-ovnmeta-4a023433-18d0-4d94-b2a4-84ce73a46ce0[214571]: [WARNING]  (214588) : All workers exited. Exiting... (0)
Jan 26 08:47:57 compute-1 systemd[1]: libpod-05e03eb6feac7543d278786e8fb3f24786f5f6a1f1d77223817078bc39271957.scope: Deactivated successfully.
Jan 26 08:47:57 compute-1 podman[214889]: 2026-01-26 08:47:57.329516147 +0000 UTC m=+0.062750901 container died 05e03eb6feac7543d278786e8fb3f24786f5f6a1f1d77223817078bc39271957 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a023433-18d0-4d94-b2a4-84ce73a46ce0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.340 183087 INFO nova.virt.libvirt.driver [-] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Instance destroyed successfully.
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.341 183087 DEBUG nova.objects.instance [None req-a52d3d78-4798-4118-8c6f-a2689f73bb02 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Lazy-loading 'resources' on Instance uuid 5ea5b3e6-d6ee-4984-b938-32f34a5c3307 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:47:57 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-05e03eb6feac7543d278786e8fb3f24786f5f6a1f1d77223817078bc39271957-userdata-shm.mount: Deactivated successfully.
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.358 183087 DEBUG nova.virt.libvirt.vif [None req-a52d3d78-4798-4118-8c6f-a2689f73bb02 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T08:47:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-internal-dns-test-vm-147244364',display_name='tempest-internal-dns-test-vm-147244364',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-internal-dns-test-vm-147244364',id=23,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNwasPBLZFj5TPdNWQK0zigHHODYc4f/t8DVnPh2+8sHe1pRQgl5SU5AsNYCK+hfXouKyyxOmTCGDhQxnfKOl2CR4mUnhy56u5TB5HoFqRICBu27dyxnXxwslEVaAoSfxw==',key_name='tempest-internal-dns-test-shared-keypair-1237484570',keypairs=<?>,launch_index=0,launched_at=2026-01-26T08:47:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5cb899ce02444af2a1e102e390417350',ramdisk_id='',reservation_id='r-ibp6cyse',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-InternalDNSTestOvn-655560763',owner_user_name='tempest-InternalDNSTestOvn-655560763-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T08:47:39Z,user_data=None,user_id='6b6bdb69392c4caba079d21935c2fe08',uuid=5ea5b3e6-d6ee-4984-b938-32f34a5c3307,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2fe37e18-bc67-418d-b6bb-db4ed40f6645", "address": "fa:16:3e:80:c5:e2", "network": {"id": "4a023433-18d0-4d94-b2a4-84ce73a46ce0", "bridge": "br-int", "label": "tempest-internal-dns-test-shared-network-1072359363", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5cb899ce02444af2a1e102e390417350", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fe37e18-bc", "ovs_interfaceid": "2fe37e18-bc67-418d-b6bb-db4ed40f6645", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.358 183087 DEBUG nova.network.os_vif_util [None req-a52d3d78-4798-4118-8c6f-a2689f73bb02 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Converting VIF {"id": "2fe37e18-bc67-418d-b6bb-db4ed40f6645", "address": "fa:16:3e:80:c5:e2", "network": {"id": "4a023433-18d0-4d94-b2a4-84ce73a46ce0", "bridge": "br-int", "label": "tempest-internal-dns-test-shared-network-1072359363", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5cb899ce02444af2a1e102e390417350", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fe37e18-bc", "ovs_interfaceid": "2fe37e18-bc67-418d-b6bb-db4ed40f6645", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.359 183087 DEBUG nova.network.os_vif_util [None req-a52d3d78-4798-4118-8c6f-a2689f73bb02 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:c5:e2,bridge_name='br-int',has_traffic_filtering=True,id=2fe37e18-bc67-418d-b6bb-db4ed40f6645,network=Network(4a023433-18d0-4d94-b2a4-84ce73a46ce0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap2fe37e18-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.359 183087 DEBUG os_vif [None req-a52d3d78-4798-4118-8c6f-a2689f73bb02 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:c5:e2,bridge_name='br-int',has_traffic_filtering=True,id=2fe37e18-bc67-418d-b6bb-db4ed40f6645,network=Network(4a023433-18d0-4d94-b2a4-84ce73a46ce0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap2fe37e18-bc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.362 183087 DEBUG nova.network.neutron [req-3a315de6-0cce-45a3-9c3d-6e6ebee5d5e9 req-7ce2600c-e20c-4787-ac2a-972f64a15662 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Updated VIF entry in instance network info cache for port 2fe37e18-bc67-418d-b6bb-db4ed40f6645. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.362 183087 DEBUG nova.network.neutron [req-3a315de6-0cce-45a3-9c3d-6e6ebee5d5e9 req-7ce2600c-e20c-4787-ac2a-972f64a15662 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Updating instance_info_cache with network_info: [{"id": "2fe37e18-bc67-418d-b6bb-db4ed40f6645", "address": "fa:16:3e:80:c5:e2", "network": {"id": "4a023433-18d0-4d94-b2a4-84ce73a46ce0", "bridge": "br-int", "label": "tempest-internal-dns-test-shared-network-1072359363", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5cb899ce02444af2a1e102e390417350", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fe37e18-bc", "ovs_interfaceid": "2fe37e18-bc67-418d-b6bb-db4ed40f6645", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:47:57 compute-1 systemd[1]: var-lib-containers-storage-overlay-4691f3920eaae2c58c6535349688ac71bad6defb9eea29f6c76ef492147dd356-merged.mount: Deactivated successfully.
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.364 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.364 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2fe37e18-bc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.366 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.367 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.370 183087 INFO os_vif [None req-a52d3d78-4798-4118-8c6f-a2689f73bb02 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:c5:e2,bridge_name='br-int',has_traffic_filtering=True,id=2fe37e18-bc67-418d-b6bb-db4ed40f6645,network=Network(4a023433-18d0-4d94-b2a4-84ce73a46ce0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap2fe37e18-bc')
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.370 183087 INFO nova.virt.libvirt.driver [None req-a52d3d78-4798-4118-8c6f-a2689f73bb02 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Deleting instance files /var/lib/nova/instances/5ea5b3e6-d6ee-4984-b938-32f34a5c3307_del
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.371 183087 INFO nova.virt.libvirt.driver [None req-a52d3d78-4798-4118-8c6f-a2689f73bb02 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Deletion of /var/lib/nova/instances/5ea5b3e6-d6ee-4984-b938-32f34a5c3307_del complete
Jan 26 08:47:57 compute-1 podman[214889]: 2026-01-26 08:47:57.374183705 +0000 UTC m=+0.107418459 container cleanup 05e03eb6feac7543d278786e8fb3f24786f5f6a1f1d77223817078bc39271957 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a023433-18d0-4d94-b2a4-84ce73a46ce0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 26 08:47:57 compute-1 systemd[1]: libpod-conmon-05e03eb6feac7543d278786e8fb3f24786f5f6a1f1d77223817078bc39271957.scope: Deactivated successfully.
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.409 183087 DEBUG oslo_concurrency.lockutils [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Instance failed to spawn: nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469] Traceback (most recent call last):
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 170, in do_image_deep_inspection
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469]     raise exception.ImageUnacceptable(
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image content does not match disk_format
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469] 
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469] During handling of the above exception, another exception occurred:
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469] 
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469] Traceback (most recent call last):
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469]     yield resources
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469]     self.driver.spawn(context, instance, image_meta,
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4384, in spawn
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469]     created_instance_dir, created_disks = self._create_image(
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4786, in _create_image
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469]     created_disks = self._create_and_inject_local_root(
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4917, in _create_and_inject_local_root
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469]     self._try_fetch_image_cache(backend, fetch_func, context,
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 10963, in _try_fetch_image_cache
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469]     image.cache(fetch_func=fetch_func,
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 288, in cache
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469]     self.create_image(fetch_func_sync, base, size,
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 663, in create_image
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469]     prepare_template(target=base, *args, **kwargs)
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469]   File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469]     return f(*args, **kwargs)
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 285, in fetch_func_sync
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469]     fetch_func(target=target, *args, **kwargs)
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/utils.py", line 482, in fetch_image
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469]     images.fetch_to_raw(context, image_id, target, trusted_certs)
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 199, in fetch_to_raw
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469]     force_format = do_image_deep_inspection(img, image_href, path_tmp)
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 180, in do_image_deep_inspection
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469]     raise exception.ImageUnacceptable(
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.410 183087 ERROR nova.compute.manager [instance: 162ac758-98af-43d0-ac84-2396124f1469] 
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.412 183087 DEBUG oslo_concurrency.lockutils [req-3a315de6-0cce-45a3-9c3d-6e6ebee5d5e9 req-7ce2600c-e20c-4787-ac2a-972f64a15662 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-5ea5b3e6-d6ee-4984-b938-32f34a5c3307" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.413 183087 DEBUG nova.compute.manager [req-3a315de6-0cce-45a3-9c3d-6e6ebee5d5e9 req-7ce2600c-e20c-4787-ac2a-972f64a15662 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Received event network-changed-0e2e51ba-3b1b-49c0-b24c-dd22c2cd2f07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.413 183087 DEBUG nova.compute.manager [req-3a315de6-0cce-45a3-9c3d-6e6ebee5d5e9 req-7ce2600c-e20c-4787-ac2a-972f64a15662 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Refreshing instance network info cache due to event network-changed-0e2e51ba-3b1b-49c0-b24c-dd22c2cd2f07. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.413 183087 DEBUG oslo_concurrency.lockutils [req-3a315de6-0cce-45a3-9c3d-6e6ebee5d5e9 req-7ce2600c-e20c-4787-ac2a-972f64a15662 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-c77e8014-2e4b-48db-9cc2-dd37ccd4d528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.430 183087 INFO nova.compute.manager [None req-a52d3d78-4798-4118-8c6f-a2689f73bb02 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Took 0.36 seconds to destroy the instance on the hypervisor.
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.432 183087 DEBUG oslo.service.loopingcall [None req-a52d3d78-4798-4118-8c6f-a2689f73bb02 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.432 183087 DEBUG nova.compute.manager [-] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.433 183087 DEBUG nova.network.neutron [-] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:47:57 compute-1 podman[214935]: 2026-01-26 08:47:57.443847472 +0000 UTC m=+0.045132612 container remove 05e03eb6feac7543d278786e8fb3f24786f5f6a1f1d77223817078bc39271957 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a023433-18d0-4d94-b2a4-84ce73a46ce0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 08:47:57 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:57.451 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[c110499e-f135-4fbe-9d79-f655309de808]: (4, ('Mon Jan 26 08:47:57 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-4a023433-18d0-4d94-b2a4-84ce73a46ce0 (05e03eb6feac7543d278786e8fb3f24786f5f6a1f1d77223817078bc39271957)\n05e03eb6feac7543d278786e8fb3f24786f5f6a1f1d77223817078bc39271957\nMon Jan 26 08:47:57 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-4a023433-18d0-4d94-b2a4-84ce73a46ce0 (05e03eb6feac7543d278786e8fb3f24786f5f6a1f1d77223817078bc39271957)\n05e03eb6feac7543d278786e8fb3f24786f5f6a1f1d77223817078bc39271957\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:57 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:57.453 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[f344c48a-2da4-4955-aaf2-b14529d3f294]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:57 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:57.454 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a023433-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.456 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:57 compute-1 kernel: tap4a023433-10: left promiscuous mode
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.459 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:57 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:57.464 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[710aa5ab-fc67-4fc9-8fec-cac050985087]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:57 compute-1 nova_compute[183083]: 2026-01-26 08:47:57.476 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:57 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:57.490 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[349de7f5-31ee-4135-934d-b2b993b7d9fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:57 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:57.493 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[57865e63-eda1-4033-9e42-d90818ba9a8f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:57 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:57.515 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[e689ae3f-032f-47a3-a04c-c3a64ee39846]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358714, 'reachable_time': 25548, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214950, 'error': None, 'target': 'ovnmeta-4a023433-18d0-4d94-b2a4-84ce73a46ce0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:57 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:57.518 105024 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4a023433-18d0-4d94-b2a4-84ce73a46ce0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 26 08:47:57 compute-1 systemd[1]: run-netns-ovnmeta\x2d4a023433\x2d18d0\x2d4d94\x2db2a4\x2d84ce73a46ce0.mount: Deactivated successfully.
Jan 26 08:47:57 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:47:57.518 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[f1e18d81-965b-4fca-a3ed-f6a4673a95d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:47:57 compute-1 podman[214952]: 2026-01-26 08:47:57.607142645 +0000 UTC m=+0.057647657 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 26 08:47:58 compute-1 ovn_controller[95352]: 2026-01-26T08:47:58Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:90:46:e8 10.100.0.6
Jan 26 08:47:58 compute-1 ovn_controller[95352]: 2026-01-26T08:47:58Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:90:46:e8 10.100.0.6
Jan 26 08:47:58 compute-1 nova_compute[183083]: 2026-01-26 08:47:58.362 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:58 compute-1 nova_compute[183083]: 2026-01-26 08:47:58.866 183087 DEBUG nova.network.neutron [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Successfully created port: 6c1aa358-176f-4add-88ef-d794ee47af72 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 08:47:58 compute-1 nova_compute[183083]: 2026-01-26 08:47:58.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:47:58 compute-1 nova_compute[183083]: 2026-01-26 08:47:58.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.098 183087 DEBUG nova.network.neutron [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Updating instance_info_cache with network_info: [{"id": "c49ab3d2-d1b7-4c99-b519-5b353e23bed5", "address": "fa:16:3e:39:dc:48", "network": {"id": "65753891-6165-49db-aedc-996760bb86fa", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc49ab3d2-d1", "ovs_interfaceid": "c49ab3d2-d1b7-4c99-b519-5b353e23bed5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "0e2e51ba-3b1b-49c0-b24c-dd22c2cd2f07", "address": "fa:16:3e:2f:c0:f1", "network": {"id": "4d641e7c-1eab-4ddd-b2cb-cd83c8b75757", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::153", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e2e51ba-3b", "ovs_interfaceid": "0e2e51ba-3b1b-49c0-b24c-dd22c2cd2f07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.172 183087 DEBUG oslo_concurrency.lockutils [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Releasing lock "refresh_cache-c77e8014-2e4b-48db-9cc2-dd37ccd4d528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.172 183087 DEBUG nova.compute.manager [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Instance network_info: |[{"id": "c49ab3d2-d1b7-4c99-b519-5b353e23bed5", "address": "fa:16:3e:39:dc:48", "network": {"id": "65753891-6165-49db-aedc-996760bb86fa", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc49ab3d2-d1", "ovs_interfaceid": "c49ab3d2-d1b7-4c99-b519-5b353e23bed5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "0e2e51ba-3b1b-49c0-b24c-dd22c2cd2f07", "address": "fa:16:3e:2f:c0:f1", "network": {"id": "4d641e7c-1eab-4ddd-b2cb-cd83c8b75757", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::153", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e2e51ba-3b", "ovs_interfaceid": "0e2e51ba-3b1b-49c0-b24c-dd22c2cd2f07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.173 183087 DEBUG oslo_concurrency.lockutils [req-3a315de6-0cce-45a3-9c3d-6e6ebee5d5e9 req-7ce2600c-e20c-4787-ac2a-972f64a15662 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-c77e8014-2e4b-48db-9cc2-dd37ccd4d528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.173 183087 DEBUG nova.network.neutron [req-3a315de6-0cce-45a3-9c3d-6e6ebee5d5e9 req-7ce2600c-e20c-4787-ac2a-972f64a15662 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Refreshing network info cache for port 0e2e51ba-3b1b-49c0-b24c-dd22c2cd2f07 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.175 183087 INFO nova.compute.manager [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Terminating instance
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.178 183087 DEBUG nova.compute.manager [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.182 183087 DEBUG nova.virt.libvirt.driver [-] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.183 183087 INFO nova.virt.libvirt.driver [-] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Instance destroyed successfully.
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.184 183087 DEBUG nova.virt.libvirt.vif [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:47:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='server-tempest-MultiPortVlanTransparencyTest-405827926-0',display_name='server-tempest-MultiPortVlanTransparencyTest-405827926-0',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='server-tempest-multiportvlantransparencytest-405827926-0',id=29,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLNC0niFIeKCfmMP15K31NI0Yt2/3arIz4qkOmJrDjDcT91Y+Z8t7cONdYZJsdWgcENTB4qxVqhqRFnFV/+0btrVE9vH3aNVT4hvrQf4Wf4VfrioZkKAghvtRLwxtntaPQ==',key_name='tempest-MultiPortVlanTransparencyTest-405827926',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9680eb9addb04171b834f7dff8ec602d',ramdisk_id='',reservation_id='r-0kh0eqj0',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MultiPortVlanTransparencyTest-179273341',owner_user_name='tempest-MultiPortVlanTransparencyTest-179273341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:47:48Z,user_data=None,user_id='10286fb05759476d8b51274239727a28',uuid=c77e8014-2e4b-48db-9cc2-dd37ccd4d528,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c49ab3d2-d1b7-4c99-b519-5b353e23bed5", "address": "fa:16:3e:39:dc:48", "network": {"id": "65753891-6165-49db-aedc-996760bb86fa", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", 
"version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc49ab3d2-d1", "ovs_interfaceid": "c49ab3d2-d1b7-4c99-b519-5b353e23bed5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.184 183087 DEBUG nova.network.os_vif_util [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Converting VIF {"id": "c49ab3d2-d1b7-4c99-b519-5b353e23bed5", "address": "fa:16:3e:39:dc:48", "network": {"id": "65753891-6165-49db-aedc-996760bb86fa", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc49ab3d2-d1", "ovs_interfaceid": "c49ab3d2-d1b7-4c99-b519-5b353e23bed5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.185 183087 DEBUG nova.network.os_vif_util [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:dc:48,bridge_name='br-int',has_traffic_filtering=True,id=c49ab3d2-d1b7-4c99-b519-5b353e23bed5,network=Network(65753891-6165-49db-aedc-996760bb86fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc49ab3d2-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.186 183087 DEBUG os_vif [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:dc:48,bridge_name='br-int',has_traffic_filtering=True,id=c49ab3d2-d1b7-4c99-b519-5b353e23bed5,network=Network(65753891-6165-49db-aedc-996760bb86fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc49ab3d2-d1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.188 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.188 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc49ab3d2-d1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.189 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.192 183087 INFO os_vif [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:dc:48,bridge_name='br-int',has_traffic_filtering=True,id=c49ab3d2-d1b7-4c99-b519-5b353e23bed5,network=Network(65753891-6165-49db-aedc-996760bb86fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc49ab3d2-d1')
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.194 183087 DEBUG nova.virt.libvirt.vif [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:47:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='server-tempest-MultiPortVlanTransparencyTest-405827926-0',display_name='server-tempest-MultiPortVlanTransparencyTest-405827926-0',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='server-tempest-multiportvlantransparencytest-405827926-0',id=29,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLNC0niFIeKCfmMP15K31NI0Yt2/3arIz4qkOmJrDjDcT91Y+Z8t7cONdYZJsdWgcENTB4qxVqhqRFnFV/+0btrVE9vH3aNVT4hvrQf4Wf4VfrioZkKAghvtRLwxtntaPQ==',key_name='tempest-MultiPortVlanTransparencyTest-405827926',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9680eb9addb04171b834f7dff8ec602d',ramdisk_id='',reservation_id='r-0kh0eqj0',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MultiPortVlanTransparencyTest-179273341',owner_user_name='tempest-MultiPortVlanTransparencyTest-179273341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:47:48Z,user_data=None,user_id='10286fb05759476d8b51274239727a28',uuid=c77e8014-2e4b-48db-9cc2-dd37ccd4d528,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0e2e51ba-3b1b-49c0-b24c-dd22c2cd2f07", "address": "fa:16:3e:2f:c0:f1", "network": {"id": "4d641e7c-1eab-4ddd-b2cb-cd83c8b75757", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": 
"gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::153", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e2e51ba-3b", "ovs_interfaceid": "0e2e51ba-3b1b-49c0-b24c-dd22c2cd2f07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.195 183087 DEBUG nova.network.os_vif_util [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Converting VIF {"id": "0e2e51ba-3b1b-49c0-b24c-dd22c2cd2f07", "address": "fa:16:3e:2f:c0:f1", "network": {"id": "4d641e7c-1eab-4ddd-b2cb-cd83c8b75757", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::153", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e2e51ba-3b", "ovs_interfaceid": "0e2e51ba-3b1b-49c0-b24c-dd22c2cd2f07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.196 183087 DEBUG nova.network.os_vif_util [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2f:c0:f1,bridge_name='br-int',has_traffic_filtering=True,id=0e2e51ba-3b1b-49c0-b24c-dd22c2cd2f07,network=Network(4d641e7c-1eab-4ddd-b2cb-cd83c8b75757),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0e2e51ba-3b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.196 183087 DEBUG os_vif [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:c0:f1,bridge_name='br-int',has_traffic_filtering=True,id=0e2e51ba-3b1b-49c0-b24c-dd22c2cd2f07,network=Network(4d641e7c-1eab-4ddd-b2cb-cd83c8b75757),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0e2e51ba-3b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.198 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.199 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0e2e51ba-3b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.199 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.203 183087 INFO os_vif [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:c0:f1,bridge_name='br-int',has_traffic_filtering=True,id=0e2e51ba-3b1b-49c0-b24c-dd22c2cd2f07,network=Network(4d641e7c-1eab-4ddd-b2cb-cd83c8b75757),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0e2e51ba-3b')
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.204 183087 INFO nova.virt.libvirt.driver [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Deleting instance files /var/lib/nova/instances/c77e8014-2e4b-48db-9cc2-dd37ccd4d528_del
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.204 183087 INFO nova.virt.libvirt.driver [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Deletion of /var/lib/nova/instances/c77e8014-2e4b-48db-9cc2-dd37ccd4d528_del complete
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.286 183087 INFO nova.compute.manager [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Took 0.11 seconds to destroy the instance on the hypervisor.
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.288 183087 DEBUG nova.compute.claims [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Aborting claim: <nova.compute.claims.Claim object at 0x7f6c981262b0> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.288 183087 DEBUG oslo_concurrency.lockutils [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.288 183087 DEBUG oslo_concurrency.lockutils [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.577 183087 DEBUG nova.compute.provider_tree [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.603 183087 DEBUG nova.scheduler.client.report [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.626 183087 DEBUG oslo_concurrency.lockutils [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.337s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.627 183087 DEBUG nova.compute.utils [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.627 183087 ERROR nova.compute.manager [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Build of instance c77e8014-2e4b-48db-9cc2-dd37ccd4d528 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format: nova.exception.BuildAbortException: Build of instance c77e8014-2e4b-48db-9cc2-dd37ccd4d528 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.628 183087 DEBUG nova.compute.manager [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.629 183087 DEBUG nova.virt.libvirt.vif [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-26T08:47:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='server-tempest-MultiPortVlanTransparencyTest-405827926-0',display_name='server-tempest-MultiPortVlanTransparencyTest-405827926-0',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host=None,hostname='server-tempest-multiportvlantransparencytest-405827926-0',id=29,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLNC0niFIeKCfmMP15K31NI0Yt2/3arIz4qkOmJrDjDcT91Y+Z8t7cONdYZJsdWgcENTB4qxVqhqRFnFV/+0btrVE9vH3aNVT4hvrQf4Wf4VfrioZkKAghvtRLwxtntaPQ==',key_name='tempest-MultiPortVlanTransparencyTest-405827926',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node=None,numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9680eb9addb04171b834f7dff8ec602d',ramdisk_id='',reservation_id='r-0kh0eqj0',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MultiPortVlanTransparencyTest-179273341',owner_user_name='tempest-MultiPortVlanTransparencyTest-179273341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:47:59Z,user_data=None,user_id='10286fb05759476d8b51274239727a28',uuid=c77e8014-2e4b-48db-9cc2-dd37ccd4d528,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c49ab3d2-d1b7-4c99-b519-5b353e23bed5", "address": "fa:16:3e:39:dc:48", "network": {"id": "65753891-6165-49db-aedc-996760bb86fa", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc49ab3d2-d1", "ovs_interfaceid": "c49ab3d2-d1b7-4c99-b519-5b353e23bed5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.629 183087 DEBUG nova.network.os_vif_util [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Converting VIF {"id": "c49ab3d2-d1b7-4c99-b519-5b353e23bed5", "address": "fa:16:3e:39:dc:48", "network": {"id": "65753891-6165-49db-aedc-996760bb86fa", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc49ab3d2-d1", "ovs_interfaceid": "c49ab3d2-d1b7-4c99-b519-5b353e23bed5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.630 183087 DEBUG nova.network.os_vif_util [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:dc:48,bridge_name='br-int',has_traffic_filtering=True,id=c49ab3d2-d1b7-4c99-b519-5b353e23bed5,network=Network(65753891-6165-49db-aedc-996760bb86fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc49ab3d2-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.630 183087 DEBUG os_vif [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:dc:48,bridge_name='br-int',has_traffic_filtering=True,id=c49ab3d2-d1b7-4c99-b519-5b353e23bed5,network=Network(65753891-6165-49db-aedc-996760bb86fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc49ab3d2-d1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.631 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.632 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc49ab3d2-d1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.632 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.634 183087 INFO os_vif [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:dc:48,bridge_name='br-int',has_traffic_filtering=True,id=c49ab3d2-d1b7-4c99-b519-5b353e23bed5,network=Network(65753891-6165-49db-aedc-996760bb86fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc49ab3d2-d1')
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.635 183087 DEBUG nova.virt.libvirt.vif [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-26T08:47:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='server-tempest-MultiPortVlanTransparencyTest-405827926-0',display_name='server-tempest-MultiPortVlanTransparencyTest-405827926-0',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host=None,hostname='server-tempest-multiportvlantransparencytest-405827926-0',id=29,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLNC0niFIeKCfmMP15K31NI0Yt2/3arIz4qkOmJrDjDcT91Y+Z8t7cONdYZJsdWgcENTB4qxVqhqRFnFV/+0btrVE9vH3aNVT4hvrQf4Wf4VfrioZkKAghvtRLwxtntaPQ==',key_name='tempest-MultiPortVlanTransparencyTest-405827926',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node=None,numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9680eb9addb04171b834f7dff8ec602d',ramdisk_id='',reservation_id='r-0kh0eqj0',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MultiPortVlanTransparencyTest-179273341',owner_user_name='tempest-MultiPortVlanTransparencyTest-179273341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:47:59Z,user_data=None,user_id='10286fb05759476d8b51274239727a28',uuid=c77e8014-2e4b-48db-9cc2-dd37ccd4d528,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0e2e51ba-3b1b-49c0-b24c-dd22c2cd2f07", "address": "fa:16:3e:2f:c0:f1", "network": {"id": "4d641e7c-1eab-4ddd-b2cb-cd83c8b75757", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", 
"version": 6, "meta": {}}, "ips": [{"address": "2001:db8::153", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e2e51ba-3b", "ovs_interfaceid": "0e2e51ba-3b1b-49c0-b24c-dd22c2cd2f07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.635 183087 DEBUG nova.network.os_vif_util [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Converting VIF {"id": "0e2e51ba-3b1b-49c0-b24c-dd22c2cd2f07", "address": "fa:16:3e:2f:c0:f1", "network": {"id": "4d641e7c-1eab-4ddd-b2cb-cd83c8b75757", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::153", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e2e51ba-3b", "ovs_interfaceid": "0e2e51ba-3b1b-49c0-b24c-dd22c2cd2f07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.635 183087 DEBUG nova.network.os_vif_util [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2f:c0:f1,bridge_name='br-int',has_traffic_filtering=True,id=0e2e51ba-3b1b-49c0-b24c-dd22c2cd2f07,network=Network(4d641e7c-1eab-4ddd-b2cb-cd83c8b75757),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0e2e51ba-3b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.636 183087 DEBUG os_vif [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:c0:f1,bridge_name='br-int',has_traffic_filtering=True,id=0e2e51ba-3b1b-49c0-b24c-dd22c2cd2f07,network=Network(4d641e7c-1eab-4ddd-b2cb-cd83c8b75757),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0e2e51ba-3b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.637 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.637 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0e2e51ba-3b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.637 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.638 183087 INFO os_vif [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:c0:f1,bridge_name='br-int',has_traffic_filtering=True,id=0e2e51ba-3b1b-49c0-b24c-dd22c2cd2f07,network=Network(4d641e7c-1eab-4ddd-b2cb-cd83c8b75757),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0e2e51ba-3b')
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.639 183087 DEBUG nova.compute.manager [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.639 183087 DEBUG nova.compute.manager [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.639 183087 DEBUG nova.network.neutron [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.874 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.951 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.979 183087 DEBUG nova.network.neutron [-] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.987 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 26 08:47:59 compute-1 nova_compute[183083]: 2026-01-26 08:47:59.988 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Jan 26 08:48:00 compute-1 nova_compute[183083]: 2026-01-26 08:48:00.000 183087 INFO nova.compute.manager [-] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Took 2.57 seconds to deallocate network for instance.
Jan 26 08:48:00 compute-1 nova_compute[183083]: 2026-01-26 08:48:00.059 183087 DEBUG oslo_concurrency.lockutils [None req-a52d3d78-4798-4118-8c6f-a2689f73bb02 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:00 compute-1 nova_compute[183083]: 2026-01-26 08:48:00.060 183087 DEBUG oslo_concurrency.lockutils [None req-a52d3d78-4798-4118-8c6f-a2689f73bb02 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:00 compute-1 nova_compute[183083]: 2026-01-26 08:48:00.188 183087 DEBUG nova.compute.provider_tree [None req-a52d3d78-4798-4118-8c6f-a2689f73bb02 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:48:00 compute-1 nova_compute[183083]: 2026-01-26 08:48:00.208 183087 DEBUG nova.scheduler.client.report [None req-a52d3d78-4798-4118-8c6f-a2689f73bb02 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:48:00 compute-1 nova_compute[183083]: 2026-01-26 08:48:00.231 183087 DEBUG oslo_concurrency.lockutils [None req-a52d3d78-4798-4118-8c6f-a2689f73bb02 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:00 compute-1 nova_compute[183083]: 2026-01-26 08:48:00.253 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "refresh_cache-addae953-8eb4-46ed-959d-3c2bb6b31ee3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:48:00 compute-1 nova_compute[183083]: 2026-01-26 08:48:00.253 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquired lock "refresh_cache-addae953-8eb4-46ed-959d-3c2bb6b31ee3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:48:00 compute-1 nova_compute[183083]: 2026-01-26 08:48:00.253 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 26 08:48:00 compute-1 nova_compute[183083]: 2026-01-26 08:48:00.254 183087 DEBUG nova.objects.instance [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lazy-loading 'info_cache' on Instance uuid addae953-8eb4-46ed-959d-3c2bb6b31ee3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:48:00 compute-1 nova_compute[183083]: 2026-01-26 08:48:00.283 183087 INFO nova.scheduler.client.report [None req-a52d3d78-4798-4118-8c6f-a2689f73bb02 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Deleted allocations for instance 5ea5b3e6-d6ee-4984-b938-32f34a5c3307
Jan 26 08:48:00 compute-1 nova_compute[183083]: 2026-01-26 08:48:00.367 183087 DEBUG oslo_concurrency.lockutils [None req-a52d3d78-4798-4118-8c6f-a2689f73bb02 6b6bdb69392c4caba079d21935c2fe08 5cb899ce02444af2a1e102e390417350 - - default default] Lock "5ea5b3e6-d6ee-4984-b938-32f34a5c3307" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.298s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:00 compute-1 nova_compute[183083]: 2026-01-26 08:48:00.897 183087 DEBUG nova.network.neutron [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Successfully updated port: 6c1aa358-176f-4add-88ef-d794ee47af72 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:48:00 compute-1 nova_compute[183083]: 2026-01-26 08:48:00.924 183087 DEBUG oslo_concurrency.lockutils [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Acquiring lock "refresh_cache-162ac758-98af-43d0-ac84-2396124f1469" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:48:00 compute-1 nova_compute[183083]: 2026-01-26 08:48:00.925 183087 DEBUG oslo_concurrency.lockutils [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Acquired lock "refresh_cache-162ac758-98af-43d0-ac84-2396124f1469" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:48:00 compute-1 nova_compute[183083]: 2026-01-26 08:48:00.925 183087 DEBUG nova.network.neutron [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:48:01 compute-1 nova_compute[183083]: 2026-01-26 08:48:01.443 183087 INFO nova.compute.manager [None req-318b4d3f-e016-49ac-91f6-a06ad69527e8 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Get console output
Jan 26 08:48:01 compute-1 nova_compute[183083]: 2026-01-26 08:48:01.450 212375 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 26 08:48:01 compute-1 nova_compute[183083]: 2026-01-26 08:48:01.575 183087 DEBUG nova.network.neutron [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:48:02 compute-1 nova_compute[183083]: 2026-01-26 08:48:02.205 183087 DEBUG nova.network.neutron [req-3a315de6-0cce-45a3-9c3d-6e6ebee5d5e9 req-7ce2600c-e20c-4787-ac2a-972f64a15662 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Updated VIF entry in instance network info cache for port 0e2e51ba-3b1b-49c0-b24c-dd22c2cd2f07. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:48:02 compute-1 nova_compute[183083]: 2026-01-26 08:48:02.207 183087 DEBUG nova.network.neutron [req-3a315de6-0cce-45a3-9c3d-6e6ebee5d5e9 req-7ce2600c-e20c-4787-ac2a-972f64a15662 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Updating instance_info_cache with network_info: [{"id": "c49ab3d2-d1b7-4c99-b519-5b353e23bed5", "address": "fa:16:3e:39:dc:48", "network": {"id": "65753891-6165-49db-aedc-996760bb86fa", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc49ab3d2-d1", "ovs_interfaceid": "c49ab3d2-d1b7-4c99-b519-5b353e23bed5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "0e2e51ba-3b1b-49c0-b24c-dd22c2cd2f07", "address": "fa:16:3e:2f:c0:f1", "network": {"id": "4d641e7c-1eab-4ddd-b2cb-cd83c8b75757", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::153", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": 
"9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e2e51ba-3b", "ovs_interfaceid": "0e2e51ba-3b1b-49c0-b24c-dd22c2cd2f07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:48:02 compute-1 nova_compute[183083]: 2026-01-26 08:48:02.234 183087 DEBUG nova.compute.manager [req-a9878c70-e413-4d20-8a1e-dbe0c6e5185f req-0e8df7d2-42a5-4271-bbe7-2cb48c1c0436 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Received event network-changed-6c1aa358-176f-4add-88ef-d794ee47af72 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:48:02 compute-1 nova_compute[183083]: 2026-01-26 08:48:02.234 183087 DEBUG nova.compute.manager [req-a9878c70-e413-4d20-8a1e-dbe0c6e5185f req-0e8df7d2-42a5-4271-bbe7-2cb48c1c0436 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Refreshing instance network info cache due to event network-changed-6c1aa358-176f-4add-88ef-d794ee47af72. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:48:02 compute-1 nova_compute[183083]: 2026-01-26 08:48:02.234 183087 DEBUG oslo_concurrency.lockutils [req-a9878c70-e413-4d20-8a1e-dbe0c6e5185f req-0e8df7d2-42a5-4271-bbe7-2cb48c1c0436 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-162ac758-98af-43d0-ac84-2396124f1469" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:48:02 compute-1 nova_compute[183083]: 2026-01-26 08:48:02.243 183087 DEBUG oslo_concurrency.lockutils [req-3a315de6-0cce-45a3-9c3d-6e6ebee5d5e9 req-7ce2600c-e20c-4787-ac2a-972f64a15662 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-c77e8014-2e4b-48db-9cc2-dd37ccd4d528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:48:02 compute-1 nova_compute[183083]: 2026-01-26 08:48:02.399 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:02 compute-1 nova_compute[183083]: 2026-01-26 08:48:02.673 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Updating instance_info_cache with network_info: [{"id": "09872a2c-368e-4dca-93d8-5fdd642b03b3", "address": "fa:16:3e:90:46:e8", "network": {"id": "9006b908-3439-4b0e-b89f-6a6dbb60f4a7", "bridge": "br-int", "label": "tempest-test-network--519498246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09872a2c-36", "ovs_interfaceid": "09872a2c-368e-4dca-93d8-5fdd642b03b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:48:02 compute-1 nova_compute[183083]: 2026-01-26 08:48:02.719 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Releasing lock "refresh_cache-addae953-8eb4-46ed-959d-3c2bb6b31ee3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:48:02 compute-1 nova_compute[183083]: 2026-01-26 08:48:02.720 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 26 08:48:02 compute-1 nova_compute[183083]: 2026-01-26 08:48:02.720 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:48:02 compute-1 nova_compute[183083]: 2026-01-26 08:48:02.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:48:02 compute-1 nova_compute[183083]: 2026-01-26 08:48:02.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:48:02 compute-1 nova_compute[183083]: 2026-01-26 08:48:02.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:48:02 compute-1 nova_compute[183083]: 2026-01-26 08:48:02.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 08:48:02 compute-1 nova_compute[183083]: 2026-01-26 08:48:02.953 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:48:02 compute-1 nova_compute[183083]: 2026-01-26 08:48:02.978 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:02 compute-1 nova_compute[183083]: 2026-01-26 08:48:02.978 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:02 compute-1 nova_compute[183083]: 2026-01-26 08:48:02.979 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:02 compute-1 nova_compute[183083]: 2026-01-26 08:48:02.979 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.053 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.144 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.145 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.172 183087 DEBUG oslo_concurrency.lockutils [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Acquiring lock "35765565-2f3c-4369-965d-ba48e70b6fef" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.172 183087 DEBUG oslo_concurrency.lockutils [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "35765565-2f3c-4369-965d-ba48e70b6fef" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.199 183087 DEBUG nova.compute.manager [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.240 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.368 183087 DEBUG oslo_concurrency.lockutils [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.368 183087 DEBUG oslo_concurrency.lockutils [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.375 183087 DEBUG nova.virt.hardware [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.375 183087 INFO nova.compute.claims [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.444 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.445 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13567MB free_disk=113.070068359375GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.445 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.585 183087 DEBUG nova.compute.provider_tree [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.601 183087 DEBUG nova.scheduler.client.report [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.621 183087 DEBUG nova.network.neutron [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Updating instance_info_cache with network_info: [{"id": "6c1aa358-176f-4add-88ef-d794ee47af72", "address": "fa:16:3e:62:1f:10", "network": {"id": "073b474b-6072-4223-8b64-9d868d1efc3e", "bridge": "br-int", "label": "tempest-test-network--1351317551", "subnets": [{"cidr": "192.168.6.0/24", "dns": [], "gateway": {"address": "192.168.6.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.6.148", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b71ae2b9d2fd454b8b3b9aa1a0e5c7e4", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c1aa358-17", "ovs_interfaceid": "6c1aa358-176f-4add-88ef-d794ee47af72", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.630 183087 DEBUG oslo_concurrency.lockutils [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.262s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.631 183087 DEBUG nova.compute.manager [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.634 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.666 183087 DEBUG oslo_concurrency.lockutils [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Releasing lock "refresh_cache-162ac758-98af-43d0-ac84-2396124f1469" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.667 183087 DEBUG nova.compute.manager [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Instance network_info: |[{"id": "6c1aa358-176f-4add-88ef-d794ee47af72", "address": "fa:16:3e:62:1f:10", "network": {"id": "073b474b-6072-4223-8b64-9d868d1efc3e", "bridge": "br-int", "label": "tempest-test-network--1351317551", "subnets": [{"cidr": "192.168.6.0/24", "dns": [], "gateway": {"address": "192.168.6.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.6.148", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b71ae2b9d2fd454b8b3b9aa1a0e5c7e4", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c1aa358-17", "ovs_interfaceid": "6c1aa358-176f-4add-88ef-d794ee47af72", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.667 183087 DEBUG oslo_concurrency.lockutils [req-a9878c70-e413-4d20-8a1e-dbe0c6e5185f req-0e8df7d2-42a5-4271-bbe7-2cb48c1c0436 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-162ac758-98af-43d0-ac84-2396124f1469" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.667 183087 DEBUG nova.network.neutron [req-a9878c70-e413-4d20-8a1e-dbe0c6e5185f req-0e8df7d2-42a5-4271-bbe7-2cb48c1c0436 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Refreshing network info cache for port 6c1aa358-176f-4add-88ef-d794ee47af72 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.669 183087 INFO nova.compute.manager [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Terminating instance
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.670 183087 DEBUG nova.compute.manager [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.675 183087 DEBUG nova.virt.libvirt.driver [-] [instance: 162ac758-98af-43d0-ac84-2396124f1469] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.675 183087 INFO nova.virt.libvirt.driver [-] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Instance destroyed successfully.
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.676 183087 DEBUG nova.virt.libvirt.vif [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:47:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_qos_after_live_migration-631497423',display_name='tempest-test_qos_after_live_migration-631497423',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-qos-after-live-migration-631497423',id=30,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHVjp+2pOh+xYUkttf/EHrrYAH3LBOn+IKLzf3fiQpiaJslqkY+OmJn6bfd2cX/NEPdTL45qAcY0Zt6OwZRQXbHCoOcvnydr7uXjZCoGXOxoNL1bEhwXU4AaOmmyDzyYAA==',key_name='tempest-keypair-test-1026532318',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b71ae2b9d2fd454b8b3b9aa1a0e5c7e4',ramdisk_id='',reservation_id='r-h1e8mnao',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestCommon-374727467',owner_user_name='tempest-QosTestCommon-374727467-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:47:56Z,user_data=None,user_id='a7abeebb4e4d469c91e6cee77f6be1c3',uuid=162ac758-98af-43d0-ac84-2396124f1469,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6c1aa358-176f-4add-88ef-d794ee47af72", "address": "fa:16:3e:62:1f:10", "network": {"id": "073b474b-6072-4223-8b64-9d868d1efc3e", "bridge": "br-int", "label": "tempest-test-network--1351317551", "subnets": [{"cidr": "192.168.6.0/24", "dns": [], "gateway": {"address": "192.168.6.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.6.148", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b71ae2b9d2fd454b8b3b9aa1a0e5c7e4", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c1aa358-17", "ovs_interfaceid": "6c1aa358-176f-4add-88ef-d794ee47af72", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.676 183087 DEBUG nova.network.os_vif_util [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Converting VIF {"id": "6c1aa358-176f-4add-88ef-d794ee47af72", "address": "fa:16:3e:62:1f:10", "network": {"id": "073b474b-6072-4223-8b64-9d868d1efc3e", "bridge": "br-int", "label": "tempest-test-network--1351317551", "subnets": [{"cidr": "192.168.6.0/24", "dns": [], "gateway": {"address": "192.168.6.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.6.148", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b71ae2b9d2fd454b8b3b9aa1a0e5c7e4", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c1aa358-17", "ovs_interfaceid": "6c1aa358-176f-4add-88ef-d794ee47af72", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.677 183087 DEBUG nova.network.os_vif_util [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:1f:10,bridge_name='br-int',has_traffic_filtering=True,id=6c1aa358-176f-4add-88ef-d794ee47af72,network=Network(073b474b-6072-4223-8b64-9d868d1efc3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c1aa358-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.678 183087 DEBUG os_vif [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:1f:10,bridge_name='br-int',has_traffic_filtering=True,id=6c1aa358-176f-4add-88ef-d794ee47af72,network=Network(073b474b-6072-4223-8b64-9d868d1efc3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c1aa358-17') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.679 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.680 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6c1aa358-17, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.680 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.684 183087 INFO os_vif [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:1f:10,bridge_name='br-int',has_traffic_filtering=True,id=6c1aa358-176f-4add-88ef-d794ee47af72,network=Network(073b474b-6072-4223-8b64-9d868d1efc3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c1aa358-17')
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.685 183087 INFO nova.virt.libvirt.driver [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Deleting instance files /var/lib/nova/instances/162ac758-98af-43d0-ac84-2396124f1469_del
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.685 183087 INFO nova.virt.libvirt.driver [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Deletion of /var/lib/nova/instances/162ac758-98af-43d0-ac84-2396124f1469_del complete
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.742 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'name': 'tempest-server-test-1111872342', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000019', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '4a559c36b13649d98b2995c099340eb9', 'user_id': '52d582094c584036ba3e04c9da69ee02', 'hostId': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.744 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.776 12 DEBUG ceilometer.compute.pollsters [-] addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk.device.read.requests volume: 1061 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.777 12 DEBUG ceilometer.compute.pollsters [-] addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4c771c0-b4a6-4d23-9d2a-e867e53d24e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1061, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3-vda', 'timestamp': '2026-01-26T08:48:03.744627', 'resource_metadata': {'display_name': 'tempest-server-test-1111872342', 'name': 'instance-00000019', 'instance_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ba18474c-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3624.406659239, 'message_signature': '5f686dbf701d255b6a97c0e55eec8e5ee453f1977891aabb99750e01591880c5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3-sda', 'timestamp': '2026-01-26T08:48:03.744627', 'resource_metadata': {'display_name': 'tempest-server-test-1111872342', 'name': 'instance-00000019', 'instance_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ba1853cc-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3624.406659239, 'message_signature': '4b2ce64f8341675bd4908703f1e16451a93d9584c130e8b61807e67a1bb31466'}]}, 'timestamp': '2026-01-26 08:48:03.777958', '_unique_id': '86373eb2112b443ba7c865ee0d8ba4a9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.778 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.779 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.782 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for addae953-8eb4-46ed-959d-3c2bb6b31ee3 / tap09872a2c-36 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.782 12 DEBUG ceilometer.compute.pollsters [-] addae953-8eb4-46ed-959d-3c2bb6b31ee3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '01aa716d-79bb-413f-ab9c-a91bd6523e9f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'instance-00000019-addae953-8eb4-46ed-959d-3c2bb6b31ee3-tap09872a2c-36', 'timestamp': '2026-01-26T08:48:03.779728', 'resource_metadata': {'display_name': 'tempest-server-test-1111872342', 'name': 'tap09872a2c-36', 'instance_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:46:e8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09872a2c-36'}, 'message_id': 'ba19241e-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3624.441714994, 'message_signature': 'daf555aac67db316aedf20d59ca5c96ea4952a84ab20a0cef52af250ecfccea7'}]}, 'timestamp': '2026-01-26 08:48:03.783305', '_unique_id': 'df621b93d62d40e29486cee9ad7d6817'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.783 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.784 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.793 183087 DEBUG nova.compute.manager [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.794 183087 DEBUG nova.network.neutron [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.795 12 DEBUG ceilometer.compute.pollsters [-] addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.796 12 DEBUG ceilometer.compute.pollsters [-] addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8370e402-1810-4f8b-a63c-638dd16fb695', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3-vda', 'timestamp': '2026-01-26T08:48:03.784609', 'resource_metadata': {'display_name': 'tempest-server-test-1111872342', 'name': 'instance-00000019', 'instance_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ba1b1fc6-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3624.446593192, 'message_signature': 'd313d72502822fe53740b50e812890b75424df14be997ba31d6062e97a425209'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3-sda', 'timestamp': '2026-01-26T08:48:03.784609', 'resource_metadata': {'display_name': 'tempest-server-test-1111872342', 'name': 'instance-00000019', 'instance_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ba1b2b10-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3624.446593192, 'message_signature': '323bd7df63c653af47b162483e5d6b8c8c9d827ffa7b3d67cafee5fdd51015bd'}]}, 'timestamp': '2026-01-26 08:48:03.796603', '_unique_id': '62c812e1a0a447c797e1b223646b1c5b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.797 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.798 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.798 12 DEBUG ceilometer.compute.pollsters [-] addae953-8eb4-46ed-959d-3c2bb6b31ee3/network.incoming.bytes volume: 1262 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d719864-b635-42bf-bb92-5a106823ecf8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1262, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'instance-00000019-addae953-8eb4-46ed-959d-3c2bb6b31ee3-tap09872a2c-36', 'timestamp': '2026-01-26T08:48:03.798235', 'resource_metadata': {'display_name': 'tempest-server-test-1111872342', 'name': 'tap09872a2c-36', 'instance_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:46:e8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09872a2c-36'}, 'message_id': 'ba1b7732-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3624.441714994, 'message_signature': '72f7d99ef4a4df0af25df3b5c2caf8ed56831cbe7a5a101f6cf7e274f4d2adcc'}]}, 'timestamp': '2026-01-26 08:48:03.798567', '_unique_id': 'c0a2997cebf443e1a20599da7d627253'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.799 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.800 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.800 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.800 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1111872342>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1111872342>]
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.800 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.800 12 DEBUG ceilometer.compute.pollsters [-] addae953-8eb4-46ed-959d-3c2bb6b31ee3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f042c83-b36c-4d93-9582-c6197113a885', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'instance-00000019-addae953-8eb4-46ed-959d-3c2bb6b31ee3-tap09872a2c-36', 'timestamp': '2026-01-26T08:48:03.800531', 'resource_metadata': {'display_name': 'tempest-server-test-1111872342', 'name': 'tap09872a2c-36', 'instance_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:46:e8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09872a2c-36'}, 'message_id': 'ba1bd146-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3624.441714994, 'message_signature': '1148b5b91d6b533cb08bd7563bb1098f537fa654769620d49e386975d06fa8c8'}]}, 'timestamp': '2026-01-26 08:48:03.800895', '_unique_id': 'e943bca3eb20492f93c1f206c83ba3a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.801 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.802 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.802 12 DEBUG ceilometer.compute.pollsters [-] addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk.device.read.latency volume: 229018485 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.802 12 DEBUG ceilometer.compute.pollsters [-] addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk.device.read.latency volume: 26812420 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '58398ed6-20cf-482d-92d3-2c763951e142', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 229018485, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3-vda', 'timestamp': '2026-01-26T08:48:03.802414', 'resource_metadata': {'display_name': 'tempest-server-test-1111872342', 'name': 'instance-00000019', 'instance_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ba1c1a0c-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3624.406659239, 'message_signature': 'f8da8c146d177f4273296ca7b7ddd3afd69ce0a6bf9d00ab6cc4cdb3e83ac8b0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 26812420, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3-sda', 'timestamp': '2026-01-26T08:48:03.802414', 'resource_metadata': {'display_name': 'tempest-server-test-1111872342', 'name': 'instance-00000019', 'instance_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ba1c256a-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3624.406659239, 'message_signature': '1ef19fa3add04b07a28ef4abe3610b90bf23bdfdba8173df5409f6cdbc8dcc0c'}]}, 'timestamp': '2026-01-26 08:48:03.803007', '_unique_id': '9a7216b5591b4797aa91480b7069be57'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.803 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.804 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.804 12 DEBUG ceilometer.compute.pollsters [-] addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 DEBUG ceilometer.compute.pollsters [-] addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '258b7a24-3e98-4f46-86cd-a531e75d892e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3-vda', 'timestamp': '2026-01-26T08:48:03.804700', 'resource_metadata': {'display_name': 'tempest-server-test-1111872342', 'name': 'instance-00000019', 'instance_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ba1c73ee-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3624.446593192, 'message_signature': 'a4e8db90bef000199e0f9ea36a9bafbd38efb673acafda4ef36a4e38c548ed82'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 
'addae953-8eb4-46ed-959d-3c2bb6b31ee3-sda', 'timestamp': '2026-01-26T08:48:03.804700', 'resource_metadata': {'display_name': 'tempest-server-test-1111872342', 'name': 'instance-00000019', 'instance_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ba1c7f88-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3624.446593192, 'message_signature': '1955afc93178c2771f41d47cccbe6c556d81fff2f6abe6734ba99d7bf5355f47'}]}, 'timestamp': '2026-01-26 08:48:03.805301', '_unique_id': 'd2581b10a8e9486cae6a5300e61769ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.805 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.806 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.806 12 DEBUG ceilometer.compute.pollsters [-] addae953-8eb4-46ed-959d-3c2bb6b31ee3/network.outgoing.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f17ac52-c7bf-48a6-afd6-e7a9e8b714b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'instance-00000019-addae953-8eb4-46ed-959d-3c2bb6b31ee3-tap09872a2c-36', 'timestamp': '2026-01-26T08:48:03.806686', 'resource_metadata': {'display_name': 'tempest-server-test-1111872342', 'name': 'tap09872a2c-36', 'instance_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:46:e8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09872a2c-36'}, 'message_id': 'ba1cc04c-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3624.441714994, 'message_signature': '816bcd4dcb38593ae4b87fa77c9a4a0c0d64eaf8ecdf220da70f5fcc471e1f05'}]}, 'timestamp': '2026-01-26 08:48:03.806971', '_unique_id': '53c76ba8b912428886c680ebc9bcf26f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.807 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.808 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.808 12 DEBUG ceilometer.compute.pollsters [-] addae953-8eb4-46ed-959d-3c2bb6b31ee3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e443b98-68cd-4e3a-bb27-15a4c1385564', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'instance-00000019-addae953-8eb4-46ed-959d-3c2bb6b31ee3-tap09872a2c-36', 'timestamp': '2026-01-26T08:48:03.808308', 'resource_metadata': {'display_name': 'tempest-server-test-1111872342', 'name': 'tap09872a2c-36', 'instance_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:46:e8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09872a2c-36'}, 'message_id': 'ba1cff80-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3624.441714994, 'message_signature': '2cb1b5fd0bb98c1ea61840bd90dc8039b1102b33c43004d6a5e63f49d2c0bcb7'}]}, 'timestamp': '2026-01-26 08:48:03.808591', '_unique_id': 'c41b8d95ce454c3cbcfcd39f01d30cb9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.809 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.824 12 DEBUG ceilometer.compute.pollsters [-] addae953-8eb4-46ed-959d-3c2bb6b31ee3/cpu volume: 11020000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a00cbac9-dc97-47c6-8430-7e1df59c7459', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11020000000, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'timestamp': '2026-01-26T08:48:03.810004', 'resource_metadata': {'display_name': 'tempest-server-test-1111872342', 'name': 'instance-00000019', 'instance_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'ba1f7cd8-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3624.48631857, 'message_signature': 'dbf75cb63a9ac0966b4aa9a2565e6ff7b66df107f87297420e98b0f12c4387a4'}]}, 'timestamp': '2026-01-26 08:48:03.824967', '_unique_id': '6f9bd0cac5fd4ab387f77da7f7f33313'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.825 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.826 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.826 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.826 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-1111872342>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1111872342>]
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.826 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.827 12 DEBUG ceilometer.compute.pollsters [-] addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk.device.write.requests volume: 302 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.827 12 DEBUG ceilometer.compute.pollsters [-] addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '27c228b6-dcba-4645-a300-efd2532b3bb1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 302, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3-vda', 'timestamp': '2026-01-26T08:48:03.827065', 'resource_metadata': {'display_name': 'tempest-server-test-1111872342', 'name': 'instance-00000019', 'instance_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ba1fe02e-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3624.406659239, 'message_signature': 'c7120fcc0043af9ea1216018ca2b26fdddfcea405aa1b520161b8366a0cc883e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3-sda', 'timestamp': '2026-01-26T08:48:03.827065', 'resource_metadata': {'display_name': 'tempest-server-test-1111872342', 'name': 'instance-00000019', 'instance_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ba1febd2-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3624.406659239, 'message_signature': '3353072d99963cf0e98bae6dbd0875689a9112819b210a06e4569855bf755669'}]}, 'timestamp': '2026-01-26 08:48:03.827749', '_unique_id': '56377c344f7840429ad5fe5900332895'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.828 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.829 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.829 12 DEBUG ceilometer.compute.pollsters [-] addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk.device.write.latency volume: 2225803718 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.829 12 DEBUG ceilometer.compute.pollsters [-] addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a99b2336-68df-4877-9ed2-66474937c47d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2225803718, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3-vda', 'timestamp': '2026-01-26T08:48:03.829365', 'resource_metadata': {'display_name': 'tempest-server-test-1111872342', 'name': 'instance-00000019', 'instance_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ba2036e6-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3624.406659239, 'message_signature': '2c1e0235961336ff6ca30850a2b4369453ed981bf44747add3af226fc6aa3353'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3-sda', 'timestamp': '2026-01-26T08:48:03.829365', 'resource_metadata': {'display_name': 'tempest-server-test-1111872342', 'name': 'instance-00000019', 'instance_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ba204226-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3624.406659239, 'message_signature': '1bd9a58e4548fa2ce1a907cae166c8e464dc6c1d080fe44ed8bd2bd5ee733adf'}]}, 'timestamp': '2026-01-26 08:48:03.829955', '_unique_id': '99fcb6b1774e41cb8e5dd74f0f32d03d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.830 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.831 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.831 12 DEBUG ceilometer.compute.pollsters [-] addae953-8eb4-46ed-959d-3c2bb6b31ee3/network.outgoing.bytes volume: 1368 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd0b49e25-4520-4afe-b4f4-67d7c0d27f54', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1368, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'instance-00000019-addae953-8eb4-46ed-959d-3c2bb6b31ee3-tap09872a2c-36', 'timestamp': '2026-01-26T08:48:03.831466', 'resource_metadata': {'display_name': 'tempest-server-test-1111872342', 'name': 'tap09872a2c-36', 'instance_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:46:e8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09872a2c-36'}, 'message_id': 'ba20893e-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3624.441714994, 'message_signature': '4a27b0703ad46928f05064a250c90befdb789fc134453c8efdba0a1b38e1f55b'}]}, 'timestamp': '2026-01-26 08:48:03.831792', '_unique_id': '0c6c9bba0de948aea8af85bcc501e5a6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.832 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.833 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.833 12 DEBUG ceilometer.compute.pollsters [-] addae953-8eb4-46ed-959d-3c2bb6b31ee3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4654572-c732-40cc-a460-fb3e28a3b0d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'instance-00000019-addae953-8eb4-46ed-959d-3c2bb6b31ee3-tap09872a2c-36', 'timestamp': '2026-01-26T08:48:03.833262', 'resource_metadata': {'display_name': 'tempest-server-test-1111872342', 'name': 'tap09872a2c-36', 'instance_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:46:e8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09872a2c-36'}, 'message_id': 'ba20cf0c-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3624.441714994, 'message_signature': '3650e62c536b9edb32e45b1a91c9c4a2842f143f5bb40218d59f85b91f4648bb'}]}, 'timestamp': '2026-01-26 08:48:03.833577', '_unique_id': 'f4eeb09709ac47399ee9a712e2e423a8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.834 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 DEBUG ceilometer.compute.pollsters [-] addae953-8eb4-46ed-959d-3c2bb6b31ee3/network.incoming.packets volume: 8 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.836 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Instance addae953-8eb4-46ed-959d-3c2bb6b31ee3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a6d128a8-d8da-4082-92dd-826716e60bed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 8, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'instance-00000019-addae953-8eb4-46ed-959d-3c2bb6b31ee3-tap09872a2c-36', 'timestamp': '2026-01-26T08:48:03.835008', 'resource_metadata': {'display_name': 'tempest-server-test-1111872342', 'name': 'tap09872a2c-36', 'instance_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:46:e8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09872a2c-36'}, 'message_id': 'ba211444-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3624.441714994, 'message_signature': '83fd71c065ab3dfbf74675e6076ff11e62c0ad44203eb48c553757ac6b2a8d5d'}]}, 'timestamp': '2026-01-26 08:48:03.835349', '_unique_id': '587dbb4d19f6413da8a069bf47e3e025'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.835 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.836 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.836 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.836 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1111872342>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1111872342>]
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.837 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.837 12 DEBUG ceilometer.compute.pollsters [-] addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.837 12 DEBUG ceilometer.compute.pollsters [-] addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e11e0354-37e5-4c0e-b129-75beb0a80473', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3-vda', 'timestamp': '2026-01-26T08:48:03.837248', 'resource_metadata': {'display_name': 'tempest-server-test-1111872342', 'name': 'instance-00000019', 'instance_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ba216ade-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3624.446593192, 'message_signature': '5561d52e7e023e90937f058a696ca160e2d158af293a8dea3bb7f711fbe5b6e5'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 
'addae953-8eb4-46ed-959d-3c2bb6b31ee3-sda', 'timestamp': '2026-01-26T08:48:03.837248', 'resource_metadata': {'display_name': 'tempest-server-test-1111872342', 'name': 'instance-00000019', 'instance_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ba217600-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3624.446593192, 'message_signature': 'd7f2b24d19cc6a2e51a96767afb24ab90673f750011bfeb309122b03e323e77f'}]}, 'timestamp': '2026-01-26 08:48:03.837837', '_unique_id': 'c401639595a64747a077267de2fe1b06'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.838 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.839 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.839 12 DEBUG ceilometer.compute.pollsters [-] addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk.device.write.bytes volume: 72753152 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.839 12 DEBUG ceilometer.compute.pollsters [-] addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '421c1fc3-3274-4299-a6df-af38a808b3e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72753152, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3-vda', 'timestamp': '2026-01-26T08:48:03.839326', 'resource_metadata': {'display_name': 'tempest-server-test-1111872342', 'name': 'instance-00000019', 'instance_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ba21bbce-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3624.406659239, 'message_signature': '085dce321fd1f07c25da1d86919f386142ef6cb4f542e7e5a04b2a99b949e8eb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 
'addae953-8eb4-46ed-959d-3c2bb6b31ee3-sda', 'timestamp': '2026-01-26T08:48:03.839326', 'resource_metadata': {'display_name': 'tempest-server-test-1111872342', 'name': 'instance-00000019', 'instance_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ba21c6e6-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3624.406659239, 'message_signature': 'cb84cc7db09dbce1bb692594018632123929dfa6f7ae500f6390d97c3e031c5e'}]}, 'timestamp': '2026-01-26 08:48:03.839908', '_unique_id': '90cf49eab54041efbc0f328408cce171'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.840 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.841 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.841 12 DEBUG ceilometer.compute.pollsters [-] addae953-8eb4-46ed-959d-3c2bb6b31ee3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '56920aaa-4987-4f3c-b0d2-52b05ee4ba9b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'instance-00000019-addae953-8eb4-46ed-959d-3c2bb6b31ee3-tap09872a2c-36', 'timestamp': '2026-01-26T08:48:03.841406', 'resource_metadata': {'display_name': 'tempest-server-test-1111872342', 'name': 'tap09872a2c-36', 'instance_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:46:e8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09872a2c-36'}, 'message_id': 'ba220d54-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3624.441714994, 'message_signature': '784da3f637cf184ba054b46df0f4c942dc9537d7c3d191952f30a7f83d80554a'}]}, 'timestamp': '2026-01-26 08:48:03.841726', '_unique_id': '5471c3425319432f8e269e6af86ae01a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.842 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.843 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.843 12 DEBUG ceilometer.compute.pollsters [-] addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk.device.read.bytes volume: 29391360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.843 12 DEBUG ceilometer.compute.pollsters [-] addae953-8eb4-46ed-959d-3c2bb6b31ee3/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '660bed5d-4f9f-4f94-ae64-aeea4c66c2c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29391360, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3-vda', 'timestamp': '2026-01-26T08:48:03.843175', 'resource_metadata': {'display_name': 'tempest-server-test-1111872342', 'name': 'instance-00000019', 'instance_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ba225390-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3624.406659239, 'message_signature': '5c5dcf62b74b7fca9bea208d129291e874e4ed3495e6ce229f3eaf5b75dc3098'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3-sda', 'timestamp': '2026-01-26T08:48:03.843175', 'resource_metadata': {'display_name': 'tempest-server-test-1111872342', 'name': 'instance-00000019', 'instance_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ba225eda-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3624.406659239, 'message_signature': 'a4d89e9a80454c5a590de6c12e2d0427a69647a419d544a2c389353d68d3a03b'}]}, 'timestamp': '2026-01-26 08:48:03.843797', '_unique_id': '3cc42a941c8d46b29ade58d6439bd187'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.844 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.845 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.845 12 DEBUG ceilometer.compute.pollsters [-] addae953-8eb4-46ed-959d-3c2bb6b31ee3/memory.usage volume: 40.453125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce634fa0-2a9b-40a6-a83f-9902d0429a12', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.453125, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'timestamp': '2026-01-26T08:48:03.845293', 'resource_metadata': {'display_name': 'tempest-server-test-1111872342', 'name': 'instance-00000019', 'instance_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'ba22a4e4-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3624.48631857, 'message_signature': 'e9018a618a18c4b2e475e2d89608b7533eb4231fd89c603c71a47c6b7e32c4cd'}]}, 'timestamp': '2026-01-26 08:48:03.845593', '_unique_id': '805c3467a1ea439593072ee97e42ad47'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.846 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.847 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.847 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-1111872342>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1111872342>]
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.847 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.847 12 DEBUG ceilometer.compute.pollsters [-] addae953-8eb4-46ed-959d-3c2bb6b31ee3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '21a90428-896f-4e14-bacb-25851f02d23c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_name': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_name': None, 'resource_id': 'instance-00000019-addae953-8eb4-46ed-959d-3c2bb6b31ee3-tap09872a2c-36', 'timestamp': '2026-01-26T08:48:03.847436', 'resource_metadata': {'display_name': 'tempest-server-test-1111872342', 'name': 'tap09872a2c-36', 'instance_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'instance_type': 'm1.nano', 'host': '074e38aa8a4fddadea3d3e208edff0d6931096e646ec285cbfc94fa8', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:46:e8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09872a2c-36'}, 'message_id': 'ba22f8cc-fa93-11f0-b28a-fa163efc69df', 'monotonic_time': 3624.441714994, 'message_signature': '4095cf2de6f2fd8a1c95ed4f5e08d7545c124097a7c618c98d3cd030516f74e7'}]}, 'timestamp': '2026-01-26 08:48:03.847741', '_unique_id': '46d6cdfb6e17416e88ee354d80c3e505'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:48:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:48:03.848 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.938 183087 INFO nova.virt.libvirt.driver [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.965 183087 INFO nova.compute.manager [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Took 0.29 seconds to destroy the instance on the hypervisor.
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.966 183087 DEBUG nova.compute.claims [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Aborting claim: <nova.compute.claims.Claim object at 0x7f6cc406e4f0> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.967 183087 DEBUG oslo_concurrency.lockutils [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.967 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Instance c77e8014-2e4b-48db-9cc2-dd37ccd4d528 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1692
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.968 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Instance 162ac758-98af-43d0-ac84-2396124f1469 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.968 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Instance 35765565-2f3c-4369-965d-ba48e70b6fef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.968 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 08:48:03 compute-1 nova_compute[183083]: 2026-01-26 08:48:03.968 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=2688MB phys_disk=119GB used_disk=21GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 08:48:04 compute-1 nova_compute[183083]: 2026-01-26 08:48:04.118 183087 DEBUG nova.compute.manager [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:48:04 compute-1 nova_compute[183083]: 2026-01-26 08:48:04.126 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:48:04 compute-1 nova_compute[183083]: 2026-01-26 08:48:04.314 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:48:04 compute-1 nova_compute[183083]: 2026-01-26 08:48:04.369 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 08:48:04 compute-1 nova_compute[183083]: 2026-01-26 08:48:04.370 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.737s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:04 compute-1 nova_compute[183083]: 2026-01-26 08:48:04.371 183087 DEBUG oslo_concurrency.lockutils [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.405s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:04 compute-1 nova_compute[183083]: 2026-01-26 08:48:04.479 183087 DEBUG nova.policy [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e29b04a4a66d43aaa5e5c4f38eeb59c4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5c33bd5e85114c868a4e91d997a5ceec', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:48:04 compute-1 nova_compute[183083]: 2026-01-26 08:48:04.614 183087 DEBUG nova.compute.manager [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:48:04 compute-1 nova_compute[183083]: 2026-01-26 08:48:04.616 183087 DEBUG nova.virt.libvirt.driver [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:48:04 compute-1 nova_compute[183083]: 2026-01-26 08:48:04.616 183087 INFO nova.virt.libvirt.driver [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Creating image(s)
Jan 26 08:48:04 compute-1 nova_compute[183083]: 2026-01-26 08:48:04.617 183087 DEBUG oslo_concurrency.lockutils [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Acquiring lock "/var/lib/nova/instances/35765565-2f3c-4369-965d-ba48e70b6fef/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:04 compute-1 nova_compute[183083]: 2026-01-26 08:48:04.617 183087 DEBUG oslo_concurrency.lockutils [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "/var/lib/nova/instances/35765565-2f3c-4369-965d-ba48e70b6fef/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:04 compute-1 nova_compute[183083]: 2026-01-26 08:48:04.618 183087 DEBUG oslo_concurrency.lockutils [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "/var/lib/nova/instances/35765565-2f3c-4369-965d-ba48e70b6fef/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:04 compute-1 nova_compute[183083]: 2026-01-26 08:48:04.618 183087 DEBUG oslo_concurrency.lockutils [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:04 compute-1 nova_compute[183083]: 2026-01-26 08:48:04.619 183087 DEBUG oslo_concurrency.lockutils [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:04 compute-1 ovn_controller[95352]: 2026-01-26T08:48:04Z|00140|pinctrl|WARN|Dropped 14309 log messages in last 61 seconds (most recently, 1 seconds ago) due to excessive rate
Jan 26 08:48:04 compute-1 ovn_controller[95352]: 2026-01-26T08:48:04Z|00141|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 08:48:04 compute-1 nova_compute[183083]: 2026-01-26 08:48:04.760 183087 DEBUG nova.network.neutron [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:48:04 compute-1 nova_compute[183083]: 2026-01-26 08:48:04.842 183087 DEBUG nova.compute.provider_tree [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:48:04 compute-1 nova_compute[183083]: 2026-01-26 08:48:04.935 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:04 compute-1 nova_compute[183083]: 2026-01-26 08:48:04.961 183087 INFO nova.compute.manager [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: c77e8014-2e4b-48db-9cc2-dd37ccd4d528] Took 5.32 seconds to deallocate network for instance.
Jan 26 08:48:04 compute-1 nova_compute[183083]: 2026-01-26 08:48:04.977 183087 DEBUG nova.scheduler.client.report [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:48:05 compute-1 nova_compute[183083]: 2026-01-26 08:48:05.011 183087 DEBUG oslo_concurrency.lockutils [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.640s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:05 compute-1 nova_compute[183083]: 2026-01-26 08:48:05.012 183087 DEBUG nova.compute.utils [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430
Jan 26 08:48:05 compute-1 nova_compute[183083]: 2026-01-26 08:48:05.013 183087 ERROR nova.compute.manager [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Build of instance 162ac758-98af-43d0-ac84-2396124f1469 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format: nova.exception.BuildAbortException: Build of instance 162ac758-98af-43d0-ac84-2396124f1469 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:48:05 compute-1 nova_compute[183083]: 2026-01-26 08:48:05.014 183087 DEBUG nova.compute.manager [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976
Jan 26 08:48:05 compute-1 nova_compute[183083]: 2026-01-26 08:48:05.014 183087 DEBUG nova.virt.libvirt.vif [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-26T08:47:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_qos_after_live_migration-631497423',display_name='tempest-test_qos_after_live_migration-631497423',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host=None,hostname='tempest-test-qos-after-live-migration-631497423',id=30,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHVjp+2pOh+xYUkttf/EHrrYAH3LBOn+IKLzf3fiQpiaJslqkY+OmJn6bfd2cX/NEPdTL45qAcY0Zt6OwZRQXbHCoOcvnydr7uXjZCoGXOxoNL1bEhwXU4AaOmmyDzyYAA==',key_name='tempest-keypair-test-1026532318',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node=None,numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b71ae2b9d2fd454b8b3b9aa1a0e5c7e4',ramdisk_id='',reservation_id='r-h1e8mnao',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestCommon-374727467',owner_user_name='tempest-QosTestCommon-374727467-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:48:03Z,user_data=None,user_id='a7abeebb4e4d469c91e6cee77f6be1c3',uuid=162ac758-98af-43d0-ac84-2396124f1469,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6c1aa358-176f-4add-88ef-d794ee47af72", "address": "fa:16:3e:62:1f:10", "network": {"id": "073b474b-6072-4223-8b64-9d868d1efc3e", "bridge": "br-int", "label": "tempest-test-network--1351317551", "subnets": [{"cidr": "192.168.6.0/24", "dns": [], "gateway": {"address": "192.168.6.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.6.148", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b71ae2b9d2fd454b8b3b9aa1a0e5c7e4", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c1aa358-17", "ovs_interfaceid": "6c1aa358-176f-4add-88ef-d794ee47af72", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:48:05 compute-1 nova_compute[183083]: 2026-01-26 08:48:05.015 183087 DEBUG nova.network.os_vif_util [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Converting VIF {"id": "6c1aa358-176f-4add-88ef-d794ee47af72", "address": "fa:16:3e:62:1f:10", "network": {"id": "073b474b-6072-4223-8b64-9d868d1efc3e", "bridge": "br-int", "label": "tempest-test-network--1351317551", "subnets": [{"cidr": "192.168.6.0/24", "dns": [], "gateway": {"address": "192.168.6.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.6.148", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b71ae2b9d2fd454b8b3b9aa1a0e5c7e4", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c1aa358-17", "ovs_interfaceid": "6c1aa358-176f-4add-88ef-d794ee47af72", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:48:05 compute-1 nova_compute[183083]: 2026-01-26 08:48:05.015 183087 DEBUG nova.network.os_vif_util [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:1f:10,bridge_name='br-int',has_traffic_filtering=True,id=6c1aa358-176f-4add-88ef-d794ee47af72,network=Network(073b474b-6072-4223-8b64-9d868d1efc3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c1aa358-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:48:05 compute-1 nova_compute[183083]: 2026-01-26 08:48:05.016 183087 DEBUG os_vif [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:1f:10,bridge_name='br-int',has_traffic_filtering=True,id=6c1aa358-176f-4add-88ef-d794ee47af72,network=Network(073b474b-6072-4223-8b64-9d868d1efc3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c1aa358-17') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:48:05 compute-1 nova_compute[183083]: 2026-01-26 08:48:05.017 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:05 compute-1 nova_compute[183083]: 2026-01-26 08:48:05.017 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6c1aa358-17, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:48:05 compute-1 nova_compute[183083]: 2026-01-26 08:48:05.018 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:48:05 compute-1 nova_compute[183083]: 2026-01-26 08:48:05.020 183087 INFO os_vif [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:1f:10,bridge_name='br-int',has_traffic_filtering=True,id=6c1aa358-176f-4add-88ef-d794ee47af72,network=Network(073b474b-6072-4223-8b64-9d868d1efc3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c1aa358-17')
Jan 26 08:48:05 compute-1 nova_compute[183083]: 2026-01-26 08:48:05.020 183087 DEBUG nova.compute.manager [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012
Jan 26 08:48:05 compute-1 nova_compute[183083]: 2026-01-26 08:48:05.021 183087 DEBUG nova.compute.manager [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:48:05 compute-1 nova_compute[183083]: 2026-01-26 08:48:05.021 183087 DEBUG nova.network.neutron [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:48:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:05.300 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:05.301 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:05.302 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:05 compute-1 nova_compute[183083]: 2026-01-26 08:48:05.555 183087 INFO nova.scheduler.client.report [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Deleted allocations for instance c77e8014-2e4b-48db-9cc2-dd37ccd4d528
Jan 26 08:48:05 compute-1 nova_compute[183083]: 2026-01-26 08:48:05.556 183087 DEBUG oslo_concurrency.lockutils [None req-badde224-62fd-4fa5-b4df-01c2701ad6ee 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Lock "c77e8014-2e4b-48db-9cc2-dd37ccd4d528" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.558s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.144 183087 DEBUG oslo_concurrency.lockutils [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.525s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Instance failed to spawn: nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Traceback (most recent call last):
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 170, in do_image_deep_inspection
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef]     raise exception.ImageUnacceptable(
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image content does not match disk_format
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] 
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] During handling of the above exception, another exception occurred:
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] 
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Traceback (most recent call last):
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef]     yield resources
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef]     self.driver.spawn(context, instance, image_meta,
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4384, in spawn
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef]     created_instance_dir, created_disks = self._create_image(
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4786, in _create_image
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef]     created_disks = self._create_and_inject_local_root(
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4917, in _create_and_inject_local_root
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef]     self._try_fetch_image_cache(backend, fetch_func, context,
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 10963, in _try_fetch_image_cache
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef]     image.cache(fetch_func=fetch_func,
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 288, in cache
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef]     self.create_image(fetch_func_sync, base, size,
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 663, in create_image
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef]     prepare_template(target=base, *args, **kwargs)
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef]   File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef]     return f(*args, **kwargs)
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 285, in fetch_func_sync
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef]     fetch_func(target=target, *args, **kwargs)
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/utils.py", line 482, in fetch_image
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef]     images.fetch_to_raw(context, image_id, target, trusted_certs)
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 199, in fetch_to_raw
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef]     force_format = do_image_deep_inspection(img, image_href, path_tmp)
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 180, in do_image_deep_inspection
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef]     raise exception.ImageUnacceptable(
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:48:06 compute-1 nova_compute[183083]: 2026-01-26 08:48:06.145 183087 ERROR nova.compute.manager [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] 
Jan 26 08:48:07 compute-1 nova_compute[183083]: 2026-01-26 08:48:07.403 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:07 compute-1 ovn_controller[95352]: 2026-01-26T08:48:07Z|00142|binding|INFO|Releasing lport 2e241887-a928-4a33-98c2-6228ee06108e from this chassis (sb_readonly=0)
Jan 26 08:48:07 compute-1 nova_compute[183083]: 2026-01-26 08:48:07.581 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:07 compute-1 nova_compute[183083]: 2026-01-26 08:48:07.717 183087 DEBUG nova.network.neutron [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Successfully created port: 89596298-bbfe-46d3-a766-792776734fdc _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 08:48:07 compute-1 podman[215003]: 2026-01-26 08:48:07.822487499 +0000 UTC m=+0.079564778 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 26 08:48:07 compute-1 nova_compute[183083]: 2026-01-26 08:48:07.826 183087 DEBUG nova.network.neutron [req-a9878c70-e413-4d20-8a1e-dbe0c6e5185f req-0e8df7d2-42a5-4271-bbe7-2cb48c1c0436 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Updated VIF entry in instance network info cache for port 6c1aa358-176f-4add-88ef-d794ee47af72. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:48:07 compute-1 nova_compute[183083]: 2026-01-26 08:48:07.826 183087 DEBUG nova.network.neutron [req-a9878c70-e413-4d20-8a1e-dbe0c6e5185f req-0e8df7d2-42a5-4271-bbe7-2cb48c1c0436 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Updating instance_info_cache with network_info: [{"id": "6c1aa358-176f-4add-88ef-d794ee47af72", "address": "fa:16:3e:62:1f:10", "network": {"id": "073b474b-6072-4223-8b64-9d868d1efc3e", "bridge": "br-int", "label": "tempest-test-network--1351317551", "subnets": [{"cidr": "192.168.6.0/24", "dns": [], "gateway": {"address": "192.168.6.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.6.148", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b71ae2b9d2fd454b8b3b9aa1a0e5c7e4", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c1aa358-17", "ovs_interfaceid": "6c1aa358-176f-4add-88ef-d794ee47af72", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:48:07 compute-1 nova_compute[183083]: 2026-01-26 08:48:07.906 183087 DEBUG oslo_concurrency.lockutils [req-a9878c70-e413-4d20-8a1e-dbe0c6e5185f req-0e8df7d2-42a5-4271-bbe7-2cb48c1c0436 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-162ac758-98af-43d0-ac84-2396124f1469" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:48:08 compute-1 nova_compute[183083]: 2026-01-26 08:48:08.480 183087 DEBUG nova.network.neutron [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:48:08 compute-1 nova_compute[183083]: 2026-01-26 08:48:08.517 183087 INFO nova.compute.manager [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] [instance: 162ac758-98af-43d0-ac84-2396124f1469] Took 3.50 seconds to deallocate network for instance.
Jan 26 08:48:08 compute-1 nova_compute[183083]: 2026-01-26 08:48:08.645 183087 DEBUG nova.compute.manager [req-9016a600-d1e2-43b9-947b-0b3d3f0f0a3c req-f4319b68-9455-4982-87b3-74d565f2b7bb 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Received event network-changed-09872a2c-368e-4dca-93d8-5fdd642b03b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:48:08 compute-1 nova_compute[183083]: 2026-01-26 08:48:08.646 183087 DEBUG nova.compute.manager [req-9016a600-d1e2-43b9-947b-0b3d3f0f0a3c req-f4319b68-9455-4982-87b3-74d565f2b7bb 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Refreshing instance network info cache due to event network-changed-09872a2c-368e-4dca-93d8-5fdd642b03b3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:48:08 compute-1 nova_compute[183083]: 2026-01-26 08:48:08.646 183087 DEBUG oslo_concurrency.lockutils [req-9016a600-d1e2-43b9-947b-0b3d3f0f0a3c req-f4319b68-9455-4982-87b3-74d565f2b7bb 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-addae953-8eb4-46ed-959d-3c2bb6b31ee3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:48:08 compute-1 nova_compute[183083]: 2026-01-26 08:48:08.646 183087 DEBUG oslo_concurrency.lockutils [req-9016a600-d1e2-43b9-947b-0b3d3f0f0a3c req-f4319b68-9455-4982-87b3-74d565f2b7bb 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-addae953-8eb4-46ed-959d-3c2bb6b31ee3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:48:08 compute-1 nova_compute[183083]: 2026-01-26 08:48:08.646 183087 DEBUG nova.network.neutron [req-9016a600-d1e2-43b9-947b-0b3d3f0f0a3c req-f4319b68-9455-4982-87b3-74d565f2b7bb 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Refreshing network info cache for port 09872a2c-368e-4dca-93d8-5fdd642b03b3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:48:08 compute-1 nova_compute[183083]: 2026-01-26 08:48:08.757 183087 INFO nova.scheduler.client.report [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Deleted allocations for instance 162ac758-98af-43d0-ac84-2396124f1469
Jan 26 08:48:08 compute-1 nova_compute[183083]: 2026-01-26 08:48:08.758 183087 DEBUG oslo_concurrency.lockutils [None req-d10a9841-adbc-4909-9a29-0de5ef5f9b80 a7abeebb4e4d469c91e6cee77f6be1c3 b71ae2b9d2fd454b8b3b9aa1a0e5c7e4 - - default default] Lock "162ac758-98af-43d0-ac84-2396124f1469" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.082s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:09 compute-1 nova_compute[183083]: 2026-01-26 08:48:09.938 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:10 compute-1 nova_compute[183083]: 2026-01-26 08:48:10.204 183087 DEBUG nova.network.neutron [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Successfully updated port: 89596298-bbfe-46d3-a766-792776734fdc _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:48:10 compute-1 nova_compute[183083]: 2026-01-26 08:48:10.218 183087 DEBUG oslo_concurrency.lockutils [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Acquiring lock "refresh_cache-35765565-2f3c-4369-965d-ba48e70b6fef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:48:10 compute-1 nova_compute[183083]: 2026-01-26 08:48:10.219 183087 DEBUG oslo_concurrency.lockutils [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Acquired lock "refresh_cache-35765565-2f3c-4369-965d-ba48e70b6fef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:48:10 compute-1 nova_compute[183083]: 2026-01-26 08:48:10.219 183087 DEBUG nova.network.neutron [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:48:10 compute-1 nova_compute[183083]: 2026-01-26 08:48:10.714 183087 DEBUG nova.network.neutron [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:48:11 compute-1 nova_compute[183083]: 2026-01-26 08:48:11.249 183087 DEBUG nova.compute.manager [req-b3584a0e-b404-4f9c-9c58-7e16aee63dd1 req-3f734cba-7314-4e32-84a2-b18ee85c146f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Received event network-changed-89596298-bbfe-46d3-a766-792776734fdc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:48:11 compute-1 nova_compute[183083]: 2026-01-26 08:48:11.249 183087 DEBUG nova.compute.manager [req-b3584a0e-b404-4f9c-9c58-7e16aee63dd1 req-3f734cba-7314-4e32-84a2-b18ee85c146f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Refreshing instance network info cache due to event network-changed-89596298-bbfe-46d3-a766-792776734fdc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:48:11 compute-1 nova_compute[183083]: 2026-01-26 08:48:11.250 183087 DEBUG oslo_concurrency.lockutils [req-b3584a0e-b404-4f9c-9c58-7e16aee63dd1 req-3f734cba-7314-4e32-84a2-b18ee85c146f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-35765565-2f3c-4369-965d-ba48e70b6fef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:48:11 compute-1 nova_compute[183083]: 2026-01-26 08:48:11.313 183087 DEBUG nova.network.neutron [req-9016a600-d1e2-43b9-947b-0b3d3f0f0a3c req-f4319b68-9455-4982-87b3-74d565f2b7bb 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Updated VIF entry in instance network info cache for port 09872a2c-368e-4dca-93d8-5fdd642b03b3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:48:11 compute-1 nova_compute[183083]: 2026-01-26 08:48:11.314 183087 DEBUG nova.network.neutron [req-9016a600-d1e2-43b9-947b-0b3d3f0f0a3c req-f4319b68-9455-4982-87b3-74d565f2b7bb 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Updating instance_info_cache with network_info: [{"id": "09872a2c-368e-4dca-93d8-5fdd642b03b3", "address": "fa:16:3e:90:46:e8", "network": {"id": "9006b908-3439-4b0e-b89f-6a6dbb60f4a7", "bridge": "br-int", "label": "tempest-test-network--519498246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09872a2c-36", "ovs_interfaceid": "09872a2c-368e-4dca-93d8-5fdd642b03b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:48:11 compute-1 nova_compute[183083]: 2026-01-26 08:48:11.424 183087 DEBUG oslo_concurrency.lockutils [req-9016a600-d1e2-43b9-947b-0b3d3f0f0a3c req-f4319b68-9455-4982-87b3-74d565f2b7bb 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-addae953-8eb4-46ed-959d-3c2bb6b31ee3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:48:11 compute-1 nova_compute[183083]: 2026-01-26 08:48:11.935 183087 DEBUG nova.network.neutron [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Updating instance_info_cache with network_info: [{"id": "89596298-bbfe-46d3-a766-792776734fdc", "address": "fa:16:3e:81:37:12", "network": {"id": "1f142e23-ddc9-41fb-b330-d04df2cda6d8", "bridge": "br-int", "label": "tempest-test-network--551489493", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.183", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c33bd5e85114c868a4e91d997a5ceec", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89596298-bb", "ovs_interfaceid": "89596298-bbfe-46d3-a766-792776734fdc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.133 183087 DEBUG oslo_concurrency.lockutils [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Releasing lock "refresh_cache-35765565-2f3c-4369-965d-ba48e70b6fef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.134 183087 DEBUG nova.compute.manager [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Instance network_info: |[{"id": "89596298-bbfe-46d3-a766-792776734fdc", "address": "fa:16:3e:81:37:12", "network": {"id": "1f142e23-ddc9-41fb-b330-d04df2cda6d8", "bridge": "br-int", "label": "tempest-test-network--551489493", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.183", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c33bd5e85114c868a4e91d997a5ceec", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89596298-bb", "ovs_interfaceid": "89596298-bbfe-46d3-a766-792776734fdc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.135 183087 DEBUG oslo_concurrency.lockutils [req-b3584a0e-b404-4f9c-9c58-7e16aee63dd1 req-3f734cba-7314-4e32-84a2-b18ee85c146f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-35765565-2f3c-4369-965d-ba48e70b6fef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.135 183087 DEBUG nova.network.neutron [req-b3584a0e-b404-4f9c-9c58-7e16aee63dd1 req-3f734cba-7314-4e32-84a2-b18ee85c146f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Refreshing network info cache for port 89596298-bbfe-46d3-a766-792776734fdc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.137 183087 INFO nova.compute.manager [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Terminating instance
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.139 183087 DEBUG nova.compute.manager [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.145 183087 DEBUG nova.virt.libvirt.driver [-] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.146 183087 INFO nova.virt.libvirt.driver [-] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Instance destroyed successfully.
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.147 183087 DEBUG nova.virt.libvirt.vif [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:48:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_multicast_east_west-1477026537',display_name='tempest-test_multicast_east_west-1477026537',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-multicast-east-west-1477026537',id=31,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJOJsa8zDc5tOBfBRLm0Qi812u7HVOO6E7MXAGpKZ4/7op/PkClYPLhHmkNgkuxZ09O67J1SPnUd6CxZdN+euoFRi16VYgHaqYzJwS9D1WUns9/BQk7M5SX/0drgiuby9w==',key_name='tempest-keypair-test-1151010750',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5c33bd5e85114c868a4e91d997a5ceec',ramdisk_id='',reservation_id='r-9ewimotx',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestIPv4Ovn-635971062',owner_user_name='tempest-MulticastTestIPv4Ovn-635971062-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:48:04Z,user_data=None,user_id='e29b04a4a66d43aaa5e5c4f38eeb59c4',uuid=35765565-2f3c-4369-965d-ba48e70b6fef,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "89596298-bbfe-46d3-a766-792776734fdc", "address": "fa:16:3e:81:37:12", "network": {"id": "1f142e23-ddc9-41fb-b330-d04df2cda6d8", "bridge": "br-int", "label": "tempest-test-network--551489493", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.183", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c33bd5e85114c868a4e91d997a5ceec", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89596298-bb", "ovs_interfaceid": "89596298-bbfe-46d3-a766-792776734fdc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.147 183087 DEBUG nova.network.os_vif_util [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Converting VIF {"id": "89596298-bbfe-46d3-a766-792776734fdc", "address": "fa:16:3e:81:37:12", "network": {"id": "1f142e23-ddc9-41fb-b330-d04df2cda6d8", "bridge": "br-int", "label": "tempest-test-network--551489493", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.183", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c33bd5e85114c868a4e91d997a5ceec", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89596298-bb", "ovs_interfaceid": "89596298-bbfe-46d3-a766-792776734fdc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.148 183087 DEBUG nova.network.os_vif_util [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:81:37:12,bridge_name='br-int',has_traffic_filtering=True,id=89596298-bbfe-46d3-a766-792776734fdc,network=Network(1f142e23-ddc9-41fb-b330-d04df2cda6d8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap89596298-bb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.149 183087 DEBUG os_vif [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:81:37:12,bridge_name='br-int',has_traffic_filtering=True,id=89596298-bbfe-46d3-a766-792776734fdc,network=Network(1f142e23-ddc9-41fb-b330-d04df2cda6d8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap89596298-bb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.151 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.152 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89596298-bb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.152 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.155 183087 INFO os_vif [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:81:37:12,bridge_name='br-int',has_traffic_filtering=True,id=89596298-bbfe-46d3-a766-792776734fdc,network=Network(1f142e23-ddc9-41fb-b330-d04df2cda6d8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap89596298-bb')
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.156 183087 INFO nova.virt.libvirt.driver [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Deleting instance files /var/lib/nova/instances/35765565-2f3c-4369-965d-ba48e70b6fef_del
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.157 183087 INFO nova.virt.libvirt.driver [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Deletion of /var/lib/nova/instances/35765565-2f3c-4369-965d-ba48e70b6fef_del complete
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.242 183087 INFO nova.compute.manager [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Took 0.10 seconds to destroy the instance on the hypervisor.
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.244 183087 DEBUG nova.compute.claims [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Aborting claim: <nova.compute.claims.Claim object at 0x7f6c9838c6a0> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.244 183087 DEBUG oslo_concurrency.lockutils [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.245 183087 DEBUG oslo_concurrency.lockutils [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.333 183087 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769417277.3318114, 5ea5b3e6-d6ee-4984-b938-32f34a5c3307 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.334 183087 INFO nova.compute.manager [-] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] VM Stopped (Lifecycle Event)
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.350 183087 DEBUG nova.compute.manager [None req-2b989b5d-991a-48be-8b7d-0ba3bc90c94d - - - - - -] [instance: 5ea5b3e6-d6ee-4984-b938-32f34a5c3307] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.408 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.423 183087 DEBUG nova.compute.provider_tree [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.517 183087 DEBUG oslo_concurrency.lockutils [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "456bc5cb-4197-47ac-b895-7d4e8b3b5e58" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.518 183087 DEBUG oslo_concurrency.lockutils [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "456bc5cb-4197-47ac-b895-7d4e8b3b5e58" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.529 183087 DEBUG nova.scheduler.client.report [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.550 183087 DEBUG nova.compute.manager [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.582 183087 DEBUG oslo_concurrency.lockutils [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.337s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.583 183087 DEBUG nova.compute.utils [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.585 183087 ERROR nova.compute.manager [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Build of instance 35765565-2f3c-4369-965d-ba48e70b6fef aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format: nova.exception.BuildAbortException: Build of instance 35765565-2f3c-4369-965d-ba48e70b6fef aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.585 183087 DEBUG nova.compute.manager [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.587 183087 DEBUG nova.virt.libvirt.vif [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-26T08:48:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_multicast_east_west-1477026537',display_name='tempest-test_multicast_east_west-1477026537',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host=None,hostname='tempest-test-multicast-east-west-1477026537',id=31,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJOJsa8zDc5tOBfBRLm0Qi812u7HVOO6E7MXAGpKZ4/7op/PkClYPLhHmkNgkuxZ09O67J1SPnUd6CxZdN+euoFRi16VYgHaqYzJwS9D1WUns9/BQk7M5SX/0drgiuby9w==',key_name='tempest-keypair-test-1151010750',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node=None,numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5c33bd5e85114c868a4e91d997a5ceec',ramdisk_id='',reservation_id='r-9ewimotx',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestIPv4Ovn-635971062',owner_user_name='tempest-MulticastTestIPv4Ovn-635971062-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:48:12Z,user_data=None,user_id='e29b04a4a66d43aaa5e5c4f38eeb59c4',uuid=35765565-2f3c-4369-965d-ba48e70b6fef,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "89596298-bbfe-46d3-a766-792776734fdc", "address": "fa:16:3e:81:37:12", "network": {"id": "1f142e23-ddc9-41fb-b330-d04df2cda6d8", "bridge": "br-int", "label": "tempest-test-network--551489493", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.183", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c33bd5e85114c868a4e91d997a5ceec", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89596298-bb", "ovs_interfaceid": "89596298-bbfe-46d3-a766-792776734fdc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.587 183087 DEBUG nova.network.os_vif_util [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Converting VIF {"id": "89596298-bbfe-46d3-a766-792776734fdc", "address": "fa:16:3e:81:37:12", "network": {"id": "1f142e23-ddc9-41fb-b330-d04df2cda6d8", "bridge": "br-int", "label": "tempest-test-network--551489493", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.183", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c33bd5e85114c868a4e91d997a5ceec", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89596298-bb", "ovs_interfaceid": "89596298-bbfe-46d3-a766-792776734fdc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.589 183087 DEBUG nova.network.os_vif_util [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:81:37:12,bridge_name='br-int',has_traffic_filtering=True,id=89596298-bbfe-46d3-a766-792776734fdc,network=Network(1f142e23-ddc9-41fb-b330-d04df2cda6d8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap89596298-bb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.590 183087 DEBUG os_vif [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:81:37:12,bridge_name='br-int',has_traffic_filtering=True,id=89596298-bbfe-46d3-a766-792776734fdc,network=Network(1f142e23-ddc9-41fb-b330-d04df2cda6d8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap89596298-bb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.592 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.592 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89596298-bb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.593 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.597 183087 INFO os_vif [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:81:37:12,bridge_name='br-int',has_traffic_filtering=True,id=89596298-bbfe-46d3-a766-792776734fdc,network=Network(1f142e23-ddc9-41fb-b330-d04df2cda6d8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap89596298-bb')
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.598 183087 DEBUG nova.compute.manager [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.599 183087 DEBUG nova.compute.manager [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.599 183087 DEBUG nova.network.neutron [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.624 183087 DEBUG oslo_concurrency.lockutils [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.625 183087 DEBUG oslo_concurrency.lockutils [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.635 183087 DEBUG nova.virt.hardware [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.635 183087 INFO nova.compute.claims [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.889 183087 DEBUG nova.compute.provider_tree [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.907 183087 DEBUG nova.scheduler.client.report [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.973 183087 DEBUG oslo_concurrency.lockutils [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.348s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:12 compute-1 nova_compute[183083]: 2026-01-26 08:48:12.974 183087 DEBUG nova.compute.manager [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:48:13 compute-1 nova_compute[183083]: 2026-01-26 08:48:13.172 183087 DEBUG nova.compute.manager [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:48:13 compute-1 nova_compute[183083]: 2026-01-26 08:48:13.172 183087 DEBUG nova.network.neutron [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:48:13 compute-1 nova_compute[183083]: 2026-01-26 08:48:13.305 183087 INFO nova.virt.libvirt.driver [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:48:13 compute-1 nova_compute[183083]: 2026-01-26 08:48:13.438 183087 DEBUG nova.compute.manager [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:48:13 compute-1 nova_compute[183083]: 2026-01-26 08:48:13.840 183087 DEBUG nova.compute.manager [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:48:13 compute-1 nova_compute[183083]: 2026-01-26 08:48:13.842 183087 DEBUG nova.virt.libvirt.driver [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:48:13 compute-1 nova_compute[183083]: 2026-01-26 08:48:13.842 183087 INFO nova.virt.libvirt.driver [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Creating image(s)
Jan 26 08:48:13 compute-1 nova_compute[183083]: 2026-01-26 08:48:13.843 183087 DEBUG oslo_concurrency.lockutils [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "/var/lib/nova/instances/456bc5cb-4197-47ac-b895-7d4e8b3b5e58/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:13 compute-1 nova_compute[183083]: 2026-01-26 08:48:13.843 183087 DEBUG oslo_concurrency.lockutils [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "/var/lib/nova/instances/456bc5cb-4197-47ac-b895-7d4e8b3b5e58/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:13 compute-1 nova_compute[183083]: 2026-01-26 08:48:13.844 183087 DEBUG oslo_concurrency.lockutils [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "/var/lib/nova/instances/456bc5cb-4197-47ac-b895-7d4e8b3b5e58/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:13 compute-1 nova_compute[183083]: 2026-01-26 08:48:13.860 183087 DEBUG oslo_concurrency.processutils [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:48:13 compute-1 nova_compute[183083]: 2026-01-26 08:48:13.945 183087 DEBUG oslo_concurrency.processutils [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:48:13 compute-1 nova_compute[183083]: 2026-01-26 08:48:13.946 183087 DEBUG oslo_concurrency.lockutils [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:13 compute-1 nova_compute[183083]: 2026-01-26 08:48:13.947 183087 DEBUG oslo_concurrency.lockutils [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:13 compute-1 nova_compute[183083]: 2026-01-26 08:48:13.973 183087 DEBUG oslo_concurrency.processutils [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:48:14 compute-1 nova_compute[183083]: 2026-01-26 08:48:14.036 183087 DEBUG oslo_concurrency.processutils [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:48:14 compute-1 nova_compute[183083]: 2026-01-26 08:48:14.038 183087 DEBUG oslo_concurrency.processutils [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/456bc5cb-4197-47ac-b895-7d4e8b3b5e58/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:48:14 compute-1 nova_compute[183083]: 2026-01-26 08:48:14.124 183087 DEBUG oslo_concurrency.processutils [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/456bc5cb-4197-47ac-b895-7d4e8b3b5e58/disk 1073741824" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:48:14 compute-1 nova_compute[183083]: 2026-01-26 08:48:14.127 183087 DEBUG oslo_concurrency.lockutils [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:14 compute-1 nova_compute[183083]: 2026-01-26 08:48:14.128 183087 DEBUG oslo_concurrency.processutils [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:48:14 compute-1 nova_compute[183083]: 2026-01-26 08:48:14.207 183087 DEBUG oslo_concurrency.processutils [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:48:14 compute-1 nova_compute[183083]: 2026-01-26 08:48:14.209 183087 DEBUG nova.virt.disk.api [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Checking if we can resize image /var/lib/nova/instances/456bc5cb-4197-47ac-b895-7d4e8b3b5e58/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 26 08:48:14 compute-1 nova_compute[183083]: 2026-01-26 08:48:14.210 183087 DEBUG oslo_concurrency.processutils [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/456bc5cb-4197-47ac-b895-7d4e8b3b5e58/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:48:14 compute-1 nova_compute[183083]: 2026-01-26 08:48:14.267 183087 DEBUG oslo_concurrency.processutils [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/456bc5cb-4197-47ac-b895-7d4e8b3b5e58/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:48:14 compute-1 nova_compute[183083]: 2026-01-26 08:48:14.269 183087 DEBUG nova.virt.disk.api [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Cannot resize image /var/lib/nova/instances/456bc5cb-4197-47ac-b895-7d4e8b3b5e58/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 26 08:48:14 compute-1 nova_compute[183083]: 2026-01-26 08:48:14.270 183087 DEBUG nova.objects.instance [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lazy-loading 'migration_context' on Instance uuid 456bc5cb-4197-47ac-b895-7d4e8b3b5e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:48:14 compute-1 nova_compute[183083]: 2026-01-26 08:48:14.306 183087 DEBUG nova.policy [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '52d582094c584036ba3e04c9da69ee02', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4a559c36b13649d98b2995c099340eb9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:48:14 compute-1 nova_compute[183083]: 2026-01-26 08:48:14.320 183087 DEBUG nova.virt.libvirt.driver [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 26 08:48:14 compute-1 nova_compute[183083]: 2026-01-26 08:48:14.320 183087 DEBUG nova.virt.libvirt.driver [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Ensure instance console log exists: /var/lib/nova/instances/456bc5cb-4197-47ac-b895-7d4e8b3b5e58/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 26 08:48:14 compute-1 nova_compute[183083]: 2026-01-26 08:48:14.321 183087 DEBUG oslo_concurrency.lockutils [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:14 compute-1 nova_compute[183083]: 2026-01-26 08:48:14.322 183087 DEBUG oslo_concurrency.lockutils [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:14 compute-1 nova_compute[183083]: 2026-01-26 08:48:14.322 183087 DEBUG oslo_concurrency.lockutils [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:14 compute-1 nova_compute[183083]: 2026-01-26 08:48:14.467 183087 DEBUG nova.network.neutron [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:48:14 compute-1 nova_compute[183083]: 2026-01-26 08:48:14.519 183087 INFO nova.compute.manager [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Took 1.92 seconds to deallocate network for instance.
Jan 26 08:48:14 compute-1 nova_compute[183083]: 2026-01-26 08:48:14.787 183087 INFO nova.scheduler.client.report [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Deleted allocations for instance 35765565-2f3c-4369-965d-ba48e70b6fef
Jan 26 08:48:14 compute-1 nova_compute[183083]: 2026-01-26 08:48:14.788 183087 DEBUG oslo_concurrency.lockutils [None req-45ae10e9-a0b0-47b2-a1e8-518f92ed793f e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "35765565-2f3c-4369-965d-ba48e70b6fef" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.615s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:14 compute-1 nova_compute[183083]: 2026-01-26 08:48:14.994 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:15 compute-1 nova_compute[183083]: 2026-01-26 08:48:15.069 183087 DEBUG nova.network.neutron [req-b3584a0e-b404-4f9c-9c58-7e16aee63dd1 req-3f734cba-7314-4e32-84a2-b18ee85c146f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Updated VIF entry in instance network info cache for port 89596298-bbfe-46d3-a766-792776734fdc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:48:15 compute-1 nova_compute[183083]: 2026-01-26 08:48:15.070 183087 DEBUG nova.network.neutron [req-b3584a0e-b404-4f9c-9c58-7e16aee63dd1 req-3f734cba-7314-4e32-84a2-b18ee85c146f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 35765565-2f3c-4369-965d-ba48e70b6fef] Updating instance_info_cache with network_info: [{"id": "89596298-bbfe-46d3-a766-792776734fdc", "address": "fa:16:3e:81:37:12", "network": {"id": "1f142e23-ddc9-41fb-b330-d04df2cda6d8", "bridge": "br-int", "label": "tempest-test-network--551489493", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.183", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c33bd5e85114c868a4e91d997a5ceec", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89596298-bb", "ovs_interfaceid": "89596298-bbfe-46d3-a766-792776734fdc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:48:15 compute-1 nova_compute[183083]: 2026-01-26 08:48:15.285 183087 DEBUG oslo_concurrency.lockutils [req-b3584a0e-b404-4f9c-9c58-7e16aee63dd1 req-3f734cba-7314-4e32-84a2-b18ee85c146f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-35765565-2f3c-4369-965d-ba48e70b6fef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:48:15 compute-1 nova_compute[183083]: 2026-01-26 08:48:15.680 183087 DEBUG nova.network.neutron [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Successfully updated port: a14c1ec1-db66-4519-83e7-c1d8ebc20851 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:48:15 compute-1 nova_compute[183083]: 2026-01-26 08:48:15.788 183087 DEBUG oslo_concurrency.lockutils [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "refresh_cache-456bc5cb-4197-47ac-b895-7d4e8b3b5e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:48:15 compute-1 nova_compute[183083]: 2026-01-26 08:48:15.788 183087 DEBUG oslo_concurrency.lockutils [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquired lock "refresh_cache-456bc5cb-4197-47ac-b895-7d4e8b3b5e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:48:15 compute-1 nova_compute[183083]: 2026-01-26 08:48:15.789 183087 DEBUG nova.network.neutron [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:48:15 compute-1 ovn_controller[95352]: 2026-01-26T08:48:15Z|00143|binding|INFO|Releasing lport 2e241887-a928-4a33-98c2-6228ee06108e from this chassis (sb_readonly=0)
Jan 26 08:48:16 compute-1 nova_compute[183083]: 2026-01-26 08:48:16.024 183087 DEBUG nova.network.neutron [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:48:16 compute-1 nova_compute[183083]: 2026-01-26 08:48:16.027 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:16 compute-1 nova_compute[183083]: 2026-01-26 08:48:16.744 183087 DEBUG nova.compute.manager [req-bd661fe6-1ab5-41a9-851a-b7b3fed9d5cd req-bd40ea14-165f-4db4-9bf0-fbcf854e41bb 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Received event network-changed-a14c1ec1-db66-4519-83e7-c1d8ebc20851 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:48:16 compute-1 nova_compute[183083]: 2026-01-26 08:48:16.744 183087 DEBUG nova.compute.manager [req-bd661fe6-1ab5-41a9-851a-b7b3fed9d5cd req-bd40ea14-165f-4db4-9bf0-fbcf854e41bb 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Refreshing instance network info cache due to event network-changed-a14c1ec1-db66-4519-83e7-c1d8ebc20851. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:48:16 compute-1 nova_compute[183083]: 2026-01-26 08:48:16.745 183087 DEBUG oslo_concurrency.lockutils [req-bd661fe6-1ab5-41a9-851a-b7b3fed9d5cd req-bd40ea14-165f-4db4-9bf0-fbcf854e41bb 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-456bc5cb-4197-47ac-b895-7d4e8b3b5e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:48:16 compute-1 ovn_controller[95352]: 2026-01-26T08:48:16Z|00144|binding|INFO|Releasing lport 2e241887-a928-4a33-98c2-6228ee06108e from this chassis (sb_readonly=0)
Jan 26 08:48:16 compute-1 nova_compute[183083]: 2026-01-26 08:48:16.969 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.013 183087 DEBUG nova.network.neutron [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Updating instance_info_cache with network_info: [{"id": "a14c1ec1-db66-4519-83e7-c1d8ebc20851", "address": "fa:16:3e:cb:37:30", "network": {"id": "9006b908-3439-4b0e-b89f-6a6dbb60f4a7", "bridge": "br-int", "label": "tempest-test-network--519498246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa14c1ec1-db", "ovs_interfaceid": "a14c1ec1-db66-4519-83e7-c1d8ebc20851", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.040 183087 DEBUG oslo_concurrency.lockutils [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Releasing lock "refresh_cache-456bc5cb-4197-47ac-b895-7d4e8b3b5e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.041 183087 DEBUG nova.compute.manager [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Instance network_info: |[{"id": "a14c1ec1-db66-4519-83e7-c1d8ebc20851", "address": "fa:16:3e:cb:37:30", "network": {"id": "9006b908-3439-4b0e-b89f-6a6dbb60f4a7", "bridge": "br-int", "label": "tempest-test-network--519498246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa14c1ec1-db", "ovs_interfaceid": "a14c1ec1-db66-4519-83e7-c1d8ebc20851", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.041 183087 DEBUG oslo_concurrency.lockutils [req-bd661fe6-1ab5-41a9-851a-b7b3fed9d5cd req-bd40ea14-165f-4db4-9bf0-fbcf854e41bb 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-456bc5cb-4197-47ac-b895-7d4e8b3b5e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.041 183087 DEBUG nova.network.neutron [req-bd661fe6-1ab5-41a9-851a-b7b3fed9d5cd req-bd40ea14-165f-4db4-9bf0-fbcf854e41bb 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Refreshing network info cache for port a14c1ec1-db66-4519-83e7-c1d8ebc20851 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.046 183087 DEBUG nova.virt.libvirt.driver [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Start _get_guest_xml network_info=[{"id": "a14c1ec1-db66-4519-83e7-c1d8ebc20851", "address": "fa:16:3e:cb:37:30", "network": {"id": "9006b908-3439-4b0e-b89f-6a6dbb60f4a7", "bridge": "br-int", "label": "tempest-test-network--519498246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa14c1ec1-db", "ovs_interfaceid": "a14c1ec1-db66-4519-83e7-c1d8ebc20851", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T08:43:05Z,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aa88264c487a43b2855a58eb0dd042c9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T08:43:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encryption_options': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.051 183087 WARNING nova.virt.libvirt.driver [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.058 183087 DEBUG nova.virt.libvirt.host [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.059 183087 DEBUG nova.virt.libvirt.host [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.066 183087 DEBUG nova.virt.libvirt.host [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.067 183087 DEBUG nova.virt.libvirt.host [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.068 183087 DEBUG nova.virt.libvirt.driver [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.068 183087 DEBUG nova.virt.hardware [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T08:43:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a7017323-cd35-470d-a718-faa6c6e97277',id=2,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T08:43:05Z,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aa88264c487a43b2855a58eb0dd042c9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T08:43:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.068 183087 DEBUG nova.virt.hardware [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.069 183087 DEBUG nova.virt.hardware [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.069 183087 DEBUG nova.virt.hardware [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.069 183087 DEBUG nova.virt.hardware [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.070 183087 DEBUG nova.virt.hardware [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.070 183087 DEBUG nova.virt.hardware [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.070 183087 DEBUG nova.virt.hardware [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.070 183087 DEBUG nova.virt.hardware [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.071 183087 DEBUG nova.virt.hardware [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.071 183087 DEBUG nova.virt.hardware [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.075 183087 DEBUG nova.virt.libvirt.vif [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:48:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1717515504',display_name='tempest-server-test-1717515504',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1717515504',id=32,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHR6EJ0M3L9T9j8F+dLuTeoyDdnWZCi8ic8NnDlm++GcyV15uFS8BCYLqsqkAatnDxNEdqPcffpohwDfIhW8BevW7ZqTxZGVxsZmagZkz7C/Tn9HOibVv/vjkHnqrvkfkA==',key_name='tempest-keypair-test-713690759',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4a559c36b13649d98b2995c099340eb9',ramdisk_id='',reservation_id='r-lc7rdn7t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='
virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortSecurityTest-508365101',owner_user_name='tempest-PortSecurityTest-508365101-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:48:13Z,user_data=None,user_id='52d582094c584036ba3e04c9da69ee02',uuid=456bc5cb-4197-47ac-b895-7d4e8b3b5e58,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a14c1ec1-db66-4519-83e7-c1d8ebc20851", "address": "fa:16:3e:cb:37:30", "network": {"id": "9006b908-3439-4b0e-b89f-6a6dbb60f4a7", "bridge": "br-int", "label": "tempest-test-network--519498246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa14c1ec1-db", "ovs_interfaceid": "a14c1ec1-db66-4519-83e7-c1d8ebc20851", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.075 183087 DEBUG nova.network.os_vif_util [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converting VIF {"id": "a14c1ec1-db66-4519-83e7-c1d8ebc20851", "address": "fa:16:3e:cb:37:30", "network": {"id": "9006b908-3439-4b0e-b89f-6a6dbb60f4a7", "bridge": "br-int", "label": "tempest-test-network--519498246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa14c1ec1-db", "ovs_interfaceid": "a14c1ec1-db66-4519-83e7-c1d8ebc20851", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.076 183087 DEBUG nova.network.os_vif_util [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cb:37:30,bridge_name='br-int',has_traffic_filtering=True,id=a14c1ec1-db66-4519-83e7-c1d8ebc20851,network=Network(9006b908-3439-4b0e-b89f-6a6dbb60f4a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa14c1ec1-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.077 183087 DEBUG nova.objects.instance [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 456bc5cb-4197-47ac-b895-7d4e8b3b5e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.169 183087 DEBUG nova.virt.libvirt.driver [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] End _get_guest_xml xml=<domain type="kvm">
Jan 26 08:48:17 compute-1 nova_compute[183083]:   <uuid>456bc5cb-4197-47ac-b895-7d4e8b3b5e58</uuid>
Jan 26 08:48:17 compute-1 nova_compute[183083]:   <name>instance-00000020</name>
Jan 26 08:48:17 compute-1 nova_compute[183083]:   <memory>131072</memory>
Jan 26 08:48:17 compute-1 nova_compute[183083]:   <vcpu>1</vcpu>
Jan 26 08:48:17 compute-1 nova_compute[183083]:   <metadata>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 08:48:17 compute-1 nova_compute[183083]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:       <nova:name>tempest-server-test-1717515504</nova:name>
Jan 26 08:48:17 compute-1 nova_compute[183083]:       <nova:creationTime>2026-01-26 08:48:17</nova:creationTime>
Jan 26 08:48:17 compute-1 nova_compute[183083]:       <nova:flavor name="m1.nano">
Jan 26 08:48:17 compute-1 nova_compute[183083]:         <nova:memory>128</nova:memory>
Jan 26 08:48:17 compute-1 nova_compute[183083]:         <nova:disk>1</nova:disk>
Jan 26 08:48:17 compute-1 nova_compute[183083]:         <nova:swap>0</nova:swap>
Jan 26 08:48:17 compute-1 nova_compute[183083]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 08:48:17 compute-1 nova_compute[183083]:         <nova:vcpus>1</nova:vcpus>
Jan 26 08:48:17 compute-1 nova_compute[183083]:       </nova:flavor>
Jan 26 08:48:17 compute-1 nova_compute[183083]:       <nova:owner>
Jan 26 08:48:17 compute-1 nova_compute[183083]:         <nova:user uuid="52d582094c584036ba3e04c9da69ee02">tempest-PortSecurityTest-508365101-project-member</nova:user>
Jan 26 08:48:17 compute-1 nova_compute[183083]:         <nova:project uuid="4a559c36b13649d98b2995c099340eb9">tempest-PortSecurityTest-508365101</nova:project>
Jan 26 08:48:17 compute-1 nova_compute[183083]:       </nova:owner>
Jan 26 08:48:17 compute-1 nova_compute[183083]:       <nova:root type="image" uuid="13d1a20a-8003-4f19-aba7-ccbd9eff9b82"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:       <nova:ports>
Jan 26 08:48:17 compute-1 nova_compute[183083]:         <nova:port uuid="a14c1ec1-db66-4519-83e7-c1d8ebc20851">
Jan 26 08:48:17 compute-1 nova_compute[183083]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:         </nova:port>
Jan 26 08:48:17 compute-1 nova_compute[183083]:       </nova:ports>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     </nova:instance>
Jan 26 08:48:17 compute-1 nova_compute[183083]:   </metadata>
Jan 26 08:48:17 compute-1 nova_compute[183083]:   <sysinfo type="smbios">
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <system>
Jan 26 08:48:17 compute-1 nova_compute[183083]:       <entry name="manufacturer">RDO</entry>
Jan 26 08:48:17 compute-1 nova_compute[183083]:       <entry name="product">OpenStack Compute</entry>
Jan 26 08:48:17 compute-1 nova_compute[183083]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 08:48:17 compute-1 nova_compute[183083]:       <entry name="serial">456bc5cb-4197-47ac-b895-7d4e8b3b5e58</entry>
Jan 26 08:48:17 compute-1 nova_compute[183083]:       <entry name="uuid">456bc5cb-4197-47ac-b895-7d4e8b3b5e58</entry>
Jan 26 08:48:17 compute-1 nova_compute[183083]:       <entry name="family">Virtual Machine</entry>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     </system>
Jan 26 08:48:17 compute-1 nova_compute[183083]:   </sysinfo>
Jan 26 08:48:17 compute-1 nova_compute[183083]:   <os>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <boot dev="hd"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <smbios mode="sysinfo"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:   </os>
Jan 26 08:48:17 compute-1 nova_compute[183083]:   <features>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <acpi/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <apic/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <vmcoreinfo/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:   </features>
Jan 26 08:48:17 compute-1 nova_compute[183083]:   <clock offset="utc">
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <timer name="hpet" present="no"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:   </clock>
Jan 26 08:48:17 compute-1 nova_compute[183083]:   <cpu mode="host-model" match="exact">
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:   </cpu>
Jan 26 08:48:17 compute-1 nova_compute[183083]:   <devices>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <disk type="file" device="disk">
Jan 26 08:48:17 compute-1 nova_compute[183083]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/456bc5cb-4197-47ac-b895-7d4e8b3b5e58/disk"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:       <target dev="vda" bus="virtio"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     </disk>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <disk type="file" device="cdrom">
Jan 26 08:48:17 compute-1 nova_compute[183083]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/456bc5cb-4197-47ac-b895-7d4e8b3b5e58/disk.config"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:       <target dev="sda" bus="sata"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     </disk>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <interface type="ethernet">
Jan 26 08:48:17 compute-1 nova_compute[183083]:       <mac address="fa:16:3e:cb:37:30"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:       <mtu size="1342"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:       <target dev="tapa14c1ec1-db"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     </interface>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <serial type="pty">
Jan 26 08:48:17 compute-1 nova_compute[183083]:       <log file="/var/lib/nova/instances/456bc5cb-4197-47ac-b895-7d4e8b3b5e58/console.log" append="off"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     </serial>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <video>
Jan 26 08:48:17 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     </video>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <input type="tablet" bus="usb"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <rng model="virtio">
Jan 26 08:48:17 compute-1 nova_compute[183083]:       <backend model="random">/dev/urandom</backend>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     </rng>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <controller type="usb" index="0"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     <memballoon model="virtio">
Jan 26 08:48:17 compute-1 nova_compute[183083]:       <stats period="10"/>
Jan 26 08:48:17 compute-1 nova_compute[183083]:     </memballoon>
Jan 26 08:48:17 compute-1 nova_compute[183083]:   </devices>
Jan 26 08:48:17 compute-1 nova_compute[183083]: </domain>
Jan 26 08:48:17 compute-1 nova_compute[183083]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.171 183087 DEBUG nova.compute.manager [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Preparing to wait for external event network-vif-plugged-a14c1ec1-db66-4519-83e7-c1d8ebc20851 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.171 183087 DEBUG oslo_concurrency.lockutils [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "456bc5cb-4197-47ac-b895-7d4e8b3b5e58-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.172 183087 DEBUG oslo_concurrency.lockutils [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "456bc5cb-4197-47ac-b895-7d4e8b3b5e58-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.172 183087 DEBUG oslo_concurrency.lockutils [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "456bc5cb-4197-47ac-b895-7d4e8b3b5e58-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.173 183087 DEBUG nova.virt.libvirt.vif [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:48:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1717515504',display_name='tempest-server-test-1717515504',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1717515504',id=32,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHR6EJ0M3L9T9j8F+dLuTeoyDdnWZCi8ic8NnDlm++GcyV15uFS8BCYLqsqkAatnDxNEdqPcffpohwDfIhW8BevW7ZqTxZGVxsZmagZkz7C/Tn9HOibVv/vjkHnqrvkfkA==',key_name='tempest-keypair-test-713690759',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4a559c36b13649d98b2995c099340eb9',ramdisk_id='',reservation_id='r-lc7rdn7t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_r
ng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortSecurityTest-508365101',owner_user_name='tempest-PortSecurityTest-508365101-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:48:13Z,user_data=None,user_id='52d582094c584036ba3e04c9da69ee02',uuid=456bc5cb-4197-47ac-b895-7d4e8b3b5e58,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a14c1ec1-db66-4519-83e7-c1d8ebc20851", "address": "fa:16:3e:cb:37:30", "network": {"id": "9006b908-3439-4b0e-b89f-6a6dbb60f4a7", "bridge": "br-int", "label": "tempest-test-network--519498246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa14c1ec1-db", "ovs_interfaceid": "a14c1ec1-db66-4519-83e7-c1d8ebc20851", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.173 183087 DEBUG nova.network.os_vif_util [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converting VIF {"id": "a14c1ec1-db66-4519-83e7-c1d8ebc20851", "address": "fa:16:3e:cb:37:30", "network": {"id": "9006b908-3439-4b0e-b89f-6a6dbb60f4a7", "bridge": "br-int", "label": "tempest-test-network--519498246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa14c1ec1-db", "ovs_interfaceid": "a14c1ec1-db66-4519-83e7-c1d8ebc20851", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.174 183087 DEBUG nova.network.os_vif_util [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cb:37:30,bridge_name='br-int',has_traffic_filtering=True,id=a14c1ec1-db66-4519-83e7-c1d8ebc20851,network=Network(9006b908-3439-4b0e-b89f-6a6dbb60f4a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa14c1ec1-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.174 183087 DEBUG os_vif [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:37:30,bridge_name='br-int',has_traffic_filtering=True,id=a14c1ec1-db66-4519-83e7-c1d8ebc20851,network=Network(9006b908-3439-4b0e-b89f-6a6dbb60f4a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa14c1ec1-db') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.175 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.176 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.176 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.179 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.180 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa14c1ec1-db, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.180 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa14c1ec1-db, col_values=(('external_ids', {'iface-id': 'a14c1ec1-db66-4519-83e7-c1d8ebc20851', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cb:37:30', 'vm-uuid': '456bc5cb-4197-47ac-b895-7d4e8b3b5e58'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.182 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:17 compute-1 NetworkManager[55451]: <info>  [1769417297.1836] manager: (tapa14c1ec1-db): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.184 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.191 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.192 183087 INFO os_vif [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:37:30,bridge_name='br-int',has_traffic_filtering=True,id=a14c1ec1-db66-4519-83e7-c1d8ebc20851,network=Network(9006b908-3439-4b0e-b89f-6a6dbb60f4a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa14c1ec1-db')
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.268 183087 DEBUG nova.virt.libvirt.driver [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.268 183087 DEBUG nova.virt.libvirt.driver [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.269 183087 DEBUG nova.virt.libvirt.driver [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] No VIF found with MAC fa:16:3e:cb:37:30, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.270 183087 INFO nova.virt.libvirt.driver [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Using config drive
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.603 183087 INFO nova.virt.libvirt.driver [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Creating config drive at /var/lib/nova/instances/456bc5cb-4197-47ac-b895-7d4e8b3b5e58/disk.config
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.610 183087 DEBUG oslo_concurrency.processutils [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/456bc5cb-4197-47ac-b895-7d4e8b3b5e58/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo2x2xxbj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.750 183087 DEBUG oslo_concurrency.processutils [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/456bc5cb-4197-47ac-b895-7d4e8b3b5e58/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo2x2xxbj" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:48:17 compute-1 kernel: tapa14c1ec1-db: entered promiscuous mode
Jan 26 08:48:17 compute-1 NetworkManager[55451]: <info>  [1769417297.8130] manager: (tapa14c1ec1-db): new Tun device (/org/freedesktop/NetworkManager/Devices/61)
Jan 26 08:48:17 compute-1 ovn_controller[95352]: 2026-01-26T08:48:17Z|00145|binding|INFO|Claiming lport a14c1ec1-db66-4519-83e7-c1d8ebc20851 for this chassis.
Jan 26 08:48:17 compute-1 ovn_controller[95352]: 2026-01-26T08:48:17Z|00146|binding|INFO|a14c1ec1-db66-4519-83e7-c1d8ebc20851: Claiming fa:16:3e:cb:37:30 10.100.0.5
Jan 26 08:48:17 compute-1 ovn_controller[95352]: 2026-01-26T08:48:17Z|00147|binding|INFO|a14c1ec1-db66-4519-83e7-c1d8ebc20851: Claiming unknown
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.814 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:17 compute-1 systemd-udevd[215058]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 08:48:17 compute-1 ovn_controller[95352]: 2026-01-26T08:48:17Z|00148|binding|INFO|Setting lport a14c1ec1-db66-4519-83e7-c1d8ebc20851 ovn-installed in OVS
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.843 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.847 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:17 compute-1 ovn_controller[95352]: 2026-01-26T08:48:17Z|00149|binding|INFO|Setting lport a14c1ec1-db66-4519-83e7-c1d8ebc20851 up in Southbound
Jan 26 08:48:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:17.851 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cb:37:30 10.100.0.5', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '456bc5cb-4197-47ac-b895-7d4e8b3b5e58', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9006b908-3439-4b0e-b89f-6a6dbb60f4a7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4a559c36b13649d98b2995c099340eb9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5229e7d1-fe13-4532-bccb-d60478a3e25e, chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=a14c1ec1-db66-4519-83e7-c1d8ebc20851) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:48:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:17.852 104632 INFO neutron.agent.ovn.metadata.agent [-] Port a14c1ec1-db66-4519-83e7-c1d8ebc20851 in datapath 9006b908-3439-4b0e-b89f-6a6dbb60f4a7 bound to our chassis
Jan 26 08:48:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:17.855 104632 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9006b908-3439-4b0e-b89f-6a6dbb60f4a7
Jan 26 08:48:17 compute-1 NetworkManager[55451]: <info>  [1769417297.8572] device (tapa14c1ec1-db): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 08:48:17 compute-1 NetworkManager[55451]: <info>  [1769417297.8584] device (tapa14c1ec1-db): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 08:48:17 compute-1 systemd-machined[154360]: New machine qemu-8-instance-00000020.
Jan 26 08:48:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:17.873 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[49c64c46-a355-47b1-aca8-3df41d8887f9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:48:17 compute-1 systemd[1]: Started Virtual Machine qemu-8-instance-00000020.
Jan 26 08:48:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:17.905 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[feb1aec6-c73b-4583-9a1f-dc95afacfbbd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:48:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:17.909 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[b7bcad95-cca7-4d9a-a3fb-9e3361648149]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:48:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:17.944 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[c3c0e2d6-7147-4352-b13e-2b4200c23254]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:48:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:17.968 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[33f8163b-1f29-4f3e-a241-712098c14a53]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9006b908-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6e:9a:a7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 15, 'tx_packets': 5, 'rx_bytes': 1022, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 15, 'tx_packets': 5, 'rx_bytes': 1022, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 359477, 'reachable_time': 19080, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 10, 'inoctets': 672, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 10, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 672, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 10, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215075, 'error': None, 'target': 'ovnmeta-9006b908-3439-4b0e-b89f-6a6dbb60f4a7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:48:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:17.992 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[8dd8e3fd-b1e6-4069-b51e-47b7d9626774]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap9006b908-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 359493, 'tstamp': 359493}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215076, 'error': None, 'target': 'ovnmeta-9006b908-3439-4b0e-b89f-6a6dbb60f4a7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap9006b908-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 359497, 'tstamp': 359497}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215076, 'error': None, 'target': 'ovnmeta-9006b908-3439-4b0e-b89f-6a6dbb60f4a7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:48:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:17.994 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9006b908-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.996 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:17 compute-1 nova_compute[183083]: 2026-01-26 08:48:17.997 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:17.998 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9006b908-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:48:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:17.999 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:48:18 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:18.000 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9006b908-30, col_values=(('external_ids', {'iface-id': '2e241887-a928-4a33-98c2-6228ee06108e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:48:18 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:18.000 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:48:18 compute-1 nova_compute[183083]: 2026-01-26 08:48:18.227 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417298.22632, 456bc5cb-4197-47ac-b895-7d4e8b3b5e58 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:48:18 compute-1 nova_compute[183083]: 2026-01-26 08:48:18.227 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] VM Started (Lifecycle Event)
Jan 26 08:48:18 compute-1 nova_compute[183083]: 2026-01-26 08:48:18.255 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:48:18 compute-1 nova_compute[183083]: 2026-01-26 08:48:18.261 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417298.2264857, 456bc5cb-4197-47ac-b895-7d4e8b3b5e58 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:48:18 compute-1 nova_compute[183083]: 2026-01-26 08:48:18.261 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] VM Paused (Lifecycle Event)
Jan 26 08:48:18 compute-1 nova_compute[183083]: 2026-01-26 08:48:18.277 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:48:18 compute-1 nova_compute[183083]: 2026-01-26 08:48:18.283 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 08:48:18 compute-1 nova_compute[183083]: 2026-01-26 08:48:18.306 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 08:48:20 compute-1 nova_compute[183083]: 2026-01-26 08:48:20.030 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:20 compute-1 nova_compute[183083]: 2026-01-26 08:48:20.125 183087 DEBUG nova.network.neutron [req-bd661fe6-1ab5-41a9-851a-b7b3fed9d5cd req-bd40ea14-165f-4db4-9bf0-fbcf854e41bb 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Updated VIF entry in instance network info cache for port a14c1ec1-db66-4519-83e7-c1d8ebc20851. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:48:20 compute-1 nova_compute[183083]: 2026-01-26 08:48:20.126 183087 DEBUG nova.network.neutron [req-bd661fe6-1ab5-41a9-851a-b7b3fed9d5cd req-bd40ea14-165f-4db4-9bf0-fbcf854e41bb 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Updating instance_info_cache with network_info: [{"id": "a14c1ec1-db66-4519-83e7-c1d8ebc20851", "address": "fa:16:3e:cb:37:30", "network": {"id": "9006b908-3439-4b0e-b89f-6a6dbb60f4a7", "bridge": "br-int", "label": "tempest-test-network--519498246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa14c1ec1-db", "ovs_interfaceid": "a14c1ec1-db66-4519-83e7-c1d8ebc20851", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:48:20 compute-1 nova_compute[183083]: 2026-01-26 08:48:20.141 183087 DEBUG oslo_concurrency.lockutils [req-bd661fe6-1ab5-41a9-851a-b7b3fed9d5cd req-bd40ea14-165f-4db4-9bf0-fbcf854e41bb 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-456bc5cb-4197-47ac-b895-7d4e8b3b5e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:48:20 compute-1 podman[215084]: 2026-01-26 08:48:20.828183316 +0000 UTC m=+0.082907353 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 08:48:20 compute-1 podman[215085]: 2026-01-26 08:48:20.845154537 +0000 UTC m=+0.090802057 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-type=git, release=1755695350, distribution-scope=public, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 26 08:48:22 compute-1 nova_compute[183083]: 2026-01-26 08:48:22.184 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:22 compute-1 nova_compute[183083]: 2026-01-26 08:48:22.957 183087 DEBUG nova.compute.manager [req-aa387db6-0ff6-4364-85fe-86b57a83656a req-3f495597-8c25-4da6-8dce-a8713ab7c939 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Received event network-vif-plugged-a14c1ec1-db66-4519-83e7-c1d8ebc20851 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:48:22 compute-1 nova_compute[183083]: 2026-01-26 08:48:22.958 183087 DEBUG oslo_concurrency.lockutils [req-aa387db6-0ff6-4364-85fe-86b57a83656a req-3f495597-8c25-4da6-8dce-a8713ab7c939 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "456bc5cb-4197-47ac-b895-7d4e8b3b5e58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:22 compute-1 nova_compute[183083]: 2026-01-26 08:48:22.959 183087 DEBUG oslo_concurrency.lockutils [req-aa387db6-0ff6-4364-85fe-86b57a83656a req-3f495597-8c25-4da6-8dce-a8713ab7c939 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "456bc5cb-4197-47ac-b895-7d4e8b3b5e58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:22 compute-1 nova_compute[183083]: 2026-01-26 08:48:22.959 183087 DEBUG oslo_concurrency.lockutils [req-aa387db6-0ff6-4364-85fe-86b57a83656a req-3f495597-8c25-4da6-8dce-a8713ab7c939 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "456bc5cb-4197-47ac-b895-7d4e8b3b5e58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:22 compute-1 nova_compute[183083]: 2026-01-26 08:48:22.959 183087 DEBUG nova.compute.manager [req-aa387db6-0ff6-4364-85fe-86b57a83656a req-3f495597-8c25-4da6-8dce-a8713ab7c939 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Processing event network-vif-plugged-a14c1ec1-db66-4519-83e7-c1d8ebc20851 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 26 08:48:22 compute-1 nova_compute[183083]: 2026-01-26 08:48:22.961 183087 DEBUG nova.compute.manager [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 26 08:48:22 compute-1 nova_compute[183083]: 2026-01-26 08:48:22.968 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417302.96795, 456bc5cb-4197-47ac-b895-7d4e8b3b5e58 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:48:22 compute-1 nova_compute[183083]: 2026-01-26 08:48:22.969 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] VM Resumed (Lifecycle Event)
Jan 26 08:48:22 compute-1 nova_compute[183083]: 2026-01-26 08:48:22.972 183087 DEBUG nova.virt.libvirt.driver [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 26 08:48:22 compute-1 nova_compute[183083]: 2026-01-26 08:48:22.979 183087 INFO nova.virt.libvirt.driver [-] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Instance spawned successfully.
Jan 26 08:48:22 compute-1 nova_compute[183083]: 2026-01-26 08:48:22.980 183087 DEBUG nova.virt.libvirt.driver [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 26 08:48:22 compute-1 nova_compute[183083]: 2026-01-26 08:48:22.992 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:48:22 compute-1 nova_compute[183083]: 2026-01-26 08:48:22.997 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 08:48:23 compute-1 nova_compute[183083]: 2026-01-26 08:48:23.011 183087 DEBUG nova.virt.libvirt.driver [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:48:23 compute-1 nova_compute[183083]: 2026-01-26 08:48:23.011 183087 DEBUG nova.virt.libvirt.driver [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:48:23 compute-1 nova_compute[183083]: 2026-01-26 08:48:23.012 183087 DEBUG nova.virt.libvirt.driver [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:48:23 compute-1 nova_compute[183083]: 2026-01-26 08:48:23.013 183087 DEBUG nova.virt.libvirt.driver [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:48:23 compute-1 nova_compute[183083]: 2026-01-26 08:48:23.014 183087 DEBUG nova.virt.libvirt.driver [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:48:23 compute-1 nova_compute[183083]: 2026-01-26 08:48:23.015 183087 DEBUG nova.virt.libvirt.driver [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:48:23 compute-1 nova_compute[183083]: 2026-01-26 08:48:23.027 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 08:48:23 compute-1 nova_compute[183083]: 2026-01-26 08:48:23.093 183087 INFO nova.compute.manager [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Took 9.25 seconds to spawn the instance on the hypervisor.
Jan 26 08:48:23 compute-1 nova_compute[183083]: 2026-01-26 08:48:23.093 183087 DEBUG nova.compute.manager [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:48:23 compute-1 nova_compute[183083]: 2026-01-26 08:48:23.167 183087 INFO nova.compute.manager [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Took 10.56 seconds to build instance.
Jan 26 08:48:23 compute-1 nova_compute[183083]: 2026-01-26 08:48:23.190 183087 DEBUG oslo_concurrency.lockutils [None req-2a2059ac-decc-4587-aabd-c41024ae9c5f 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "456bc5cb-4197-47ac-b895-7d4e8b3b5e58" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.673s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:23 compute-1 nova_compute[183083]: 2026-01-26 08:48:23.312 183087 DEBUG oslo_concurrency.lockutils [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Acquiring lock "7b62131c-b906-4bf7-b4b0-5d4ae221a523" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:23 compute-1 nova_compute[183083]: 2026-01-26 08:48:23.313 183087 DEBUG oslo_concurrency.lockutils [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "7b62131c-b906-4bf7-b4b0-5d4ae221a523" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:23 compute-1 nova_compute[183083]: 2026-01-26 08:48:23.330 183087 DEBUG nova.compute.manager [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:48:23 compute-1 nova_compute[183083]: 2026-01-26 08:48:23.425 183087 DEBUG oslo_concurrency.lockutils [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:23 compute-1 nova_compute[183083]: 2026-01-26 08:48:23.425 183087 DEBUG oslo_concurrency.lockutils [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:23 compute-1 nova_compute[183083]: 2026-01-26 08:48:23.433 183087 DEBUG nova.virt.hardware [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:48:23 compute-1 nova_compute[183083]: 2026-01-26 08:48:23.434 183087 INFO nova.compute.claims [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:48:23 compute-1 nova_compute[183083]: 2026-01-26 08:48:23.615 183087 DEBUG nova.compute.provider_tree [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:48:23 compute-1 nova_compute[183083]: 2026-01-26 08:48:23.631 183087 DEBUG nova.scheduler.client.report [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:48:23 compute-1 nova_compute[183083]: 2026-01-26 08:48:23.654 183087 DEBUG oslo_concurrency.lockutils [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.229s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:23 compute-1 nova_compute[183083]: 2026-01-26 08:48:23.655 183087 DEBUG nova.compute.manager [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:48:23 compute-1 nova_compute[183083]: 2026-01-26 08:48:23.709 183087 DEBUG nova.compute.manager [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:48:23 compute-1 nova_compute[183083]: 2026-01-26 08:48:23.709 183087 DEBUG nova.network.neutron [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:48:23 compute-1 nova_compute[183083]: 2026-01-26 08:48:23.728 183087 INFO nova.virt.libvirt.driver [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:48:23 compute-1 nova_compute[183083]: 2026-01-26 08:48:23.749 183087 DEBUG nova.compute.manager [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:48:23 compute-1 nova_compute[183083]: 2026-01-26 08:48:23.853 183087 DEBUG nova.compute.manager [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:48:23 compute-1 nova_compute[183083]: 2026-01-26 08:48:23.855 183087 DEBUG nova.virt.libvirt.driver [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:48:23 compute-1 nova_compute[183083]: 2026-01-26 08:48:23.856 183087 INFO nova.virt.libvirt.driver [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Creating image(s)
Jan 26 08:48:23 compute-1 nova_compute[183083]: 2026-01-26 08:48:23.857 183087 DEBUG oslo_concurrency.lockutils [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Acquiring lock "/var/lib/nova/instances/7b62131c-b906-4bf7-b4b0-5d4ae221a523/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:23 compute-1 nova_compute[183083]: 2026-01-26 08:48:23.857 183087 DEBUG oslo_concurrency.lockutils [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "/var/lib/nova/instances/7b62131c-b906-4bf7-b4b0-5d4ae221a523/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:23 compute-1 nova_compute[183083]: 2026-01-26 08:48:23.859 183087 DEBUG oslo_concurrency.lockutils [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "/var/lib/nova/instances/7b62131c-b906-4bf7-b4b0-5d4ae221a523/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:23 compute-1 nova_compute[183083]: 2026-01-26 08:48:23.859 183087 DEBUG oslo_concurrency.lockutils [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:23 compute-1 nova_compute[183083]: 2026-01-26 08:48:23.860 183087 DEBUG oslo_concurrency.lockutils [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:24 compute-1 nova_compute[183083]: 2026-01-26 08:48:24.244 183087 DEBUG nova.policy [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e29b04a4a66d43aaa5e5c4f38eeb59c4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5c33bd5e85114c868a4e91d997a5ceec', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:48:24 compute-1 nova_compute[183083]: 2026-01-26 08:48:24.522 183087 INFO nova.compute.manager [None req-76106d4f-c9ce-475e-9eaf-a2a7fbdb6208 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Get console output
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.002 183087 DEBUG oslo_concurrency.lockutils [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Instance failed to spawn: nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Traceback (most recent call last):
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 170, in do_image_deep_inspection
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523]     raise exception.ImageUnacceptable(
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image content does not match disk_format
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] 
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] During handling of the above exception, another exception occurred:
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] 
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Traceback (most recent call last):
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523]     yield resources
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523]     self.driver.spawn(context, instance, image_meta,
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4384, in spawn
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523]     created_instance_dir, created_disks = self._create_image(
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4786, in _create_image
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523]     created_disks = self._create_and_inject_local_root(
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4917, in _create_and_inject_local_root
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523]     self._try_fetch_image_cache(backend, fetch_func, context,
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 10963, in _try_fetch_image_cache
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523]     image.cache(fetch_func=fetch_func,
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 288, in cache
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523]     self.create_image(fetch_func_sync, base, size,
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 663, in create_image
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523]     prepare_template(target=base, *args, **kwargs)
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523]   File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523]     return f(*args, **kwargs)
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 285, in fetch_func_sync
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523]     fetch_func(target=target, *args, **kwargs)
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/utils.py", line 482, in fetch_image
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523]     images.fetch_to_raw(context, image_id, target, trusted_certs)
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 199, in fetch_to_raw
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523]     force_format = do_image_deep_inspection(img, image_href, path_tmp)
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 180, in do_image_deep_inspection
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523]     raise exception.ImageUnacceptable(
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.003 183087 ERROR nova.compute.manager [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] 
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.079 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.313 183087 DEBUG nova.compute.manager [req-cb860c89-aedb-4ee2-a98a-5c6d95e7cde5 req-1ddb2a5f-4ed8-45a9-a491-2a487d784557 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Received event network-vif-plugged-a14c1ec1-db66-4519-83e7-c1d8ebc20851 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.314 183087 DEBUG oslo_concurrency.lockutils [req-cb860c89-aedb-4ee2-a98a-5c6d95e7cde5 req-1ddb2a5f-4ed8-45a9-a491-2a487d784557 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "456bc5cb-4197-47ac-b895-7d4e8b3b5e58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.314 183087 DEBUG oslo_concurrency.lockutils [req-cb860c89-aedb-4ee2-a98a-5c6d95e7cde5 req-1ddb2a5f-4ed8-45a9-a491-2a487d784557 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "456bc5cb-4197-47ac-b895-7d4e8b3b5e58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.315 183087 DEBUG oslo_concurrency.lockutils [req-cb860c89-aedb-4ee2-a98a-5c6d95e7cde5 req-1ddb2a5f-4ed8-45a9-a491-2a487d784557 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "456bc5cb-4197-47ac-b895-7d4e8b3b5e58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.315 183087 DEBUG nova.compute.manager [req-cb860c89-aedb-4ee2-a98a-5c6d95e7cde5 req-1ddb2a5f-4ed8-45a9-a491-2a487d784557 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] No waiting events found dispatching network-vif-plugged-a14c1ec1-db66-4519-83e7-c1d8ebc20851 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.316 183087 WARNING nova.compute.manager [req-cb860c89-aedb-4ee2-a98a-5c6d95e7cde5 req-1ddb2a5f-4ed8-45a9-a491-2a487d784557 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Received unexpected event network-vif-plugged-a14c1ec1-db66-4519-83e7-c1d8ebc20851 for instance with vm_state active and task_state None.
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.405 183087 DEBUG oslo_concurrency.lockutils [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Acquiring lock "a487e350-10fe-48b6-bab4-2943040cfe31" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.406 183087 DEBUG oslo_concurrency.lockutils [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Lock "a487e350-10fe-48b6-bab4-2943040cfe31" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.424 183087 DEBUG nova.compute.manager [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.482 183087 DEBUG oslo_concurrency.lockutils [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.482 183087 DEBUG oslo_concurrency.lockutils [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.490 183087 DEBUG nova.virt.hardware [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.491 183087 INFO nova.compute.claims [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.702 183087 DEBUG nova.compute.provider_tree [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.720 183087 DEBUG nova.scheduler.client.report [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.748 183087 DEBUG oslo_concurrency.lockutils [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.266s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.749 183087 DEBUG nova.compute.manager [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.789 183087 DEBUG nova.compute.manager [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.790 183087 DEBUG nova.network.neutron [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.806 183087 DEBUG nova.network.neutron [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Successfully created port: 3a8f4120-8613-4df5-af8d-293dd234ac8c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.820 183087 INFO nova.virt.libvirt.driver [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.836 183087 DEBUG nova.compute.manager [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.915 183087 DEBUG nova.compute.manager [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.917 183087 DEBUG nova.virt.libvirt.driver [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.917 183087 INFO nova.virt.libvirt.driver [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Creating image(s)
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.918 183087 DEBUG oslo_concurrency.lockutils [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Acquiring lock "/var/lib/nova/instances/a487e350-10fe-48b6-bab4-2943040cfe31/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.919 183087 DEBUG oslo_concurrency.lockutils [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Lock "/var/lib/nova/instances/a487e350-10fe-48b6-bab4-2943040cfe31/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.920 183087 DEBUG oslo_concurrency.lockutils [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Lock "/var/lib/nova/instances/a487e350-10fe-48b6-bab4-2943040cfe31/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.921 183087 DEBUG oslo_concurrency.lockutils [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:25 compute-1 nova_compute[183083]: 2026-01-26 08:48:25.922 183087 DEBUG oslo_concurrency.lockutils [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:26 compute-1 ovn_controller[95352]: 2026-01-26T08:48:26Z|00150|binding|INFO|Releasing lport 2e241887-a928-4a33-98c2-6228ee06108e from this chassis (sb_readonly=0)
Jan 26 08:48:26 compute-1 nova_compute[183083]: 2026-01-26 08:48:26.574 183087 DEBUG nova.policy [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '10286fb05759476d8b51274239727a28', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9680eb9addb04171b834f7dff8ec602d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:48:26 compute-1 nova_compute[183083]: 2026-01-26 08:48:26.601 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:26 compute-1 podman[215126]: 2026-01-26 08:48:26.840623261 +0000 UTC m=+0.085171508 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 08:48:26 compute-1 podman[215125]: 2026-01-26 08:48:26.931856799 +0000 UTC m=+0.180173833 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.186 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.245 183087 DEBUG nova.network.neutron [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Successfully updated port: 3a8f4120-8613-4df5-af8d-293dd234ac8c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.266 183087 DEBUG oslo_concurrency.lockutils [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Acquiring lock "refresh_cache-7b62131c-b906-4bf7-b4b0-5d4ae221a523" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.267 183087 DEBUG oslo_concurrency.lockutils [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Acquired lock "refresh_cache-7b62131c-b906-4bf7-b4b0-5d4ae221a523" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.268 183087 DEBUG nova.network.neutron [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.363 183087 DEBUG oslo_concurrency.lockutils [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.441s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Instance failed to spawn: nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Traceback (most recent call last):
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 170, in do_image_deep_inspection
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31]     raise exception.ImageUnacceptable(
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image content does not match disk_format
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31] 
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31] During handling of the above exception, another exception occurred:
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31] 
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Traceback (most recent call last):
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31]     yield resources
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31]     self.driver.spawn(context, instance, image_meta,
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4384, in spawn
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31]     created_instance_dir, created_disks = self._create_image(
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4786, in _create_image
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31]     created_disks = self._create_and_inject_local_root(
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4917, in _create_and_inject_local_root
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31]     self._try_fetch_image_cache(backend, fetch_func, context,
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 10963, in _try_fetch_image_cache
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31]     image.cache(fetch_func=fetch_func,
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 288, in cache
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31]     self.create_image(fetch_func_sync, base, size,
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 663, in create_image
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31]     prepare_template(target=base, *args, **kwargs)
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31]   File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31]     return f(*args, **kwargs)
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 285, in fetch_func_sync
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31]     fetch_func(target=target, *args, **kwargs)
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/utils.py", line 482, in fetch_image
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31]     images.fetch_to_raw(context, image_id, target, trusted_certs)
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 199, in fetch_to_raw
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31]     force_format = do_image_deep_inspection(img, image_href, path_tmp)
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 180, in do_image_deep_inspection
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31]     raise exception.ImageUnacceptable(
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.364 183087 ERROR nova.compute.manager [instance: a487e350-10fe-48b6-bab4-2943040cfe31] 
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.443 183087 DEBUG nova.compute.manager [req-ef516ace-d80c-4176-abf5-d9bcd558c5e8 req-325042a5-e53a-46c2-8236-9fc3008c0520 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Received event network-changed-3a8f4120-8613-4df5-af8d-293dd234ac8c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.444 183087 DEBUG nova.compute.manager [req-ef516ace-d80c-4176-abf5-d9bcd558c5e8 req-325042a5-e53a-46c2-8236-9fc3008c0520 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Refreshing instance network info cache due to event network-changed-3a8f4120-8613-4df5-af8d-293dd234ac8c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.445 183087 DEBUG oslo_concurrency.lockutils [req-ef516ace-d80c-4176-abf5-d9bcd558c5e8 req-325042a5-e53a-46c2-8236-9fc3008c0520 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-7b62131c-b906-4bf7-b4b0-5d4ae221a523" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:48:27 compute-1 nova_compute[183083]: 2026-01-26 08:48:27.558 183087 DEBUG nova.network.neutron [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:48:27 compute-1 podman[215176]: 2026-01-26 08:48:27.791616503 +0000 UTC m=+0.059711295 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 26 08:48:28 compute-1 nova_compute[183083]: 2026-01-26 08:48:28.013 183087 DEBUG nova.network.neutron [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Successfully updated port: 8b28e6c6-abee-4571-a151-9d17ff3ecc86 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:48:28 compute-1 nova_compute[183083]: 2026-01-26 08:48:28.901 183087 DEBUG nova.compute.manager [req-c7a5708f-4cff-433f-b459-93410bc240a9 req-907dfeec-ebd2-476c-9b84-38cab0ae31dd 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Received event network-changed-8b28e6c6-abee-4571-a151-9d17ff3ecc86 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:48:28 compute-1 nova_compute[183083]: 2026-01-26 08:48:28.902 183087 DEBUG nova.compute.manager [req-c7a5708f-4cff-433f-b459-93410bc240a9 req-907dfeec-ebd2-476c-9b84-38cab0ae31dd 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Refreshing instance network info cache due to event network-changed-8b28e6c6-abee-4571-a151-9d17ff3ecc86. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:48:28 compute-1 nova_compute[183083]: 2026-01-26 08:48:28.903 183087 DEBUG oslo_concurrency.lockutils [req-c7a5708f-4cff-433f-b459-93410bc240a9 req-907dfeec-ebd2-476c-9b84-38cab0ae31dd 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-a487e350-10fe-48b6-bab4-2943040cfe31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:48:28 compute-1 nova_compute[183083]: 2026-01-26 08:48:28.904 183087 DEBUG oslo_concurrency.lockutils [req-c7a5708f-4cff-433f-b459-93410bc240a9 req-907dfeec-ebd2-476c-9b84-38cab0ae31dd 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-a487e350-10fe-48b6-bab4-2943040cfe31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:48:28 compute-1 nova_compute[183083]: 2026-01-26 08:48:28.904 183087 DEBUG nova.network.neutron [req-c7a5708f-4cff-433f-b459-93410bc240a9 req-907dfeec-ebd2-476c-9b84-38cab0ae31dd 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Refreshing network info cache for port 8b28e6c6-abee-4571-a151-9d17ff3ecc86 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.285 183087 DEBUG nova.network.neutron [req-c7a5708f-4cff-433f-b459-93410bc240a9 req-907dfeec-ebd2-476c-9b84-38cab0ae31dd 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.405 183087 DEBUG nova.network.neutron [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Updating instance_info_cache with network_info: [{"id": "3a8f4120-8613-4df5-af8d-293dd234ac8c", "address": "fa:16:3e:98:b3:92", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.217", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a8f4120-86", "ovs_interfaceid": "3a8f4120-8613-4df5-af8d-293dd234ac8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.480 183087 DEBUG oslo_concurrency.lockutils [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Releasing lock "refresh_cache-7b62131c-b906-4bf7-b4b0-5d4ae221a523" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.481 183087 DEBUG nova.compute.manager [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Instance network_info: |[{"id": "3a8f4120-8613-4df5-af8d-293dd234ac8c", "address": "fa:16:3e:98:b3:92", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.217", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a8f4120-86", "ovs_interfaceid": "3a8f4120-8613-4df5-af8d-293dd234ac8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.481 183087 DEBUG oslo_concurrency.lockutils [req-ef516ace-d80c-4176-abf5-d9bcd558c5e8 req-325042a5-e53a-46c2-8236-9fc3008c0520 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-7b62131c-b906-4bf7-b4b0-5d4ae221a523" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.482 183087 DEBUG nova.network.neutron [req-ef516ace-d80c-4176-abf5-d9bcd558c5e8 req-325042a5-e53a-46c2-8236-9fc3008c0520 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Refreshing network info cache for port 3a8f4120-8613-4df5-af8d-293dd234ac8c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.483 183087 INFO nova.compute.manager [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Terminating instance
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.484 183087 DEBUG nova.compute.manager [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.489 183087 DEBUG nova.virt.libvirt.driver [-] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.489 183087 INFO nova.virt.libvirt.driver [-] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Instance destroyed successfully.
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.490 183087 DEBUG nova.virt.libvirt.vif [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:48:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_multicast_north_south-278363712',display_name='tempest-test_multicast_north_south-278363712',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-multicast-north-south-278363712',id=33,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJOJsa8zDc5tOBfBRLm0Qi812u7HVOO6E7MXAGpKZ4/7op/PkClYPLhHmkNgkuxZ09O67J1SPnUd6CxZdN+euoFRi16VYgHaqYzJwS9D1WUns9/BQk7M5SX/0drgiuby9w==',key_name='tempest-keypair-test-1151010750',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5c33bd5e85114c868a4e91d997a5ceec',ramdisk_id='',reservation_id='r-578yve75',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestIPv4Ovn-635971062',owner_user_name='tempest-MulticastTestIPv4Ovn-635971062-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:48:23Z,user_data=None,user_id='e29b04a4a66d43aaa5e5c4f38eeb59c4',uuid=7b62131c-b906-4bf7-b4b0-5d4ae221a523,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3a8f4120-8613-4df5-af8d-293dd234ac8c", "address": "fa:16:3e:98:b3:92", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.217", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a8f4120-86", "ovs_interfaceid": "3a8f4120-8613-4df5-af8d-293dd234ac8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.490 183087 DEBUG nova.network.os_vif_util [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Converting VIF {"id": "3a8f4120-8613-4df5-af8d-293dd234ac8c", "address": "fa:16:3e:98:b3:92", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.217", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a8f4120-86", "ovs_interfaceid": "3a8f4120-8613-4df5-af8d-293dd234ac8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.491 183087 DEBUG nova.network.os_vif_util [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:98:b3:92,bridge_name='br-int',has_traffic_filtering=True,id=3a8f4120-8613-4df5-af8d-293dd234ac8c,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a8f4120-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.491 183087 DEBUG os_vif [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:98:b3:92,bridge_name='br-int',has_traffic_filtering=True,id=3a8f4120-8613-4df5-af8d-293dd234ac8c,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a8f4120-86') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.493 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.493 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3a8f4120-86, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.493 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.495 183087 INFO os_vif [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:98:b3:92,bridge_name='br-int',has_traffic_filtering=True,id=3a8f4120-8613-4df5-af8d-293dd234ac8c,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a8f4120-86')
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.496 183087 INFO nova.virt.libvirt.driver [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Deleting instance files /var/lib/nova/instances/7b62131c-b906-4bf7-b4b0-5d4ae221a523_del
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.496 183087 INFO nova.virt.libvirt.driver [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Deletion of /var/lib/nova/instances/7b62131c-b906-4bf7-b4b0-5d4ae221a523_del complete
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.571 183087 INFO nova.compute.manager [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Took 0.09 seconds to destroy the instance on the hypervisor.
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.572 183087 DEBUG nova.compute.claims [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Aborting claim: <nova.compute.claims.Claim object at 0x7f6c981262b0> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.573 183087 DEBUG oslo_concurrency.lockutils [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.573 183087 DEBUG oslo_concurrency.lockutils [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.722 183087 DEBUG nova.network.neutron [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Successfully updated port: 7d6b3e9f-2aa2-4f8b-88f4-04cdeeb8b064 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.738 183087 DEBUG oslo_concurrency.lockutils [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Acquiring lock "refresh_cache-a487e350-10fe-48b6-bab4-2943040cfe31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.762 183087 DEBUG nova.compute.provider_tree [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.779 183087 DEBUG nova.scheduler.client.report [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.800 183087 DEBUG oslo_concurrency.lockutils [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.801 183087 DEBUG nova.compute.utils [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.802 183087 ERROR nova.compute.manager [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Build of instance 7b62131c-b906-4bf7-b4b0-5d4ae221a523 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format: nova.exception.BuildAbortException: Build of instance 7b62131c-b906-4bf7-b4b0-5d4ae221a523 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.802 183087 DEBUG nova.compute.manager [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.803 183087 DEBUG nova.virt.libvirt.vif [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-26T08:48:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_multicast_north_south-278363712',display_name='tempest-test_multicast_north_south-278363712',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host=None,hostname='tempest-test-multicast-north-south-278363712',id=33,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJOJsa8zDc5tOBfBRLm0Qi812u7HVOO6E7MXAGpKZ4/7op/PkClYPLhHmkNgkuxZ09O67J1SPnUd6CxZdN+euoFRi16VYgHaqYzJwS9D1WUns9/BQk7M5SX/0drgiuby9w==',key_name='tempest-keypair-test-1151010750',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node=None,numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5c33bd5e85114c868a4e91d997a5ceec',ramdisk_id='',reservation_id='r-578yve75',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestIPv4Ovn-635971062',owner_user_name='tempest-MulticastTestIPv4Ovn-635971062-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:48:29Z,user_data=None,user_id='e29b04a4a66d43aaa5e5c4f38eeb59c4',uuid=7b62131c-b906-4bf7-b4b0-5d4ae221a523,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3a8f4120-8613-4df5-af8d-293dd234ac8c", "address": "fa:16:3e:98:b3:92", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.217", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a8f4120-86", "ovs_interfaceid": "3a8f4120-8613-4df5-af8d-293dd234ac8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.803 183087 DEBUG nova.network.os_vif_util [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Converting VIF {"id": "3a8f4120-8613-4df5-af8d-293dd234ac8c", "address": "fa:16:3e:98:b3:92", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.217", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a8f4120-86", "ovs_interfaceid": "3a8f4120-8613-4df5-af8d-293dd234ac8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.804 183087 DEBUG nova.network.os_vif_util [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:98:b3:92,bridge_name='br-int',has_traffic_filtering=True,id=3a8f4120-8613-4df5-af8d-293dd234ac8c,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a8f4120-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.804 183087 DEBUG os_vif [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:98:b3:92,bridge_name='br-int',has_traffic_filtering=True,id=3a8f4120-8613-4df5-af8d-293dd234ac8c,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a8f4120-86') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.805 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.805 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3a8f4120-86, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.806 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.808 183087 INFO nova.compute.manager [None req-698b7050-d0ed-49e5-856a-353b72a7de82 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Get console output
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.809 183087 INFO os_vif [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:98:b3:92,bridge_name='br-int',has_traffic_filtering=True,id=3a8f4120-8613-4df5-af8d-293dd234ac8c,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a8f4120-86')
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.810 183087 DEBUG nova.compute.manager [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.810 183087 DEBUG nova.compute.manager [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:48:29 compute-1 nova_compute[183083]: 2026-01-26 08:48:29.810 183087 DEBUG nova.network.neutron [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:48:30 compute-1 nova_compute[183083]: 2026-01-26 08:48:30.137 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:30 compute-1 nova_compute[183083]: 2026-01-26 08:48:30.163 183087 DEBUG nova.network.neutron [req-c7a5708f-4cff-433f-b459-93410bc240a9 req-907dfeec-ebd2-476c-9b84-38cab0ae31dd 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:48:30 compute-1 nova_compute[183083]: 2026-01-26 08:48:30.191 183087 DEBUG oslo_concurrency.lockutils [req-c7a5708f-4cff-433f-b459-93410bc240a9 req-907dfeec-ebd2-476c-9b84-38cab0ae31dd 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-a487e350-10fe-48b6-bab4-2943040cfe31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:48:30 compute-1 nova_compute[183083]: 2026-01-26 08:48:30.192 183087 DEBUG oslo_concurrency.lockutils [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Acquired lock "refresh_cache-a487e350-10fe-48b6-bab4-2943040cfe31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:48:30 compute-1 nova_compute[183083]: 2026-01-26 08:48:30.192 183087 DEBUG nova.network.neutron [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:48:30 compute-1 nova_compute[183083]: 2026-01-26 08:48:30.415 183087 DEBUG nova.network.neutron [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:48:31 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:31.099 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:48:31 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:31.100 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 08:48:31 compute-1 nova_compute[183083]: 2026-01-26 08:48:31.100 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:31 compute-1 nova_compute[183083]: 2026-01-26 08:48:31.213 183087 DEBUG nova.compute.manager [req-be62a868-4248-4250-8d59-4950f58a79c2 req-5f0df2df-d5ba-44d0-a709-a8257541290d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Received event network-changed-7d6b3e9f-2aa2-4f8b-88f4-04cdeeb8b064 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:48:31 compute-1 nova_compute[183083]: 2026-01-26 08:48:31.214 183087 DEBUG nova.compute.manager [req-be62a868-4248-4250-8d59-4950f58a79c2 req-5f0df2df-d5ba-44d0-a709-a8257541290d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Refreshing instance network info cache due to event network-changed-7d6b3e9f-2aa2-4f8b-88f4-04cdeeb8b064. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:48:31 compute-1 nova_compute[183083]: 2026-01-26 08:48:31.214 183087 DEBUG oslo_concurrency.lockutils [req-be62a868-4248-4250-8d59-4950f58a79c2 req-5f0df2df-d5ba-44d0-a709-a8257541290d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-a487e350-10fe-48b6-bab4-2943040cfe31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:48:31 compute-1 nova_compute[183083]: 2026-01-26 08:48:31.421 183087 DEBUG nova.network.neutron [req-ef516ace-d80c-4176-abf5-d9bcd558c5e8 req-325042a5-e53a-46c2-8236-9fc3008c0520 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Updated VIF entry in instance network info cache for port 3a8f4120-8613-4df5-af8d-293dd234ac8c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:48:31 compute-1 nova_compute[183083]: 2026-01-26 08:48:31.422 183087 DEBUG nova.network.neutron [req-ef516ace-d80c-4176-abf5-d9bcd558c5e8 req-325042a5-e53a-46c2-8236-9fc3008c0520 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Updating instance_info_cache with network_info: [{"id": "3a8f4120-8613-4df5-af8d-293dd234ac8c", "address": "fa:16:3e:98:b3:92", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.217", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a8f4120-86", "ovs_interfaceid": "3a8f4120-8613-4df5-af8d-293dd234ac8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:48:31 compute-1 nova_compute[183083]: 2026-01-26 08:48:31.444 183087 DEBUG oslo_concurrency.lockutils [req-ef516ace-d80c-4176-abf5-d9bcd558c5e8 req-325042a5-e53a-46c2-8236-9fc3008c0520 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-7b62131c-b906-4bf7-b4b0-5d4ae221a523" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:48:31 compute-1 nova_compute[183083]: 2026-01-26 08:48:31.917 183087 DEBUG nova.network.neutron [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:48:31 compute-1 nova_compute[183083]: 2026-01-26 08:48:31.939 183087 INFO nova.compute.manager [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] [instance: 7b62131c-b906-4bf7-b4b0-5d4ae221a523] Took 2.13 seconds to deallocate network for instance.
Jan 26 08:48:32 compute-1 nova_compute[183083]: 2026-01-26 08:48:32.181 183087 INFO nova.scheduler.client.report [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Deleted allocations for instance 7b62131c-b906-4bf7-b4b0-5d4ae221a523
Jan 26 08:48:32 compute-1 nova_compute[183083]: 2026-01-26 08:48:32.182 183087 DEBUG oslo_concurrency.lockutils [None req-826e69da-9723-409e-9f95-0aa69400b108 e29b04a4a66d43aaa5e5c4f38eeb59c4 5c33bd5e85114c868a4e91d997a5ceec - - default default] Lock "7b62131c-b906-4bf7-b4b0-5d4ae221a523" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.869s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:32 compute-1 nova_compute[183083]: 2026-01-26 08:48:32.188 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:34.102 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:48:34 compute-1 ovn_controller[95352]: 2026-01-26T08:48:34Z|00151|binding|INFO|Releasing lport 2e241887-a928-4a33-98c2-6228ee06108e from this chassis (sb_readonly=0)
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.321 183087 DEBUG nova.network.neutron [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Updating instance_info_cache with network_info: [{"id": "8b28e6c6-abee-4571-a151-9d17ff3ecc86", "address": "fa:16:3e:c9:6b:81", "network": {"id": "65753891-6165-49db-aedc-996760bb86fa", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b28e6c6-ab", "ovs_interfaceid": "8b28e6c6-abee-4571-a151-9d17ff3ecc86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "7d6b3e9f-2aa2-4f8b-88f4-04cdeeb8b064", "address": "fa:16:3e:62:9b:c6", "network": {"id": "622238ee-5fcf-4d73-ae82-7f218e8bb199", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "2001:db9::/64", "dns": [], "gateway": {"address": "2001:db9::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db9::39a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d6b3e9f-2a", "ovs_interfaceid": "7d6b3e9f-2aa2-4f8b-88f4-04cdeeb8b064", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.340 183087 DEBUG oslo_concurrency.lockutils [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Releasing lock "refresh_cache-a487e350-10fe-48b6-bab4-2943040cfe31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.340 183087 DEBUG nova.compute.manager [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Instance network_info: |[{"id": "8b28e6c6-abee-4571-a151-9d17ff3ecc86", "address": "fa:16:3e:c9:6b:81", "network": {"id": "65753891-6165-49db-aedc-996760bb86fa", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b28e6c6-ab", "ovs_interfaceid": "8b28e6c6-abee-4571-a151-9d17ff3ecc86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "7d6b3e9f-2aa2-4f8b-88f4-04cdeeb8b064", "address": "fa:16:3e:62:9b:c6", "network": {"id": "622238ee-5fcf-4d73-ae82-7f218e8bb199", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "2001:db9::/64", "dns": [], "gateway": {"address": "2001:db9::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db9::39a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d6b3e9f-2a", "ovs_interfaceid": "7d6b3e9f-2aa2-4f8b-88f4-04cdeeb8b064", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.341 183087 DEBUG oslo_concurrency.lockutils [req-be62a868-4248-4250-8d59-4950f58a79c2 req-5f0df2df-d5ba-44d0-a709-a8257541290d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-a487e350-10fe-48b6-bab4-2943040cfe31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.342 183087 DEBUG nova.network.neutron [req-be62a868-4248-4250-8d59-4950f58a79c2 req-5f0df2df-d5ba-44d0-a709-a8257541290d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Refreshing network info cache for port 7d6b3e9f-2aa2-4f8b-88f4-04cdeeb8b064 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.344 183087 INFO nova.compute.manager [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Terminating instance
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.346 183087 DEBUG nova.compute.manager [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.352 183087 DEBUG nova.virt.libvirt.driver [-] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.353 183087 INFO nova.virt.libvirt.driver [-] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Instance destroyed successfully.
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.355 183087 DEBUG nova.virt.libvirt.vif [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:48:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='server-tempest-MultiPortVlanTransparencyTest-405827926-0',display_name='server-tempest-MultiPortVlanTransparencyTest-405827926-0',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='server-tempest-multiportvlantransparencytest-405827926-0',id=34,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLNC0niFIeKCfmMP15K31NI0Yt2/3arIz4qkOmJrDjDcT91Y+Z8t7cONdYZJsdWgcENTB4qxVqhqRFnFV/+0btrVE9vH3aNVT4hvrQf4Wf4VfrioZkKAghvtRLwxtntaPQ==',key_name='tempest-MultiPortVlanTransparencyTest-405827926',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9680eb9addb04171b834f7dff8ec602d',ramdisk_id='',reservation_id='r-s7c71ifk',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MultiPortVlanTransparencyTest-179273341',owner_user_name='tempest-MultiPortVlanTransparencyTest-179273341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:48:25Z,user_data=None,user_id='10286fb05759476d8b51274239727a28',uuid=a487e350-10fe-48b6-bab4-2943040cfe31,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8b28e6c6-abee-4571-a151-9d17ff3ecc86", "address": "fa:16:3e:c9:6b:81", "network": {"id": "65753891-6165-49db-aedc-996760bb86fa", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", 
"version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b28e6c6-ab", "ovs_interfaceid": "8b28e6c6-abee-4571-a151-9d17ff3ecc86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.355 183087 DEBUG nova.network.os_vif_util [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Converting VIF {"id": "8b28e6c6-abee-4571-a151-9d17ff3ecc86", "address": "fa:16:3e:c9:6b:81", "network": {"id": "65753891-6165-49db-aedc-996760bb86fa", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b28e6c6-ab", "ovs_interfaceid": "8b28e6c6-abee-4571-a151-9d17ff3ecc86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.357 183087 DEBUG nova.network.os_vif_util [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:6b:81,bridge_name='br-int',has_traffic_filtering=True,id=8b28e6c6-abee-4571-a151-9d17ff3ecc86,network=Network(65753891-6165-49db-aedc-996760bb86fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8b28e6c6-ab') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.358 183087 DEBUG os_vif [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:6b:81,bridge_name='br-int',has_traffic_filtering=True,id=8b28e6c6-abee-4571-a151-9d17ff3ecc86,network=Network(65753891-6165-49db-aedc-996760bb86fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8b28e6c6-ab') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.361 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.362 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8b28e6c6-ab, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.362 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.369 183087 INFO os_vif [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:6b:81,bridge_name='br-int',has_traffic_filtering=True,id=8b28e6c6-abee-4571-a151-9d17ff3ecc86,network=Network(65753891-6165-49db-aedc-996760bb86fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8b28e6c6-ab')
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.371 183087 DEBUG nova.virt.libvirt.vif [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:48:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='server-tempest-MultiPortVlanTransparencyTest-405827926-0',display_name='server-tempest-MultiPortVlanTransparencyTest-405827926-0',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='server-tempest-multiportvlantransparencytest-405827926-0',id=34,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLNC0niFIeKCfmMP15K31NI0Yt2/3arIz4qkOmJrDjDcT91Y+Z8t7cONdYZJsdWgcENTB4qxVqhqRFnFV/+0btrVE9vH3aNVT4hvrQf4Wf4VfrioZkKAghvtRLwxtntaPQ==',key_name='tempest-MultiPortVlanTransparencyTest-405827926',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9680eb9addb04171b834f7dff8ec602d',ramdisk_id='',reservation_id='r-s7c71ifk',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MultiPortVlanTransparencyTest-179273341',owner_user_name='tempest-MultiPortVlanTransparencyTest-179273341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:48:25Z,user_data=None,user_id='10286fb05759476d8b51274239727a28',uuid=a487e350-10fe-48b6-bab4-2943040cfe31,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7d6b3e9f-2aa2-4f8b-88f4-04cdeeb8b064", "address": "fa:16:3e:62:9b:c6", "network": {"id": "622238ee-5fcf-4d73-ae82-7f218e8bb199", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "2001:db9::/64", "dns": [], "gateway": {"address": "2001:db9::1", "type": 
"gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db9::39a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d6b3e9f-2a", "ovs_interfaceid": "7d6b3e9f-2aa2-4f8b-88f4-04cdeeb8b064", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.372 183087 DEBUG nova.network.os_vif_util [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Converting VIF {"id": "7d6b3e9f-2aa2-4f8b-88f4-04cdeeb8b064", "address": "fa:16:3e:62:9b:c6", "network": {"id": "622238ee-5fcf-4d73-ae82-7f218e8bb199", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "2001:db9::/64", "dns": [], "gateway": {"address": "2001:db9::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db9::39a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d6b3e9f-2a", "ovs_interfaceid": "7d6b3e9f-2aa2-4f8b-88f4-04cdeeb8b064", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.373 183087 DEBUG nova.network.os_vif_util [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:9b:c6,bridge_name='br-int',has_traffic_filtering=True,id=7d6b3e9f-2aa2-4f8b-88f4-04cdeeb8b064,network=Network(622238ee-5fcf-4d73-ae82-7f218e8bb199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7d6b3e9f-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.374 183087 DEBUG os_vif [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:9b:c6,bridge_name='br-int',has_traffic_filtering=True,id=7d6b3e9f-2aa2-4f8b-88f4-04cdeeb8b064,network=Network(622238ee-5fcf-4d73-ae82-7f218e8bb199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7d6b3e9f-2a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.376 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.377 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7d6b3e9f-2a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.378 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.381 183087 INFO os_vif [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:9b:c6,bridge_name='br-int',has_traffic_filtering=True,id=7d6b3e9f-2aa2-4f8b-88f4-04cdeeb8b064,network=Network(622238ee-5fcf-4d73-ae82-7f218e8bb199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7d6b3e9f-2a')
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.382 183087 INFO nova.virt.libvirt.driver [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Deleting instance files /var/lib/nova/instances/a487e350-10fe-48b6-bab4-2943040cfe31_del
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.383 183087 INFO nova.virt.libvirt.driver [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Deletion of /var/lib/nova/instances/a487e350-10fe-48b6-bab4-2943040cfe31_del complete
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.428 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.459 183087 INFO nova.compute.manager [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Took 0.11 seconds to destroy the instance on the hypervisor.
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.461 183087 DEBUG nova.compute.claims [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Aborting claim: <nova.compute.claims.Claim object at 0x7f6c9850a790> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.461 183087 DEBUG oslo_concurrency.lockutils [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.462 183087 DEBUG oslo_concurrency.lockutils [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.600 183087 DEBUG nova.compute.provider_tree [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.613 183087 DEBUG nova.scheduler.client.report [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.630 183087 DEBUG oslo_concurrency.lockutils [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.632 183087 DEBUG nova.compute.utils [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.632 183087 ERROR nova.compute.manager [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Build of instance a487e350-10fe-48b6-bab4-2943040cfe31 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format: nova.exception.BuildAbortException: Build of instance a487e350-10fe-48b6-bab4-2943040cfe31 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.633 183087 DEBUG nova.compute.manager [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.634 183087 DEBUG nova.virt.libvirt.vif [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-26T08:48:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='server-tempest-MultiPortVlanTransparencyTest-405827926-0',display_name='server-tempest-MultiPortVlanTransparencyTest-405827926-0',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host=None,hostname='server-tempest-multiportvlantransparencytest-405827926-0',id=34,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLNC0niFIeKCfmMP15K31NI0Yt2/3arIz4qkOmJrDjDcT91Y+Z8t7cONdYZJsdWgcENTB4qxVqhqRFnFV/+0btrVE9vH3aNVT4hvrQf4Wf4VfrioZkKAghvtRLwxtntaPQ==',key_name='tempest-MultiPortVlanTransparencyTest-405827926',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node=None,numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9680eb9addb04171b834f7dff8ec602d',ramdisk_id='',reservation_id='r-s7c71ifk',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MultiPortVlanTransparencyTest-179273341',owner_user_name='tempest-MultiPortVlanTransparencyTest-179273341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:48:34Z,user_data=None,user_id='10286fb05759476d8b51274239727a28',uuid=a487e350-10fe-48b6-bab4-2943040cfe31,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8b28e6c6-abee-4571-a151-9d17ff3ecc86", "address": "fa:16:3e:c9:6b:81", "network": {"id": "65753891-6165-49db-aedc-996760bb86fa", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b28e6c6-ab", "ovs_interfaceid": "8b28e6c6-abee-4571-a151-9d17ff3ecc86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.634 183087 DEBUG nova.network.os_vif_util [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Converting VIF {"id": "8b28e6c6-abee-4571-a151-9d17ff3ecc86", "address": "fa:16:3e:c9:6b:81", "network": {"id": "65753891-6165-49db-aedc-996760bb86fa", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b28e6c6-ab", "ovs_interfaceid": "8b28e6c6-abee-4571-a151-9d17ff3ecc86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.635 183087 DEBUG nova.network.os_vif_util [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:6b:81,bridge_name='br-int',has_traffic_filtering=True,id=8b28e6c6-abee-4571-a151-9d17ff3ecc86,network=Network(65753891-6165-49db-aedc-996760bb86fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8b28e6c6-ab') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.636 183087 DEBUG os_vif [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:6b:81,bridge_name='br-int',has_traffic_filtering=True,id=8b28e6c6-abee-4571-a151-9d17ff3ecc86,network=Network(65753891-6165-49db-aedc-996760bb86fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8b28e6c6-ab') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.637 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.637 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8b28e6c6-ab, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.638 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.639 183087 INFO os_vif [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:6b:81,bridge_name='br-int',has_traffic_filtering=True,id=8b28e6c6-abee-4571-a151-9d17ff3ecc86,network=Network(65753891-6165-49db-aedc-996760bb86fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8b28e6c6-ab')
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.640 183087 DEBUG nova.virt.libvirt.vif [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-26T08:48:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='server-tempest-MultiPortVlanTransparencyTest-405827926-0',display_name='server-tempest-MultiPortVlanTransparencyTest-405827926-0',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host=None,hostname='server-tempest-multiportvlantransparencytest-405827926-0',id=34,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLNC0niFIeKCfmMP15K31NI0Yt2/3arIz4qkOmJrDjDcT91Y+Z8t7cONdYZJsdWgcENTB4qxVqhqRFnFV/+0btrVE9vH3aNVT4hvrQf4Wf4VfrioZkKAghvtRLwxtntaPQ==',key_name='tempest-MultiPortVlanTransparencyTest-405827926',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node=None,numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9680eb9addb04171b834f7dff8ec602d',ramdisk_id='',reservation_id='r-s7c71ifk',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MultiPortVlanTransparencyTest-179273341',owner_user_name='tempest-MultiPortVlanTransparencyTest-179273341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:48:34Z,user_data=None,user_id='10286fb05759476d8b51274239727a28',uuid=a487e350-10fe-48b6-bab4-2943040cfe31,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7d6b3e9f-2aa2-4f8b-88f4-04cdeeb8b064", "address": "fa:16:3e:62:9b:c6", "network": {"id": "622238ee-5fcf-4d73-ae82-7f218e8bb199", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "2001:db9::/64", "dns": [], "gateway": {"address": "2001:db9::1", "type": "gateway", 
"version": 6, "meta": {}}, "ips": [{"address": "2001:db9::39a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d6b3e9f-2a", "ovs_interfaceid": "7d6b3e9f-2aa2-4f8b-88f4-04cdeeb8b064", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.641 183087 DEBUG nova.network.os_vif_util [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Converting VIF {"id": "7d6b3e9f-2aa2-4f8b-88f4-04cdeeb8b064", "address": "fa:16:3e:62:9b:c6", "network": {"id": "622238ee-5fcf-4d73-ae82-7f218e8bb199", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "2001:db9::/64", "dns": [], "gateway": {"address": "2001:db9::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db9::39a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d6b3e9f-2a", "ovs_interfaceid": "7d6b3e9f-2aa2-4f8b-88f4-04cdeeb8b064", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.641 183087 DEBUG nova.network.os_vif_util [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:9b:c6,bridge_name='br-int',has_traffic_filtering=True,id=7d6b3e9f-2aa2-4f8b-88f4-04cdeeb8b064,network=Network(622238ee-5fcf-4d73-ae82-7f218e8bb199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7d6b3e9f-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.642 183087 DEBUG os_vif [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:9b:c6,bridge_name='br-int',has_traffic_filtering=True,id=7d6b3e9f-2aa2-4f8b-88f4-04cdeeb8b064,network=Network(622238ee-5fcf-4d73-ae82-7f218e8bb199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7d6b3e9f-2a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.643 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.643 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7d6b3e9f-2a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.644 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.645 183087 INFO os_vif [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:9b:c6,bridge_name='br-int',has_traffic_filtering=True,id=7d6b3e9f-2aa2-4f8b-88f4-04cdeeb8b064,network=Network(622238ee-5fcf-4d73-ae82-7f218e8bb199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7d6b3e9f-2a')
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.646 183087 DEBUG nova.compute.manager [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.646 183087 DEBUG nova.compute.manager [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.646 183087 DEBUG nova.network.neutron [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:48:34 compute-1 ovn_controller[95352]: 2026-01-26T08:48:34Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:cb:37:30 10.100.0.5
Jan 26 08:48:34 compute-1 ovn_controller[95352]: 2026-01-26T08:48:34Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:cb:37:30 10.100.0.5
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.970 183087 INFO nova.compute.manager [None req-b15f7ae2-7487-448b-8f62-5696cf990595 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Get console output
Jan 26 08:48:34 compute-1 nova_compute[183083]: 2026-01-26 08:48:34.976 212375 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 26 08:48:35 compute-1 nova_compute[183083]: 2026-01-26 08:48:35.174 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:36 compute-1 nova_compute[183083]: 2026-01-26 08:48:36.467 183087 DEBUG nova.compute.manager [req-80cbfea0-5560-44a5-9faf-c8b81b829c10 req-aa29d31d-efd1-4c1a-b58c-00c2ea3ca7c1 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Received event network-changed-a14c1ec1-db66-4519-83e7-c1d8ebc20851 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:48:36 compute-1 nova_compute[183083]: 2026-01-26 08:48:36.467 183087 DEBUG nova.compute.manager [req-80cbfea0-5560-44a5-9faf-c8b81b829c10 req-aa29d31d-efd1-4c1a-b58c-00c2ea3ca7c1 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Refreshing instance network info cache due to event network-changed-a14c1ec1-db66-4519-83e7-c1d8ebc20851. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:48:36 compute-1 nova_compute[183083]: 2026-01-26 08:48:36.468 183087 DEBUG oslo_concurrency.lockutils [req-80cbfea0-5560-44a5-9faf-c8b81b829c10 req-aa29d31d-efd1-4c1a-b58c-00c2ea3ca7c1 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-456bc5cb-4197-47ac-b895-7d4e8b3b5e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:48:36 compute-1 nova_compute[183083]: 2026-01-26 08:48:36.469 183087 DEBUG oslo_concurrency.lockutils [req-80cbfea0-5560-44a5-9faf-c8b81b829c10 req-aa29d31d-efd1-4c1a-b58c-00c2ea3ca7c1 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-456bc5cb-4197-47ac-b895-7d4e8b3b5e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:48:36 compute-1 nova_compute[183083]: 2026-01-26 08:48:36.469 183087 DEBUG nova.network.neutron [req-80cbfea0-5560-44a5-9faf-c8b81b829c10 req-aa29d31d-efd1-4c1a-b58c-00c2ea3ca7c1 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Refreshing network info cache for port a14c1ec1-db66-4519-83e7-c1d8ebc20851 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:48:36 compute-1 ovn_controller[95352]: 2026-01-26T08:48:36Z|00152|binding|INFO|Releasing lport 2e241887-a928-4a33-98c2-6228ee06108e from this chassis (sb_readonly=0)
Jan 26 08:48:36 compute-1 nova_compute[183083]: 2026-01-26 08:48:36.724 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:37 compute-1 nova_compute[183083]: 2026-01-26 08:48:37.190 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:37 compute-1 nova_compute[183083]: 2026-01-26 08:48:37.523 183087 DEBUG nova.network.neutron [req-be62a868-4248-4250-8d59-4950f58a79c2 req-5f0df2df-d5ba-44d0-a709-a8257541290d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Updated VIF entry in instance network info cache for port 7d6b3e9f-2aa2-4f8b-88f4-04cdeeb8b064. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:48:37 compute-1 nova_compute[183083]: 2026-01-26 08:48:37.524 183087 DEBUG nova.network.neutron [req-be62a868-4248-4250-8d59-4950f58a79c2 req-5f0df2df-d5ba-44d0-a709-a8257541290d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Updating instance_info_cache with network_info: [{"id": "8b28e6c6-abee-4571-a151-9d17ff3ecc86", "address": "fa:16:3e:c9:6b:81", "network": {"id": "65753891-6165-49db-aedc-996760bb86fa", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b28e6c6-ab", "ovs_interfaceid": "8b28e6c6-abee-4571-a151-9d17ff3ecc86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "7d6b3e9f-2aa2-4f8b-88f4-04cdeeb8b064", "address": "fa:16:3e:62:9b:c6", "network": {"id": "622238ee-5fcf-4d73-ae82-7f218e8bb199", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "2001:db9::/64", "dns": [], "gateway": {"address": "2001:db9::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db9::39a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": 
"9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d6b3e9f-2a", "ovs_interfaceid": "7d6b3e9f-2aa2-4f8b-88f4-04cdeeb8b064", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:48:37 compute-1 nova_compute[183083]: 2026-01-26 08:48:37.546 183087 DEBUG oslo_concurrency.lockutils [req-be62a868-4248-4250-8d59-4950f58a79c2 req-5f0df2df-d5ba-44d0-a709-a8257541290d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-a487e350-10fe-48b6-bab4-2943040cfe31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:48:37 compute-1 sshd-session[215209]: Accepted publickey for zuul from 38.102.83.66 port 33844 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 08:48:37 compute-1 systemd-logind[788]: New session 36 of user zuul.
Jan 26 08:48:37 compute-1 systemd[1]: Started Session 36 of User zuul.
Jan 26 08:48:37 compute-1 sshd-session[215209]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 08:48:38 compute-1 podman[215211]: 2026-01-26 08:48:38.026254686 +0000 UTC m=+0.066832098 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 26 08:48:38 compute-1 sshd-session[215223]: Connection closed by 38.102.83.66 port 33844
Jan 26 08:48:38 compute-1 sshd-session[215209]: pam_unix(sshd:session): session closed for user zuul
Jan 26 08:48:38 compute-1 systemd[1]: session-36.scope: Deactivated successfully.
Jan 26 08:48:38 compute-1 systemd-logind[788]: Session 36 logged out. Waiting for processes to exit.
Jan 26 08:48:38 compute-1 systemd-logind[788]: Removed session 36.
Jan 26 08:48:38 compute-1 nova_compute[183083]: 2026-01-26 08:48:38.933 183087 DEBUG nova.network.neutron [req-80cbfea0-5560-44a5-9faf-c8b81b829c10 req-aa29d31d-efd1-4c1a-b58c-00c2ea3ca7c1 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Updated VIF entry in instance network info cache for port a14c1ec1-db66-4519-83e7-c1d8ebc20851. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:48:38 compute-1 nova_compute[183083]: 2026-01-26 08:48:38.935 183087 DEBUG nova.network.neutron [req-80cbfea0-5560-44a5-9faf-c8b81b829c10 req-aa29d31d-efd1-4c1a-b58c-00c2ea3ca7c1 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Updating instance_info_cache with network_info: [{"id": "a14c1ec1-db66-4519-83e7-c1d8ebc20851", "address": "fa:16:3e:cb:37:30", "network": {"id": "9006b908-3439-4b0e-b89f-6a6dbb60f4a7", "bridge": "br-int", "label": "tempest-test-network--519498246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa14c1ec1-db", "ovs_interfaceid": "a14c1ec1-db66-4519-83e7-c1d8ebc20851", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:48:38 compute-1 nova_compute[183083]: 2026-01-26 08:48:38.965 183087 DEBUG oslo_concurrency.lockutils [req-80cbfea0-5560-44a5-9faf-c8b81b829c10 req-aa29d31d-efd1-4c1a-b58c-00c2ea3ca7c1 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-456bc5cb-4197-47ac-b895-7d4e8b3b5e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:48:39 compute-1 nova_compute[183083]: 2026-01-26 08:48:39.628 183087 DEBUG nova.network.neutron [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:48:39 compute-1 nova_compute[183083]: 2026-01-26 08:48:39.648 183087 INFO nova.compute.manager [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: a487e350-10fe-48b6-bab4-2943040cfe31] Took 5.00 seconds to deallocate network for instance.
Jan 26 08:48:39 compute-1 nova_compute[183083]: 2026-01-26 08:48:39.811 183087 INFO nova.scheduler.client.report [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Deleted allocations for instance a487e350-10fe-48b6-bab4-2943040cfe31
Jan 26 08:48:39 compute-1 nova_compute[183083]: 2026-01-26 08:48:39.812 183087 DEBUG oslo_concurrency.lockutils [None req-796199f4-c16c-468c-a4d7-91f8b8558666 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Lock "a487e350-10fe-48b6-bab4-2943040cfe31" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.406s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:40 compute-1 nova_compute[183083]: 2026-01-26 08:48:40.209 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:40 compute-1 ovn_controller[95352]: 2026-01-26T08:48:40Z|00153|binding|INFO|Releasing lport 2e241887-a928-4a33-98c2-6228ee06108e from this chassis (sb_readonly=0)
Jan 26 08:48:40 compute-1 nova_compute[183083]: 2026-01-26 08:48:40.907 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.061 183087 DEBUG oslo_concurrency.lockutils [None req-764eb279-499f-4a05-a31f-dc9af9709c75 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "456bc5cb-4197-47ac-b895-7d4e8b3b5e58" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.061 183087 DEBUG oslo_concurrency.lockutils [None req-764eb279-499f-4a05-a31f-dc9af9709c75 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "456bc5cb-4197-47ac-b895-7d4e8b3b5e58" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.061 183087 DEBUG oslo_concurrency.lockutils [None req-764eb279-499f-4a05-a31f-dc9af9709c75 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "456bc5cb-4197-47ac-b895-7d4e8b3b5e58-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.062 183087 DEBUG oslo_concurrency.lockutils [None req-764eb279-499f-4a05-a31f-dc9af9709c75 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "456bc5cb-4197-47ac-b895-7d4e8b3b5e58-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.062 183087 DEBUG oslo_concurrency.lockutils [None req-764eb279-499f-4a05-a31f-dc9af9709c75 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "456bc5cb-4197-47ac-b895-7d4e8b3b5e58-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.063 183087 INFO nova.compute.manager [None req-764eb279-499f-4a05-a31f-dc9af9709c75 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Terminating instance
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.064 183087 DEBUG nova.compute.manager [None req-764eb279-499f-4a05-a31f-dc9af9709c75 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:48:41 compute-1 kernel: tapa14c1ec1-db (unregistering): left promiscuous mode
Jan 26 08:48:41 compute-1 NetworkManager[55451]: <info>  [1769417321.0931] device (tapa14c1ec1-db): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.108 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:41 compute-1 ovn_controller[95352]: 2026-01-26T08:48:41Z|00154|binding|INFO|Releasing lport a14c1ec1-db66-4519-83e7-c1d8ebc20851 from this chassis (sb_readonly=0)
Jan 26 08:48:41 compute-1 ovn_controller[95352]: 2026-01-26T08:48:41Z|00155|binding|INFO|Setting lport a14c1ec1-db66-4519-83e7-c1d8ebc20851 down in Southbound
Jan 26 08:48:41 compute-1 ovn_controller[95352]: 2026-01-26T08:48:41Z|00156|binding|INFO|Removing iface tapa14c1ec1-db ovn-installed in OVS
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.113 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:41.119 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cb:37:30 10.100.0.5', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '456bc5cb-4197-47ac-b895-7d4e8b3b5e58', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9006b908-3439-4b0e-b89f-6a6dbb60f4a7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4a559c36b13649d98b2995c099340eb9', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.215'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5229e7d1-fe13-4532-bccb-d60478a3e25e, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=a14c1ec1-db66-4519-83e7-c1d8ebc20851) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:48:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:41.121 104632 INFO neutron.agent.ovn.metadata.agent [-] Port a14c1ec1-db66-4519-83e7-c1d8ebc20851 in datapath 9006b908-3439-4b0e-b89f-6a6dbb60f4a7 unbound from our chassis
Jan 26 08:48:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:41.123 104632 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9006b908-3439-4b0e-b89f-6a6dbb60f4a7
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.128 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:41.138 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[4f53b347-cae2-4cd5-a717-cd6ef76a8da8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:48:41 compute-1 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000020.scope: Deactivated successfully.
Jan 26 08:48:41 compute-1 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000020.scope: Consumed 11.989s CPU time.
Jan 26 08:48:41 compute-1 systemd-machined[154360]: Machine qemu-8-instance-00000020 terminated.
Jan 26 08:48:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:41.164 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[d904aede-a394-4f43-b542-7de7a98df502]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:48:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:41.167 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[ed32726c-64b7-4acc-a906-c51a18f7522a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:48:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:41.194 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[d4aaad47-88dd-4871-8a7c-84a36632d657]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:48:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:41.211 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[162e5337-4cb3-4c36-a858-63b62b970156]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9006b908-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6e:9a:a7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 21, 'tx_packets': 7, 'rx_bytes': 1330, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 21, 'tx_packets': 7, 'rx_bytes': 1330, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 359477, 'reachable_time': 19080, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 12, 'inoctets': 784, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 12, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 784, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 12, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215269, 'error': None, 'target': 'ovnmeta-9006b908-3439-4b0e-b89f-6a6dbb60f4a7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:48:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:41.225 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[ebfe4b50-92fa-477e-9553-fd7e80d03da1]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap9006b908-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 359493, 'tstamp': 359493}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215270, 'error': None, 'target': 'ovnmeta-9006b908-3439-4b0e-b89f-6a6dbb60f4a7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap9006b908-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 359497, 'tstamp': 359497}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215270, 'error': None, 'target': 'ovnmeta-9006b908-3439-4b0e-b89f-6a6dbb60f4a7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:48:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:41.227 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9006b908-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.229 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.232 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:41.233 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9006b908-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:48:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:41.233 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:48:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:41.233 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9006b908-30, col_values=(('external_ids', {'iface-id': '2e241887-a928-4a33-98c2-6228ee06108e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:48:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:41.234 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:48:41 compute-1 ovn_controller[95352]: 2026-01-26T08:48:41Z|00157|binding|INFO|Releasing lport 2e241887-a928-4a33-98c2-6228ee06108e from this chassis (sb_readonly=0)
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.359 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.385 183087 INFO nova.virt.libvirt.driver [-] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Instance destroyed successfully.
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.385 183087 DEBUG nova.objects.instance [None req-764eb279-499f-4a05-a31f-dc9af9709c75 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lazy-loading 'resources' on Instance uuid 456bc5cb-4197-47ac-b895-7d4e8b3b5e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.404 183087 DEBUG nova.virt.libvirt.vif [None req-764eb279-499f-4a05-a31f-dc9af9709c75 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T08:48:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1717515504',display_name='tempest-server-test-1717515504',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1717515504',id=32,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHR6EJ0M3L9T9j8F+dLuTeoyDdnWZCi8ic8NnDlm++GcyV15uFS8BCYLqsqkAatnDxNEdqPcffpohwDfIhW8BevW7ZqTxZGVxsZmagZkz7C/Tn9HOibVv/vjkHnqrvkfkA==',key_name='tempest-keypair-test-713690759',keypairs=<?>,launch_index=0,launched_at=2026-01-26T08:48:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4a559c36b13649d98b2995c099340eb9',ramdisk_id='',reservation_id='r-lc7rdn7t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-PortSecurityTest-508365101',owner_user_name='tempest-PortSecurityTest-508365101-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T08:48:23Z,user_data=None,user_id='52d582094c584036ba3e04c9da69ee02',uuid=456bc5cb-4197-47ac-b895-7d4e8b3b5e58,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a14c1ec1-db66-4519-83e7-c1d8ebc20851", "address": "fa:16:3e:cb:37:30", "network": {"id": "9006b908-3439-4b0e-b89f-6a6dbb60f4a7", "bridge": "br-int", "label": "tempest-test-network--519498246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa14c1ec1-db", "ovs_interfaceid": "a14c1ec1-db66-4519-83e7-c1d8ebc20851", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.404 183087 DEBUG nova.network.os_vif_util [None req-764eb279-499f-4a05-a31f-dc9af9709c75 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converting VIF {"id": "a14c1ec1-db66-4519-83e7-c1d8ebc20851", "address": "fa:16:3e:cb:37:30", "network": {"id": "9006b908-3439-4b0e-b89f-6a6dbb60f4a7", "bridge": "br-int", "label": "tempest-test-network--519498246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa14c1ec1-db", "ovs_interfaceid": "a14c1ec1-db66-4519-83e7-c1d8ebc20851", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.404 183087 DEBUG nova.network.os_vif_util [None req-764eb279-499f-4a05-a31f-dc9af9709c75 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cb:37:30,bridge_name='br-int',has_traffic_filtering=True,id=a14c1ec1-db66-4519-83e7-c1d8ebc20851,network=Network(9006b908-3439-4b0e-b89f-6a6dbb60f4a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa14c1ec1-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.405 183087 DEBUG os_vif [None req-764eb279-499f-4a05-a31f-dc9af9709c75 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cb:37:30,bridge_name='br-int',has_traffic_filtering=True,id=a14c1ec1-db66-4519-83e7-c1d8ebc20851,network=Network(9006b908-3439-4b0e-b89f-6a6dbb60f4a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa14c1ec1-db') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.407 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.407 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa14c1ec1-db, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.408 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.410 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.413 183087 INFO os_vif [None req-764eb279-499f-4a05-a31f-dc9af9709c75 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cb:37:30,bridge_name='br-int',has_traffic_filtering=True,id=a14c1ec1-db66-4519-83e7-c1d8ebc20851,network=Network(9006b908-3439-4b0e-b89f-6a6dbb60f4a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa14c1ec1-db')
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.413 183087 INFO nova.virt.libvirt.driver [None req-764eb279-499f-4a05-a31f-dc9af9709c75 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Deleting instance files /var/lib/nova/instances/456bc5cb-4197-47ac-b895-7d4e8b3b5e58_del
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.414 183087 INFO nova.virt.libvirt.driver [None req-764eb279-499f-4a05-a31f-dc9af9709c75 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Deletion of /var/lib/nova/instances/456bc5cb-4197-47ac-b895-7d4e8b3b5e58_del complete
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.476 183087 INFO nova.compute.manager [None req-764eb279-499f-4a05-a31f-dc9af9709c75 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Took 0.41 seconds to destroy the instance on the hypervisor.
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.477 183087 DEBUG oslo.service.loopingcall [None req-764eb279-499f-4a05-a31f-dc9af9709c75 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.477 183087 DEBUG nova.compute.manager [-] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.477 183087 DEBUG nova.network.neutron [-] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.563 183087 DEBUG nova.compute.manager [req-1499ff71-6896-4124-9138-ed970343aed8 req-b2dc2e5a-72a6-4e50-8814-9ccbbf14b197 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Received event network-vif-unplugged-a14c1ec1-db66-4519-83e7-c1d8ebc20851 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.564 183087 DEBUG oslo_concurrency.lockutils [req-1499ff71-6896-4124-9138-ed970343aed8 req-b2dc2e5a-72a6-4e50-8814-9ccbbf14b197 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "456bc5cb-4197-47ac-b895-7d4e8b3b5e58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.565 183087 DEBUG oslo_concurrency.lockutils [req-1499ff71-6896-4124-9138-ed970343aed8 req-b2dc2e5a-72a6-4e50-8814-9ccbbf14b197 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "456bc5cb-4197-47ac-b895-7d4e8b3b5e58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.565 183087 DEBUG oslo_concurrency.lockutils [req-1499ff71-6896-4124-9138-ed970343aed8 req-b2dc2e5a-72a6-4e50-8814-9ccbbf14b197 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "456bc5cb-4197-47ac-b895-7d4e8b3b5e58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.566 183087 DEBUG nova.compute.manager [req-1499ff71-6896-4124-9138-ed970343aed8 req-b2dc2e5a-72a6-4e50-8814-9ccbbf14b197 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] No waiting events found dispatching network-vif-unplugged-a14c1ec1-db66-4519-83e7-c1d8ebc20851 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:48:41 compute-1 nova_compute[183083]: 2026-01-26 08:48:41.566 183087 DEBUG nova.compute.manager [req-1499ff71-6896-4124-9138-ed970343aed8 req-b2dc2e5a-72a6-4e50-8814-9ccbbf14b197 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Received event network-vif-unplugged-a14c1ec1-db66-4519-83e7-c1d8ebc20851 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 26 08:48:43 compute-1 nova_compute[183083]: 2026-01-26 08:48:43.657 183087 DEBUG nova.compute.manager [req-4e6c1249-c592-4ebe-9277-adf3bf246dda req-46fd83a9-b5a3-4b34-a2bb-97709e62798e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Received event network-vif-plugged-a14c1ec1-db66-4519-83e7-c1d8ebc20851 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:48:43 compute-1 nova_compute[183083]: 2026-01-26 08:48:43.658 183087 DEBUG oslo_concurrency.lockutils [req-4e6c1249-c592-4ebe-9277-adf3bf246dda req-46fd83a9-b5a3-4b34-a2bb-97709e62798e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "456bc5cb-4197-47ac-b895-7d4e8b3b5e58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:43 compute-1 nova_compute[183083]: 2026-01-26 08:48:43.659 183087 DEBUG oslo_concurrency.lockutils [req-4e6c1249-c592-4ebe-9277-adf3bf246dda req-46fd83a9-b5a3-4b34-a2bb-97709e62798e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "456bc5cb-4197-47ac-b895-7d4e8b3b5e58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:43 compute-1 nova_compute[183083]: 2026-01-26 08:48:43.659 183087 DEBUG oslo_concurrency.lockutils [req-4e6c1249-c592-4ebe-9277-adf3bf246dda req-46fd83a9-b5a3-4b34-a2bb-97709e62798e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "456bc5cb-4197-47ac-b895-7d4e8b3b5e58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:43 compute-1 nova_compute[183083]: 2026-01-26 08:48:43.660 183087 DEBUG nova.compute.manager [req-4e6c1249-c592-4ebe-9277-adf3bf246dda req-46fd83a9-b5a3-4b34-a2bb-97709e62798e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] No waiting events found dispatching network-vif-plugged-a14c1ec1-db66-4519-83e7-c1d8ebc20851 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:48:43 compute-1 nova_compute[183083]: 2026-01-26 08:48:43.660 183087 WARNING nova.compute.manager [req-4e6c1249-c592-4ebe-9277-adf3bf246dda req-46fd83a9-b5a3-4b34-a2bb-97709e62798e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Received unexpected event network-vif-plugged-a14c1ec1-db66-4519-83e7-c1d8ebc20851 for instance with vm_state active and task_state deleting.
Jan 26 08:48:44 compute-1 nova_compute[183083]: 2026-01-26 08:48:44.054 183087 DEBUG nova.network.neutron [-] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:48:44 compute-1 nova_compute[183083]: 2026-01-26 08:48:44.081 183087 INFO nova.compute.manager [-] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Took 2.60 seconds to deallocate network for instance.
Jan 26 08:48:44 compute-1 nova_compute[183083]: 2026-01-26 08:48:44.140 183087 DEBUG oslo_concurrency.lockutils [None req-764eb279-499f-4a05-a31f-dc9af9709c75 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:44 compute-1 nova_compute[183083]: 2026-01-26 08:48:44.140 183087 DEBUG oslo_concurrency.lockutils [None req-764eb279-499f-4a05-a31f-dc9af9709c75 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:44 compute-1 nova_compute[183083]: 2026-01-26 08:48:44.234 183087 DEBUG nova.compute.provider_tree [None req-764eb279-499f-4a05-a31f-dc9af9709c75 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:48:44 compute-1 nova_compute[183083]: 2026-01-26 08:48:44.250 183087 DEBUG nova.scheduler.client.report [None req-764eb279-499f-4a05-a31f-dc9af9709c75 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:48:44 compute-1 nova_compute[183083]: 2026-01-26 08:48:44.275 183087 DEBUG oslo_concurrency.lockutils [None req-764eb279-499f-4a05-a31f-dc9af9709c75 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:44 compute-1 nova_compute[183083]: 2026-01-26 08:48:44.303 183087 INFO nova.scheduler.client.report [None req-764eb279-499f-4a05-a31f-dc9af9709c75 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Deleted allocations for instance 456bc5cb-4197-47ac-b895-7d4e8b3b5e58
Jan 26 08:48:44 compute-1 nova_compute[183083]: 2026-01-26 08:48:44.380 183087 DEBUG oslo_concurrency.lockutils [None req-764eb279-499f-4a05-a31f-dc9af9709c75 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "456bc5cb-4197-47ac-b895-7d4e8b3b5e58" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.319s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:45 compute-1 sshd-session[215286]: Connection closed by authenticating user root 159.223.236.81 port 53970 [preauth]
Jan 26 08:48:45 compute-1 nova_compute[183083]: 2026-01-26 08:48:45.212 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:45 compute-1 nova_compute[183083]: 2026-01-26 08:48:45.752 183087 DEBUG oslo_concurrency.lockutils [None req-9f731e58-61a7-4829-95bb-3a52d8b24c12 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "addae953-8eb4-46ed-959d-3c2bb6b31ee3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:45 compute-1 nova_compute[183083]: 2026-01-26 08:48:45.753 183087 DEBUG oslo_concurrency.lockutils [None req-9f731e58-61a7-4829-95bb-3a52d8b24c12 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "addae953-8eb4-46ed-959d-3c2bb6b31ee3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:45 compute-1 nova_compute[183083]: 2026-01-26 08:48:45.753 183087 DEBUG oslo_concurrency.lockutils [None req-9f731e58-61a7-4829-95bb-3a52d8b24c12 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "addae953-8eb4-46ed-959d-3c2bb6b31ee3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:45 compute-1 nova_compute[183083]: 2026-01-26 08:48:45.754 183087 DEBUG oslo_concurrency.lockutils [None req-9f731e58-61a7-4829-95bb-3a52d8b24c12 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "addae953-8eb4-46ed-959d-3c2bb6b31ee3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:45 compute-1 nova_compute[183083]: 2026-01-26 08:48:45.754 183087 DEBUG oslo_concurrency.lockutils [None req-9f731e58-61a7-4829-95bb-3a52d8b24c12 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "addae953-8eb4-46ed-959d-3c2bb6b31ee3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:45 compute-1 nova_compute[183083]: 2026-01-26 08:48:45.756 183087 INFO nova.compute.manager [None req-9f731e58-61a7-4829-95bb-3a52d8b24c12 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Terminating instance
Jan 26 08:48:45 compute-1 nova_compute[183083]: 2026-01-26 08:48:45.757 183087 DEBUG nova.compute.manager [None req-9f731e58-61a7-4829-95bb-3a52d8b24c12 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:48:45 compute-1 kernel: tap09872a2c-36 (unregistering): left promiscuous mode
Jan 26 08:48:45 compute-1 NetworkManager[55451]: <info>  [1769417325.7905] device (tap09872a2c-36): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 08:48:45 compute-1 ovn_controller[95352]: 2026-01-26T08:48:45Z|00158|binding|INFO|Releasing lport 09872a2c-368e-4dca-93d8-5fdd642b03b3 from this chassis (sb_readonly=0)
Jan 26 08:48:45 compute-1 nova_compute[183083]: 2026-01-26 08:48:45.795 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:45 compute-1 ovn_controller[95352]: 2026-01-26T08:48:45Z|00159|binding|INFO|Setting lport 09872a2c-368e-4dca-93d8-5fdd642b03b3 down in Southbound
Jan 26 08:48:45 compute-1 ovn_controller[95352]: 2026-01-26T08:48:45Z|00160|binding|INFO|Removing iface tap09872a2c-36 ovn-installed in OVS
Jan 26 08:48:45 compute-1 nova_compute[183083]: 2026-01-26 08:48:45.797 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:45 compute-1 nova_compute[183083]: 2026-01-26 08:48:45.813 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:45 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:45.813 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:90:46:e8 10.100.0.6', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9006b908-3439-4b0e-b89f-6a6dbb60f4a7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4a559c36b13649d98b2995c099340eb9', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.225'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5229e7d1-fe13-4532-bccb-d60478a3e25e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=09872a2c-368e-4dca-93d8-5fdd642b03b3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:48:45 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:45.816 104632 INFO neutron.agent.ovn.metadata.agent [-] Port 09872a2c-368e-4dca-93d8-5fdd642b03b3 in datapath 9006b908-3439-4b0e-b89f-6a6dbb60f4a7 unbound from our chassis
Jan 26 08:48:45 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:45.819 104632 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9006b908-3439-4b0e-b89f-6a6dbb60f4a7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 26 08:48:45 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:45.821 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[7f212319-6af0-4bf7-8c6e-2f32d94c1205]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:48:45 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:45.822 104632 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9006b908-3439-4b0e-b89f-6a6dbb60f4a7 namespace which is not needed anymore
Jan 26 08:48:45 compute-1 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000019.scope: Deactivated successfully.
Jan 26 08:48:45 compute-1 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000019.scope: Consumed 14.568s CPU time.
Jan 26 08:48:45 compute-1 systemd-machined[154360]: Machine qemu-7-instance-00000019 terminated.
Jan 26 08:48:45 compute-1 neutron-haproxy-ovnmeta-9006b908-3439-4b0e-b89f-6a6dbb60f4a7[214727]: [NOTICE]   (214731) : haproxy version is 2.8.14-c23fe91
Jan 26 08:48:45 compute-1 neutron-haproxy-ovnmeta-9006b908-3439-4b0e-b89f-6a6dbb60f4a7[214727]: [NOTICE]   (214731) : path to executable is /usr/sbin/haproxy
Jan 26 08:48:45 compute-1 neutron-haproxy-ovnmeta-9006b908-3439-4b0e-b89f-6a6dbb60f4a7[214727]: [WARNING]  (214731) : Exiting Master process...
Jan 26 08:48:45 compute-1 neutron-haproxy-ovnmeta-9006b908-3439-4b0e-b89f-6a6dbb60f4a7[214727]: [ALERT]    (214731) : Current worker (214733) exited with code 143 (Terminated)
Jan 26 08:48:45 compute-1 neutron-haproxy-ovnmeta-9006b908-3439-4b0e-b89f-6a6dbb60f4a7[214727]: [WARNING]  (214731) : All workers exited. Exiting... (0)
Jan 26 08:48:45 compute-1 systemd[1]: libpod-f2164f63ec818bde1faf58d87145d12dbde02e77399200edf955d590f4a15941.scope: Deactivated successfully.
Jan 26 08:48:45 compute-1 podman[215313]: 2026-01-26 08:48:45.968944587 +0000 UTC m=+0.049570837 container died f2164f63ec818bde1faf58d87145d12dbde02e77399200edf955d590f4a15941 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9006b908-3439-4b0e-b89f-6a6dbb60f4a7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 26 08:48:45 compute-1 kernel: tap09872a2c-36: entered promiscuous mode
Jan 26 08:48:46 compute-1 ovn_controller[95352]: 2026-01-26T08:48:46Z|00161|binding|INFO|Claiming lport 09872a2c-368e-4dca-93d8-5fdd642b03b3 for this chassis.
Jan 26 08:48:46 compute-1 kernel: tap09872a2c-36 (unregistering): left promiscuous mode
Jan 26 08:48:46 compute-1 ovn_controller[95352]: 2026-01-26T08:48:46Z|00162|binding|INFO|09872a2c-368e-4dca-93d8-5fdd642b03b3: Claiming fa:16:3e:90:46:e8 10.100.0.6
Jan 26 08:48:46 compute-1 ovn_controller[95352]: 2026-01-26T08:48:46Z|00163|binding|INFO|09872a2c-368e-4dca-93d8-5fdd642b03b3: Claiming unknown
Jan 26 08:48:46 compute-1 nova_compute[183083]: 2026-01-26 08:48:46.017 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:46 compute-1 NetworkManager[55451]: <info>  [1769417326.0255] manager: (tap09872a2c-36): new Tun device (/org/freedesktop/NetworkManager/Devices/62)
Jan 26 08:48:46 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:46.027 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:90:46:e8 10.100.0.6', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9006b908-3439-4b0e-b89f-6a6dbb60f4a7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4a559c36b13649d98b2995c099340eb9', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.225'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5229e7d1-fe13-4532-bccb-d60478a3e25e, chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=09872a2c-368e-4dca-93d8-5fdd642b03b3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:48:46 compute-1 ovn_controller[95352]: 2026-01-26T08:48:46Z|00164|binding|INFO|Setting lport 09872a2c-368e-4dca-93d8-5fdd642b03b3 ovn-installed in OVS
Jan 26 08:48:46 compute-1 ovn_controller[95352]: 2026-01-26T08:48:46Z|00165|binding|INFO|Setting lport 09872a2c-368e-4dca-93d8-5fdd642b03b3 up in Southbound
Jan 26 08:48:46 compute-1 nova_compute[183083]: 2026-01-26 08:48:46.039 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:46 compute-1 ovn_controller[95352]: 2026-01-26T08:48:46Z|00166|binding|INFO|Releasing lport 09872a2c-368e-4dca-93d8-5fdd642b03b3 from this chassis (sb_readonly=1)
Jan 26 08:48:46 compute-1 nova_compute[183083]: 2026-01-26 08:48:46.044 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:46 compute-1 ovn_controller[95352]: 2026-01-26T08:48:46Z|00167|binding|INFO|Removing iface tap09872a2c-36 ovn-installed in OVS
Jan 26 08:48:46 compute-1 ovn_controller[95352]: 2026-01-26T08:48:46Z|00168|if_status|INFO|Not setting lport 09872a2c-368e-4dca-93d8-5fdd642b03b3 down as sb is readonly
Jan 26 08:48:46 compute-1 ovn_controller[95352]: 2026-01-26T08:48:46Z|00169|binding|INFO|Releasing lport 09872a2c-368e-4dca-93d8-5fdd642b03b3 from this chassis (sb_readonly=0)
Jan 26 08:48:46 compute-1 ovn_controller[95352]: 2026-01-26T08:48:46Z|00170|binding|INFO|Setting lport 09872a2c-368e-4dca-93d8-5fdd642b03b3 down in Southbound
Jan 26 08:48:46 compute-1 nova_compute[183083]: 2026-01-26 08:48:46.047 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:46 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f2164f63ec818bde1faf58d87145d12dbde02e77399200edf955d590f4a15941-userdata-shm.mount: Deactivated successfully.
Jan 26 08:48:46 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:46.054 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:90:46:e8 10.100.0.6', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'addae953-8eb4-46ed-959d-3c2bb6b31ee3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9006b908-3439-4b0e-b89f-6a6dbb60f4a7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4a559c36b13649d98b2995c099340eb9', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.225'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5229e7d1-fe13-4532-bccb-d60478a3e25e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=09872a2c-368e-4dca-93d8-5fdd642b03b3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:48:46 compute-1 nova_compute[183083]: 2026-01-26 08:48:46.057 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:46 compute-1 systemd[1]: var-lib-containers-storage-overlay-772ff4d0575ed08b431aa3de405bf114bfe559eaebd4b0cd36d0dbc19b43a528-merged.mount: Deactivated successfully.
Jan 26 08:48:46 compute-1 podman[215313]: 2026-01-26 08:48:46.064253212 +0000 UTC m=+0.144879462 container cleanup f2164f63ec818bde1faf58d87145d12dbde02e77399200edf955d590f4a15941 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9006b908-3439-4b0e-b89f-6a6dbb60f4a7, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 08:48:46 compute-1 nova_compute[183083]: 2026-01-26 08:48:46.070 183087 INFO nova.virt.libvirt.driver [-] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Instance destroyed successfully.
Jan 26 08:48:46 compute-1 nova_compute[183083]: 2026-01-26 08:48:46.070 183087 DEBUG nova.objects.instance [None req-9f731e58-61a7-4829-95bb-3a52d8b24c12 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lazy-loading 'resources' on Instance uuid addae953-8eb4-46ed-959d-3c2bb6b31ee3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:48:46 compute-1 systemd[1]: libpod-conmon-f2164f63ec818bde1faf58d87145d12dbde02e77399200edf955d590f4a15941.scope: Deactivated successfully.
Jan 26 08:48:46 compute-1 nova_compute[183083]: 2026-01-26 08:48:46.082 183087 DEBUG nova.virt.libvirt.vif [None req-9f731e58-61a7-4829-95bb-3a52d8b24c12 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T08:47:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1111872342',display_name='tempest-server-test-1111872342',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1111872342',id=25,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHR6EJ0M3L9T9j8F+dLuTeoyDdnWZCi8ic8NnDlm++GcyV15uFS8BCYLqsqkAatnDxNEdqPcffpohwDfIhW8BevW7ZqTxZGVxsZmagZkz7C/Tn9HOibVv/vjkHnqrvkfkA==',key_name='tempest-keypair-test-713690759',keypairs=<?>,launch_index=0,launched_at=2026-01-26T08:47:47Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4a559c36b13649d98b2995c099340eb9',ramdisk_id='',reservation_id='r-e1bdduph',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-PortSecurityTest-508365101',owner_user_name='tempest-PortSecurityTest-508365101-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T08:47:47Z,user_data=None,user_id='52d582094c584036ba3e04c9da69ee02',uuid=addae953-8eb4-46ed-959d-3c2bb6b31ee3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "09872a2c-368e-4dca-93d8-5fdd642b03b3", "address": "fa:16:3e:90:46:e8", "network": {"id": "9006b908-3439-4b0e-b89f-6a6dbb60f4a7", "bridge": "br-int", "label": "tempest-test-network--519498246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09872a2c-36", "ovs_interfaceid": "09872a2c-368e-4dca-93d8-5fdd642b03b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:48:46 compute-1 nova_compute[183083]: 2026-01-26 08:48:46.082 183087 DEBUG nova.network.os_vif_util [None req-9f731e58-61a7-4829-95bb-3a52d8b24c12 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converting VIF {"id": "09872a2c-368e-4dca-93d8-5fdd642b03b3", "address": "fa:16:3e:90:46:e8", "network": {"id": "9006b908-3439-4b0e-b89f-6a6dbb60f4a7", "bridge": "br-int", "label": "tempest-test-network--519498246", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4a559c36b13649d98b2995c099340eb9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09872a2c-36", "ovs_interfaceid": "09872a2c-368e-4dca-93d8-5fdd642b03b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:48:46 compute-1 nova_compute[183083]: 2026-01-26 08:48:46.083 183087 DEBUG nova.network.os_vif_util [None req-9f731e58-61a7-4829-95bb-3a52d8b24c12 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:90:46:e8,bridge_name='br-int',has_traffic_filtering=True,id=09872a2c-368e-4dca-93d8-5fdd642b03b3,network=Network(9006b908-3439-4b0e-b89f-6a6dbb60f4a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap09872a2c-36') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:48:46 compute-1 nova_compute[183083]: 2026-01-26 08:48:46.083 183087 DEBUG os_vif [None req-9f731e58-61a7-4829-95bb-3a52d8b24c12 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:90:46:e8,bridge_name='br-int',has_traffic_filtering=True,id=09872a2c-368e-4dca-93d8-5fdd642b03b3,network=Network(9006b908-3439-4b0e-b89f-6a6dbb60f4a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap09872a2c-36') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:48:46 compute-1 nova_compute[183083]: 2026-01-26 08:48:46.085 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:46 compute-1 nova_compute[183083]: 2026-01-26 08:48:46.085 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09872a2c-36, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:48:46 compute-1 nova_compute[183083]: 2026-01-26 08:48:46.086 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:46 compute-1 nova_compute[183083]: 2026-01-26 08:48:46.087 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:46 compute-1 nova_compute[183083]: 2026-01-26 08:48:46.089 183087 INFO os_vif [None req-9f731e58-61a7-4829-95bb-3a52d8b24c12 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:90:46:e8,bridge_name='br-int',has_traffic_filtering=True,id=09872a2c-368e-4dca-93d8-5fdd642b03b3,network=Network(9006b908-3439-4b0e-b89f-6a6dbb60f4a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap09872a2c-36')
Jan 26 08:48:46 compute-1 nova_compute[183083]: 2026-01-26 08:48:46.090 183087 INFO nova.virt.libvirt.driver [None req-9f731e58-61a7-4829-95bb-3a52d8b24c12 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Deleting instance files /var/lib/nova/instances/addae953-8eb4-46ed-959d-3c2bb6b31ee3_del
Jan 26 08:48:46 compute-1 nova_compute[183083]: 2026-01-26 08:48:46.090 183087 INFO nova.virt.libvirt.driver [None req-9f731e58-61a7-4829-95bb-3a52d8b24c12 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Deletion of /var/lib/nova/instances/addae953-8eb4-46ed-959d-3c2bb6b31ee3_del complete
Jan 26 08:48:46 compute-1 podman[215354]: 2026-01-26 08:48:46.134478454 +0000 UTC m=+0.047287943 container remove f2164f63ec818bde1faf58d87145d12dbde02e77399200edf955d590f4a15941 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9006b908-3439-4b0e-b89f-6a6dbb60f4a7, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 26 08:48:46 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:46.143 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[ea678bf4-c59d-40b6-978a-0ca4b4d67874]: (4, ('Mon Jan 26 08:48:45 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-9006b908-3439-4b0e-b89f-6a6dbb60f4a7 (f2164f63ec818bde1faf58d87145d12dbde02e77399200edf955d590f4a15941)\nf2164f63ec818bde1faf58d87145d12dbde02e77399200edf955d590f4a15941\nMon Jan 26 08:48:46 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-9006b908-3439-4b0e-b89f-6a6dbb60f4a7 (f2164f63ec818bde1faf58d87145d12dbde02e77399200edf955d590f4a15941)\nf2164f63ec818bde1faf58d87145d12dbde02e77399200edf955d590f4a15941\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:48:46 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:46.146 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[db1ef66f-08a7-4aca-968d-fc9b4376557c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:48:46 compute-1 nova_compute[183083]: 2026-01-26 08:48:46.147 183087 INFO nova.compute.manager [None req-9f731e58-61a7-4829-95bb-3a52d8b24c12 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Took 0.39 seconds to destroy the instance on the hypervisor.
Jan 26 08:48:46 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:46.147 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9006b908-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:48:46 compute-1 nova_compute[183083]: 2026-01-26 08:48:46.147 183087 DEBUG oslo.service.loopingcall [None req-9f731e58-61a7-4829-95bb-3a52d8b24c12 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 26 08:48:46 compute-1 nova_compute[183083]: 2026-01-26 08:48:46.148 183087 DEBUG nova.compute.manager [-] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:48:46 compute-1 nova_compute[183083]: 2026-01-26 08:48:46.148 183087 DEBUG nova.network.neutron [-] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:48:46 compute-1 nova_compute[183083]: 2026-01-26 08:48:46.149 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:46 compute-1 kernel: tap9006b908-30: left promiscuous mode
Jan 26 08:48:46 compute-1 nova_compute[183083]: 2026-01-26 08:48:46.175 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:46 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:46.178 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[d702e4c6-755b-4fc2-9c2b-44ca234fcf0c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:48:46 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:46.194 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[08488d98-9fa7-44a1-86ad-a5261a59c61d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:48:46 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:46.195 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[4001e789-88fc-4933-8b32-a4209e64e4c5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:48:46 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:46.216 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[9d89fcf8-faf5-40fd-8f3b-d8133fba0337]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 359467, 'reachable_time': 21763, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215369, 'error': None, 'target': 'ovnmeta-9006b908-3439-4b0e-b89f-6a6dbb60f4a7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:48:46 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:46.218 105024 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9006b908-3439-4b0e-b89f-6a6dbb60f4a7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 26 08:48:46 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:46.218 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[93fdf15a-a7d2-443e-a95c-fbc59aa05aeb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:48:46 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:46.219 104632 INFO neutron.agent.ovn.metadata.agent [-] Port 09872a2c-368e-4dca-93d8-5fdd642b03b3 in datapath 9006b908-3439-4b0e-b89f-6a6dbb60f4a7 unbound from our chassis
Jan 26 08:48:46 compute-1 systemd[1]: run-netns-ovnmeta\x2d9006b908\x2d3439\x2d4b0e\x2db89f\x2d6a6dbb60f4a7.mount: Deactivated successfully.
Jan 26 08:48:46 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:46.222 104632 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9006b908-3439-4b0e-b89f-6a6dbb60f4a7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 26 08:48:46 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:46.223 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[b36c21ad-2964-4892-a083-491f4879b68a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:48:46 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:46.224 104632 INFO neutron.agent.ovn.metadata.agent [-] Port 09872a2c-368e-4dca-93d8-5fdd642b03b3 in datapath 9006b908-3439-4b0e-b89f-6a6dbb60f4a7 unbound from our chassis
Jan 26 08:48:46 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:46.227 104632 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9006b908-3439-4b0e-b89f-6a6dbb60f4a7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 26 08:48:46 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:48:46.227 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[9ff335ab-5be4-4048-8b69-8638e11a1f6b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:48:46 compute-1 nova_compute[183083]: 2026-01-26 08:48:46.374 183087 DEBUG nova.compute.manager [req-087afe78-76a9-4924-8975-a422859d1908 req-17c4c194-723d-4783-9efa-350a2f62495d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Received event network-vif-unplugged-09872a2c-368e-4dca-93d8-5fdd642b03b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:48:46 compute-1 nova_compute[183083]: 2026-01-26 08:48:46.375 183087 DEBUG oslo_concurrency.lockutils [req-087afe78-76a9-4924-8975-a422859d1908 req-17c4c194-723d-4783-9efa-350a2f62495d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "addae953-8eb4-46ed-959d-3c2bb6b31ee3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:46 compute-1 nova_compute[183083]: 2026-01-26 08:48:46.375 183087 DEBUG oslo_concurrency.lockutils [req-087afe78-76a9-4924-8975-a422859d1908 req-17c4c194-723d-4783-9efa-350a2f62495d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "addae953-8eb4-46ed-959d-3c2bb6b31ee3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:46 compute-1 nova_compute[183083]: 2026-01-26 08:48:46.376 183087 DEBUG oslo_concurrency.lockutils [req-087afe78-76a9-4924-8975-a422859d1908 req-17c4c194-723d-4783-9efa-350a2f62495d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "addae953-8eb4-46ed-959d-3c2bb6b31ee3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:46 compute-1 nova_compute[183083]: 2026-01-26 08:48:46.376 183087 DEBUG nova.compute.manager [req-087afe78-76a9-4924-8975-a422859d1908 req-17c4c194-723d-4783-9efa-350a2f62495d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] No waiting events found dispatching network-vif-unplugged-09872a2c-368e-4dca-93d8-5fdd642b03b3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:48:46 compute-1 nova_compute[183083]: 2026-01-26 08:48:46.377 183087 DEBUG nova.compute.manager [req-087afe78-76a9-4924-8975-a422859d1908 req-17c4c194-723d-4783-9efa-350a2f62495d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Received event network-vif-unplugged-09872a2c-368e-4dca-93d8-5fdd642b03b3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 26 08:48:48 compute-1 nova_compute[183083]: 2026-01-26 08:48:48.505 183087 DEBUG nova.compute.manager [req-09a6c1b6-aeaf-40b6-b6dc-c0a8c7d9dd0a req-47d2be10-610a-4eb0-8a31-153d32eab6bc 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Received event network-vif-plugged-09872a2c-368e-4dca-93d8-5fdd642b03b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:48:48 compute-1 nova_compute[183083]: 2026-01-26 08:48:48.505 183087 DEBUG oslo_concurrency.lockutils [req-09a6c1b6-aeaf-40b6-b6dc-c0a8c7d9dd0a req-47d2be10-610a-4eb0-8a31-153d32eab6bc 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "addae953-8eb4-46ed-959d-3c2bb6b31ee3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:48 compute-1 nova_compute[183083]: 2026-01-26 08:48:48.506 183087 DEBUG oslo_concurrency.lockutils [req-09a6c1b6-aeaf-40b6-b6dc-c0a8c7d9dd0a req-47d2be10-610a-4eb0-8a31-153d32eab6bc 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "addae953-8eb4-46ed-959d-3c2bb6b31ee3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:48 compute-1 nova_compute[183083]: 2026-01-26 08:48:48.506 183087 DEBUG oslo_concurrency.lockutils [req-09a6c1b6-aeaf-40b6-b6dc-c0a8c7d9dd0a req-47d2be10-610a-4eb0-8a31-153d32eab6bc 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "addae953-8eb4-46ed-959d-3c2bb6b31ee3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:48 compute-1 nova_compute[183083]: 2026-01-26 08:48:48.507 183087 DEBUG nova.compute.manager [req-09a6c1b6-aeaf-40b6-b6dc-c0a8c7d9dd0a req-47d2be10-610a-4eb0-8a31-153d32eab6bc 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] No waiting events found dispatching network-vif-plugged-09872a2c-368e-4dca-93d8-5fdd642b03b3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:48:48 compute-1 nova_compute[183083]: 2026-01-26 08:48:48.507 183087 WARNING nova.compute.manager [req-09a6c1b6-aeaf-40b6-b6dc-c0a8c7d9dd0a req-47d2be10-610a-4eb0-8a31-153d32eab6bc 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Received unexpected event network-vif-plugged-09872a2c-368e-4dca-93d8-5fdd642b03b3 for instance with vm_state active and task_state deleting.
Jan 26 08:48:48 compute-1 nova_compute[183083]: 2026-01-26 08:48:48.508 183087 DEBUG nova.compute.manager [req-09a6c1b6-aeaf-40b6-b6dc-c0a8c7d9dd0a req-47d2be10-610a-4eb0-8a31-153d32eab6bc 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Received event network-vif-plugged-09872a2c-368e-4dca-93d8-5fdd642b03b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:48:48 compute-1 nova_compute[183083]: 2026-01-26 08:48:48.508 183087 DEBUG oslo_concurrency.lockutils [req-09a6c1b6-aeaf-40b6-b6dc-c0a8c7d9dd0a req-47d2be10-610a-4eb0-8a31-153d32eab6bc 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "addae953-8eb4-46ed-959d-3c2bb6b31ee3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:48 compute-1 nova_compute[183083]: 2026-01-26 08:48:48.508 183087 DEBUG oslo_concurrency.lockutils [req-09a6c1b6-aeaf-40b6-b6dc-c0a8c7d9dd0a req-47d2be10-610a-4eb0-8a31-153d32eab6bc 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "addae953-8eb4-46ed-959d-3c2bb6b31ee3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:48 compute-1 nova_compute[183083]: 2026-01-26 08:48:48.509 183087 DEBUG oslo_concurrency.lockutils [req-09a6c1b6-aeaf-40b6-b6dc-c0a8c7d9dd0a req-47d2be10-610a-4eb0-8a31-153d32eab6bc 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "addae953-8eb4-46ed-959d-3c2bb6b31ee3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:48 compute-1 nova_compute[183083]: 2026-01-26 08:48:48.509 183087 DEBUG nova.compute.manager [req-09a6c1b6-aeaf-40b6-b6dc-c0a8c7d9dd0a req-47d2be10-610a-4eb0-8a31-153d32eab6bc 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] No waiting events found dispatching network-vif-plugged-09872a2c-368e-4dca-93d8-5fdd642b03b3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:48:48 compute-1 nova_compute[183083]: 2026-01-26 08:48:48.510 183087 WARNING nova.compute.manager [req-09a6c1b6-aeaf-40b6-b6dc-c0a8c7d9dd0a req-47d2be10-610a-4eb0-8a31-153d32eab6bc 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Received unexpected event network-vif-plugged-09872a2c-368e-4dca-93d8-5fdd642b03b3 for instance with vm_state active and task_state deleting.
Jan 26 08:48:50 compute-1 nova_compute[183083]: 2026-01-26 08:48:50.236 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:50 compute-1 nova_compute[183083]: 2026-01-26 08:48:50.833 183087 DEBUG nova.network.neutron [-] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:48:50 compute-1 nova_compute[183083]: 2026-01-26 08:48:50.875 183087 INFO nova.compute.manager [-] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Took 4.73 seconds to deallocate network for instance.
Jan 26 08:48:50 compute-1 nova_compute[183083]: 2026-01-26 08:48:50.924 183087 DEBUG nova.compute.manager [req-535bf8ae-c7b3-4280-8d19-434489c54ede req-d557b570-9e60-473e-904c-888af477ecff 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Received event network-vif-plugged-09872a2c-368e-4dca-93d8-5fdd642b03b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:48:50 compute-1 nova_compute[183083]: 2026-01-26 08:48:50.925 183087 DEBUG oslo_concurrency.lockutils [req-535bf8ae-c7b3-4280-8d19-434489c54ede req-d557b570-9e60-473e-904c-888af477ecff 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "addae953-8eb4-46ed-959d-3c2bb6b31ee3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:50 compute-1 nova_compute[183083]: 2026-01-26 08:48:50.926 183087 DEBUG oslo_concurrency.lockutils [req-535bf8ae-c7b3-4280-8d19-434489c54ede req-d557b570-9e60-473e-904c-888af477ecff 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "addae953-8eb4-46ed-959d-3c2bb6b31ee3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:50 compute-1 nova_compute[183083]: 2026-01-26 08:48:50.927 183087 DEBUG oslo_concurrency.lockutils [req-535bf8ae-c7b3-4280-8d19-434489c54ede req-d557b570-9e60-473e-904c-888af477ecff 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "addae953-8eb4-46ed-959d-3c2bb6b31ee3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:50 compute-1 nova_compute[183083]: 2026-01-26 08:48:50.928 183087 DEBUG nova.compute.manager [req-535bf8ae-c7b3-4280-8d19-434489c54ede req-d557b570-9e60-473e-904c-888af477ecff 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] No waiting events found dispatching network-vif-plugged-09872a2c-368e-4dca-93d8-5fdd642b03b3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:48:50 compute-1 nova_compute[183083]: 2026-01-26 08:48:50.928 183087 WARNING nova.compute.manager [req-535bf8ae-c7b3-4280-8d19-434489c54ede req-d557b570-9e60-473e-904c-888af477ecff 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Received unexpected event network-vif-plugged-09872a2c-368e-4dca-93d8-5fdd642b03b3 for instance with vm_state active and task_state deleting.
Jan 26 08:48:50 compute-1 nova_compute[183083]: 2026-01-26 08:48:50.949 183087 DEBUG oslo_concurrency.lockutils [None req-9f731e58-61a7-4829-95bb-3a52d8b24c12 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:50 compute-1 nova_compute[183083]: 2026-01-26 08:48:50.949 183087 DEBUG oslo_concurrency.lockutils [None req-9f731e58-61a7-4829-95bb-3a52d8b24c12 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:51 compute-1 nova_compute[183083]: 2026-01-26 08:48:51.035 183087 DEBUG nova.compute.provider_tree [None req-9f731e58-61a7-4829-95bb-3a52d8b24c12 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:48:51 compute-1 nova_compute[183083]: 2026-01-26 08:48:51.056 183087 DEBUG nova.scheduler.client.report [None req-9f731e58-61a7-4829-95bb-3a52d8b24c12 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:48:51 compute-1 nova_compute[183083]: 2026-01-26 08:48:51.081 183087 DEBUG oslo_concurrency.lockutils [None req-9f731e58-61a7-4829-95bb-3a52d8b24c12 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:51 compute-1 nova_compute[183083]: 2026-01-26 08:48:51.088 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:51 compute-1 nova_compute[183083]: 2026-01-26 08:48:51.113 183087 INFO nova.scheduler.client.report [None req-9f731e58-61a7-4829-95bb-3a52d8b24c12 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Deleted allocations for instance addae953-8eb4-46ed-959d-3c2bb6b31ee3
Jan 26 08:48:51 compute-1 nova_compute[183083]: 2026-01-26 08:48:51.194 183087 DEBUG oslo_concurrency.lockutils [None req-9f731e58-61a7-4829-95bb-3a52d8b24c12 52d582094c584036ba3e04c9da69ee02 4a559c36b13649d98b2995c099340eb9 - - default default] Lock "addae953-8eb4-46ed-959d-3c2bb6b31ee3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.442s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:51 compute-1 podman[215370]: 2026-01-26 08:48:51.846549196 +0000 UTC m=+0.097805326 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 26 08:48:51 compute-1 podman[215371]: 2026-01-26 08:48:51.858304429 +0000 UTC m=+0.104611649 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 26 08:48:53 compute-1 nova_compute[183083]: 2026-01-26 08:48:53.117 183087 DEBUG nova.compute.manager [req-70d394d8-7cd6-46bc-ba5b-3765c1ae72ed req-30676924-632e-4259-b731-a3edf244e1c9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Received event network-vif-plugged-09872a2c-368e-4dca-93d8-5fdd642b03b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:48:53 compute-1 nova_compute[183083]: 2026-01-26 08:48:53.118 183087 DEBUG oslo_concurrency.lockutils [req-70d394d8-7cd6-46bc-ba5b-3765c1ae72ed req-30676924-632e-4259-b731-a3edf244e1c9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "addae953-8eb4-46ed-959d-3c2bb6b31ee3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:53 compute-1 nova_compute[183083]: 2026-01-26 08:48:53.119 183087 DEBUG oslo_concurrency.lockutils [req-70d394d8-7cd6-46bc-ba5b-3765c1ae72ed req-30676924-632e-4259-b731-a3edf244e1c9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "addae953-8eb4-46ed-959d-3c2bb6b31ee3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:53 compute-1 nova_compute[183083]: 2026-01-26 08:48:53.119 183087 DEBUG oslo_concurrency.lockutils [req-70d394d8-7cd6-46bc-ba5b-3765c1ae72ed req-30676924-632e-4259-b731-a3edf244e1c9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "addae953-8eb4-46ed-959d-3c2bb6b31ee3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:53 compute-1 nova_compute[183083]: 2026-01-26 08:48:53.120 183087 DEBUG nova.compute.manager [req-70d394d8-7cd6-46bc-ba5b-3765c1ae72ed req-30676924-632e-4259-b731-a3edf244e1c9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] No waiting events found dispatching network-vif-plugged-09872a2c-368e-4dca-93d8-5fdd642b03b3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:48:53 compute-1 nova_compute[183083]: 2026-01-26 08:48:53.120 183087 WARNING nova.compute.manager [req-70d394d8-7cd6-46bc-ba5b-3765c1ae72ed req-30676924-632e-4259-b731-a3edf244e1c9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Received unexpected event network-vif-plugged-09872a2c-368e-4dca-93d8-5fdd642b03b3 for instance with vm_state deleted and task_state None.
Jan 26 08:48:54 compute-1 nova_compute[183083]: 2026-01-26 08:48:54.534 183087 DEBUG oslo_concurrency.lockutils [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Acquiring lock "92bac713-364f-4aa1-a1e7-27829b917cf0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:54 compute-1 nova_compute[183083]: 2026-01-26 08:48:54.535 183087 DEBUG oslo_concurrency.lockutils [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Lock "92bac713-364f-4aa1-a1e7-27829b917cf0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:54 compute-1 nova_compute[183083]: 2026-01-26 08:48:54.559 183087 DEBUG nova.compute.manager [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:48:54 compute-1 nova_compute[183083]: 2026-01-26 08:48:54.632 183087 DEBUG oslo_concurrency.lockutils [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:54 compute-1 nova_compute[183083]: 2026-01-26 08:48:54.633 183087 DEBUG oslo_concurrency.lockutils [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:54 compute-1 nova_compute[183083]: 2026-01-26 08:48:54.641 183087 DEBUG nova.virt.hardware [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:48:54 compute-1 nova_compute[183083]: 2026-01-26 08:48:54.642 183087 INFO nova.compute.claims [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:48:54 compute-1 nova_compute[183083]: 2026-01-26 08:48:54.800 183087 DEBUG nova.compute.provider_tree [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:48:54 compute-1 nova_compute[183083]: 2026-01-26 08:48:54.816 183087 DEBUG nova.scheduler.client.report [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:48:54 compute-1 nova_compute[183083]: 2026-01-26 08:48:54.849 183087 DEBUG oslo_concurrency.lockutils [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.216s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:54 compute-1 nova_compute[183083]: 2026-01-26 08:48:54.850 183087 DEBUG nova.compute.manager [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:48:54 compute-1 nova_compute[183083]: 2026-01-26 08:48:54.919 183087 DEBUG nova.compute.manager [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:48:54 compute-1 nova_compute[183083]: 2026-01-26 08:48:54.920 183087 DEBUG nova.network.neutron [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:48:54 compute-1 nova_compute[183083]: 2026-01-26 08:48:54.953 183087 INFO nova.virt.libvirt.driver [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:48:55 compute-1 nova_compute[183083]: 2026-01-26 08:48:55.005 183087 DEBUG nova.compute.manager [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:48:55 compute-1 nova_compute[183083]: 2026-01-26 08:48:55.145 183087 DEBUG nova.compute.manager [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:48:55 compute-1 nova_compute[183083]: 2026-01-26 08:48:55.147 183087 DEBUG nova.virt.libvirt.driver [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:48:55 compute-1 nova_compute[183083]: 2026-01-26 08:48:55.148 183087 INFO nova.virt.libvirt.driver [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Creating image(s)
Jan 26 08:48:55 compute-1 nova_compute[183083]: 2026-01-26 08:48:55.149 183087 DEBUG oslo_concurrency.lockutils [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Acquiring lock "/var/lib/nova/instances/92bac713-364f-4aa1-a1e7-27829b917cf0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:55 compute-1 nova_compute[183083]: 2026-01-26 08:48:55.149 183087 DEBUG oslo_concurrency.lockutils [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Lock "/var/lib/nova/instances/92bac713-364f-4aa1-a1e7-27829b917cf0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:55 compute-1 nova_compute[183083]: 2026-01-26 08:48:55.150 183087 DEBUG oslo_concurrency.lockutils [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Lock "/var/lib/nova/instances/92bac713-364f-4aa1-a1e7-27829b917cf0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:55 compute-1 nova_compute[183083]: 2026-01-26 08:48:55.150 183087 DEBUG oslo_concurrency.lockutils [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:48:55 compute-1 nova_compute[183083]: 2026-01-26 08:48:55.151 183087 DEBUG oslo_concurrency.lockutils [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:48:55 compute-1 nova_compute[183083]: 2026-01-26 08:48:55.238 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:56 compute-1 nova_compute[183083]: 2026-01-26 08:48:56.110 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:48:56 compute-1 nova_compute[183083]: 2026-01-26 08:48:56.384 183087 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769417321.3835099, 456bc5cb-4197-47ac-b895-7d4e8b3b5e58 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:48:56 compute-1 nova_compute[183083]: 2026-01-26 08:48:56.385 183087 INFO nova.compute.manager [-] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] VM Stopped (Lifecycle Event)
Jan 26 08:48:56 compute-1 nova_compute[183083]: 2026-01-26 08:48:56.398 183087 DEBUG nova.policy [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '10286fb05759476d8b51274239727a28', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9680eb9addb04171b834f7dff8ec602d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:48:56 compute-1 nova_compute[183083]: 2026-01-26 08:48:56.418 183087 DEBUG nova.compute.manager [None req-8ee1615a-5bad-4ebe-83c0-d9acd498a494 - - - - - -] [instance: 456bc5cb-4197-47ac-b895-7d4e8b3b5e58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:48:57 compute-1 podman[215414]: 2026-01-26 08:48:57.821907074 +0000 UTC m=+0.068050499 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 08:48:57 compute-1 podman[215413]: 2026-01-26 08:48:57.844819767 +0000 UTC m=+0.099281686 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 08:48:57 compute-1 podman[215461]: 2026-01-26 08:48:57.896710672 +0000 UTC m=+0.048566563 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.573 183087 DEBUG oslo_concurrency.lockutils [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 3.423s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Instance failed to spawn: nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Traceback (most recent call last):
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 170, in do_image_deep_inspection
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0]     raise exception.ImageUnacceptable(
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image content does not match disk_format
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] 
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] During handling of the above exception, another exception occurred:
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] 
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Traceback (most recent call last):
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0]     yield resources
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0]     self.driver.spawn(context, instance, image_meta,
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4384, in spawn
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0]     created_instance_dir, created_disks = self._create_image(
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4786, in _create_image
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0]     created_disks = self._create_and_inject_local_root(
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4917, in _create_and_inject_local_root
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0]     self._try_fetch_image_cache(backend, fetch_func, context,
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 10963, in _try_fetch_image_cache
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0]     image.cache(fetch_func=fetch_func,
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 288, in cache
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0]     self.create_image(fetch_func_sync, base, size,
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 663, in create_image
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0]     prepare_template(target=base, *args, **kwargs)
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0]   File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0]     return f(*args, **kwargs)
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 285, in fetch_func_sync
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0]     fetch_func(target=target, *args, **kwargs)
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/utils.py", line 482, in fetch_image
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0]     images.fetch_to_raw(context, image_id, target, trusted_certs)
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 199, in fetch_to_raw
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0]     force_format = do_image_deep_inspection(img, image_href, path_tmp)
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 180, in do_image_deep_inspection
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0]     raise exception.ImageUnacceptable(
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:48:58 compute-1 nova_compute[183083]: 2026-01-26 08:48:58.574 183087 ERROR nova.compute.manager [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] 
Jan 26 08:49:00 compute-1 nova_compute[183083]: 2026-01-26 08:49:00.240 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:49:00 compute-1 nova_compute[183083]: 2026-01-26 08:49:00.273 183087 DEBUG nova.network.neutron [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Successfully updated port: 994fa7a8-2db9-494e-abe7-000d619a08bb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:49:00 compute-1 nova_compute[183083]: 2026-01-26 08:49:00.375 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:49:00 compute-1 nova_compute[183083]: 2026-01-26 08:49:00.376 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:49:00 compute-1 nova_compute[183083]: 2026-01-26 08:49:00.510 183087 DEBUG nova.compute.manager [req-c640e4cc-7ffe-409d-9c6a-185471de878d req-ba0a490e-95da-45c8-a696-891bf008a2e5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Received event network-changed-994fa7a8-2db9-494e-abe7-000d619a08bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:49:00 compute-1 nova_compute[183083]: 2026-01-26 08:49:00.510 183087 DEBUG nova.compute.manager [req-c640e4cc-7ffe-409d-9c6a-185471de878d req-ba0a490e-95da-45c8-a696-891bf008a2e5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Refreshing instance network info cache due to event network-changed-994fa7a8-2db9-494e-abe7-000d619a08bb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:49:00 compute-1 nova_compute[183083]: 2026-01-26 08:49:00.511 183087 DEBUG oslo_concurrency.lockutils [req-c640e4cc-7ffe-409d-9c6a-185471de878d req-ba0a490e-95da-45c8-a696-891bf008a2e5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-92bac713-364f-4aa1-a1e7-27829b917cf0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:49:00 compute-1 nova_compute[183083]: 2026-01-26 08:49:00.511 183087 DEBUG oslo_concurrency.lockutils [req-c640e4cc-7ffe-409d-9c6a-185471de878d req-ba0a490e-95da-45c8-a696-891bf008a2e5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-92bac713-364f-4aa1-a1e7-27829b917cf0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:49:00 compute-1 nova_compute[183083]: 2026-01-26 08:49:00.511 183087 DEBUG nova.network.neutron [req-c640e4cc-7ffe-409d-9c6a-185471de878d req-ba0a490e-95da-45c8-a696-891bf008a2e5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Refreshing network info cache for port 994fa7a8-2db9-494e-abe7-000d619a08bb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:49:00 compute-1 nova_compute[183083]: 2026-01-26 08:49:00.739 183087 DEBUG nova.network.neutron [req-c640e4cc-7ffe-409d-9c6a-185471de878d req-ba0a490e-95da-45c8-a696-891bf008a2e5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:49:00 compute-1 nova_compute[183083]: 2026-01-26 08:49:00.947 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:49:00 compute-1 nova_compute[183083]: 2026-01-26 08:49:00.971 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:49:00 compute-1 nova_compute[183083]: 2026-01-26 08:49:00.972 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 08:49:00 compute-1 nova_compute[183083]: 2026-01-26 08:49:00.972 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 08:49:00 compute-1 nova_compute[183083]: 2026-01-26 08:49:00.990 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 26 08:49:00 compute-1 nova_compute[183083]: 2026-01-26 08:49:00.991 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 08:49:00 compute-1 nova_compute[183083]: 2026-01-26 08:49:00.992 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:49:01 compute-1 nova_compute[183083]: 2026-01-26 08:49:01.068 183087 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769417326.0676322, addae953-8eb4-46ed-959d-3c2bb6b31ee3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:49:01 compute-1 nova_compute[183083]: 2026-01-26 08:49:01.069 183087 INFO nova.compute.manager [-] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] VM Stopped (Lifecycle Event)
Jan 26 08:49:01 compute-1 nova_compute[183083]: 2026-01-26 08:49:01.158 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:49:01 compute-1 nova_compute[183083]: 2026-01-26 08:49:01.159 183087 DEBUG nova.compute.manager [None req-6050857c-162a-46fe-ab8a-532e27fbb99d - - - - - -] [instance: addae953-8eb4-46ed-959d-3c2bb6b31ee3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:49:01 compute-1 nova_compute[183083]: 2026-01-26 08:49:01.442 183087 DEBUG nova.network.neutron [req-c640e4cc-7ffe-409d-9c6a-185471de878d req-ba0a490e-95da-45c8-a696-891bf008a2e5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:49:01 compute-1 nova_compute[183083]: 2026-01-26 08:49:01.459 183087 DEBUG oslo_concurrency.lockutils [req-c640e4cc-7ffe-409d-9c6a-185471de878d req-ba0a490e-95da-45c8-a696-891bf008a2e5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-92bac713-364f-4aa1-a1e7-27829b917cf0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:49:02 compute-1 nova_compute[183083]: 2026-01-26 08:49:02.723 183087 DEBUG nova.network.neutron [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Successfully updated port: ff47496b-1060-4aba-9e1f-c22f684463e9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:49:02 compute-1 nova_compute[183083]: 2026-01-26 08:49:02.739 183087 DEBUG oslo_concurrency.lockutils [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Acquiring lock "refresh_cache-92bac713-364f-4aa1-a1e7-27829b917cf0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:49:02 compute-1 nova_compute[183083]: 2026-01-26 08:49:02.740 183087 DEBUG oslo_concurrency.lockutils [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Acquired lock "refresh_cache-92bac713-364f-4aa1-a1e7-27829b917cf0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:49:02 compute-1 nova_compute[183083]: 2026-01-26 08:49:02.740 183087 DEBUG nova.network.neutron [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:49:02 compute-1 nova_compute[183083]: 2026-01-26 08:49:02.907 183087 DEBUG nova.compute.manager [req-3526cfef-4890-48af-afb4-9775e5cc432c req-1380dd55-d242-4c2a-addc-0f5c5d7e77dc 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Received event network-changed-ff47496b-1060-4aba-9e1f-c22f684463e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:49:02 compute-1 nova_compute[183083]: 2026-01-26 08:49:02.908 183087 DEBUG nova.compute.manager [req-3526cfef-4890-48af-afb4-9775e5cc432c req-1380dd55-d242-4c2a-addc-0f5c5d7e77dc 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Refreshing instance network info cache due to event network-changed-ff47496b-1060-4aba-9e1f-c22f684463e9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:49:02 compute-1 nova_compute[183083]: 2026-01-26 08:49:02.908 183087 DEBUG oslo_concurrency.lockutils [req-3526cfef-4890-48af-afb4-9775e5cc432c req-1380dd55-d242-4c2a-addc-0f5c5d7e77dc 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-92bac713-364f-4aa1-a1e7-27829b917cf0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:49:02 compute-1 nova_compute[183083]: 2026-01-26 08:49:02.991 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:49:03 compute-1 nova_compute[183083]: 2026-01-26 08:49:03.089 183087 DEBUG nova.network.neutron [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:49:03 compute-1 ovn_controller[95352]: 2026-01-26T08:49:03Z|00171|pinctrl|WARN|Dropped 13357 log messages in last 59 seconds (most recently, 1 seconds ago) due to excessive rate
Jan 26 08:49:03 compute-1 ovn_controller[95352]: 2026-01-26T08:49:03Z|00172|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 08:49:03 compute-1 nova_compute[183083]: 2026-01-26 08:49:03.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:49:03 compute-1 nova_compute[183083]: 2026-01-26 08:49:03.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:49:04 compute-1 sshd-session[215485]: Accepted publickey for zuul from 38.102.83.66 port 34668 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 08:49:04 compute-1 systemd-logind[788]: New session 37 of user zuul.
Jan 26 08:49:04 compute-1 systemd[1]: Started Session 37 of User zuul.
Jan 26 08:49:04 compute-1 sshd-session[215485]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 08:49:04 compute-1 sshd-session[215488]: Connection closed by 38.102.83.66 port 34668
Jan 26 08:49:04 compute-1 sshd-session[215485]: pam_unix(sshd:session): session closed for user zuul
Jan 26 08:49:04 compute-1 systemd[1]: session-37.scope: Deactivated successfully.
Jan 26 08:49:04 compute-1 systemd-logind[788]: Session 37 logged out. Waiting for processes to exit.
Jan 26 08:49:04 compute-1 systemd-logind[788]: Removed session 37.
Jan 26 08:49:04 compute-1 nova_compute[183083]: 2026-01-26 08:49:04.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:49:04 compute-1 nova_compute[183083]: 2026-01-26 08:49:04.953 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 08:49:04 compute-1 nova_compute[183083]: 2026-01-26 08:49:04.953 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:49:04 compute-1 nova_compute[183083]: 2026-01-26 08:49:04.983 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:49:04 compute-1 nova_compute[183083]: 2026-01-26 08:49:04.984 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:49:04 compute-1 nova_compute[183083]: 2026-01-26 08:49:04.984 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:49:04 compute-1 nova_compute[183083]: 2026-01-26 08:49:04.985 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 08:49:05 compute-1 nova_compute[183083]: 2026-01-26 08:49:05.206 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:49:05 compute-1 nova_compute[183083]: 2026-01-26 08:49:05.207 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13769MB free_disk=113.0986557006836GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 08:49:05 compute-1 nova_compute[183083]: 2026-01-26 08:49:05.207 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:49:05 compute-1 nova_compute[183083]: 2026-01-26 08:49:05.207 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:49:05 compute-1 nova_compute[183083]: 2026-01-26 08:49:05.295 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:49:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:49:05.300 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:49:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:49:05.301 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:49:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:49:05.301 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:49:05 compute-1 nova_compute[183083]: 2026-01-26 08:49:05.357 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Instance 92bac713-364f-4aa1-a1e7-27829b917cf0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 08:49:05 compute-1 nova_compute[183083]: 2026-01-26 08:49:05.357 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 08:49:05 compute-1 nova_compute[183083]: 2026-01-26 08:49:05.357 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 08:49:05 compute-1 nova_compute[183083]: 2026-01-26 08:49:05.434 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:49:05 compute-1 nova_compute[183083]: 2026-01-26 08:49:05.453 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:49:05 compute-1 nova_compute[183083]: 2026-01-26 08:49:05.481 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 08:49:05 compute-1 nova_compute[183083]: 2026-01-26 08:49:05.482 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.275s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:49:05 compute-1 nova_compute[183083]: 2026-01-26 08:49:05.745 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:49:05 compute-1 nova_compute[183083]: 2026-01-26 08:49:05.942 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:49:06 compute-1 nova_compute[183083]: 2026-01-26 08:49:06.158 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.093 183087 DEBUG nova.network.neutron [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Updating instance_info_cache with network_info: [{"id": "994fa7a8-2db9-494e-abe7-000d619a08bb", "address": "fa:16:3e:0a:67:cf", "network": {"id": "65753891-6165-49db-aedc-996760bb86fa", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap994fa7a8-2d", "ovs_interfaceid": "994fa7a8-2db9-494e-abe7-000d619a08bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "ff47496b-1060-4aba-9e1f-c22f684463e9", "address": "fa:16:3e:b0:47:83", "network": {"id": "5e2c9a2e-ae57-4c1b-a305-7111e9f126ac", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.132", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff47496b-10", "ovs_interfaceid": "ff47496b-1060-4aba-9e1f-c22f684463e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.147 183087 DEBUG oslo_concurrency.lockutils [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Releasing lock "refresh_cache-92bac713-364f-4aa1-a1e7-27829b917cf0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.148 183087 DEBUG nova.compute.manager [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Instance network_info: |[{"id": "994fa7a8-2db9-494e-abe7-000d619a08bb", "address": "fa:16:3e:0a:67:cf", "network": {"id": "65753891-6165-49db-aedc-996760bb86fa", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap994fa7a8-2d", "ovs_interfaceid": "994fa7a8-2db9-494e-abe7-000d619a08bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "ff47496b-1060-4aba-9e1f-c22f684463e9", "address": "fa:16:3e:b0:47:83", "network": {"id": "5e2c9a2e-ae57-4c1b-a305-7111e9f126ac", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.132", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff47496b-10", "ovs_interfaceid": "ff47496b-1060-4aba-9e1f-c22f684463e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.148 183087 DEBUG oslo_concurrency.lockutils [req-3526cfef-4890-48af-afb4-9775e5cc432c req-1380dd55-d242-4c2a-addc-0f5c5d7e77dc 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-92bac713-364f-4aa1-a1e7-27829b917cf0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.148 183087 DEBUG nova.network.neutron [req-3526cfef-4890-48af-afb4-9775e5cc432c req-1380dd55-d242-4c2a-addc-0f5c5d7e77dc 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Refreshing network info cache for port ff47496b-1060-4aba-9e1f-c22f684463e9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.149 183087 INFO nova.compute.manager [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Terminating instance
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.150 183087 DEBUG nova.compute.manager [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.155 183087 DEBUG nova.virt.libvirt.driver [-] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.156 183087 INFO nova.virt.libvirt.driver [-] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Instance destroyed successfully.
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.157 183087 DEBUG nova.virt.libvirt.vif [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:48:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='server-tempest-MultiPortVlanTransparencyTest-405827926-0',display_name='server-tempest-MultiPortVlanTransparencyTest-405827926-0',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='server-tempest-multiportvlantransparencytest-405827926-0',id=35,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLNC0niFIeKCfmMP15K31NI0Yt2/3arIz4qkOmJrDjDcT91Y+Z8t7cONdYZJsdWgcENTB4qxVqhqRFnFV/+0btrVE9vH3aNVT4hvrQf4Wf4VfrioZkKAghvtRLwxtntaPQ==',key_name='tempest-MultiPortVlanTransparencyTest-405827926',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9680eb9addb04171b834f7dff8ec602d',ramdisk_id='',reservation_id='r-9nxovnxb',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MultiPortVlanTransparencyTest-179273341',owner_user_name='tempest-MultiPortVlanTransparencyTest-179273341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:48:55Z,user_data=None,user_id='10286fb05759476d8b51274239727a28',uuid=92bac713-364f-4aa1-a1e7-27829b917cf0,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "994fa7a8-2db9-494e-abe7-000d619a08bb", "address": "fa:16:3e:0a:67:cf", "network": {"id": "65753891-6165-49db-aedc-996760bb86fa", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap994fa7a8-2d", "ovs_interfaceid": "994fa7a8-2db9-494e-abe7-000d619a08bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.157 183087 DEBUG nova.network.os_vif_util [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Converting VIF {"id": "994fa7a8-2db9-494e-abe7-000d619a08bb", "address": "fa:16:3e:0a:67:cf", "network": {"id": "65753891-6165-49db-aedc-996760bb86fa", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap994fa7a8-2d", "ovs_interfaceid": "994fa7a8-2db9-494e-abe7-000d619a08bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.158 183087 DEBUG nova.network.os_vif_util [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0a:67:cf,bridge_name='br-int',has_traffic_filtering=True,id=994fa7a8-2db9-494e-abe7-000d619a08bb,network=Network(65753891-6165-49db-aedc-996760bb86fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap994fa7a8-2d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.159 183087 DEBUG os_vif [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:67:cf,bridge_name='br-int',has_traffic_filtering=True,id=994fa7a8-2db9-494e-abe7-000d619a08bb,network=Network(65753891-6165-49db-aedc-996760bb86fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap994fa7a8-2d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.161 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.162 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap994fa7a8-2d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.162 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.166 183087 INFO os_vif [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:67:cf,bridge_name='br-int',has_traffic_filtering=True,id=994fa7a8-2db9-494e-abe7-000d619a08bb,network=Network(65753891-6165-49db-aedc-996760bb86fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap994fa7a8-2d')
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.167 183087 DEBUG nova.virt.libvirt.vif [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:48:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='server-tempest-MultiPortVlanTransparencyTest-405827926-0',display_name='server-tempest-MultiPortVlanTransparencyTest-405827926-0',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='server-tempest-multiportvlantransparencytest-405827926-0',id=35,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLNC0niFIeKCfmMP15K31NI0Yt2/3arIz4qkOmJrDjDcT91Y+Z8t7cONdYZJsdWgcENTB4qxVqhqRFnFV/+0btrVE9vH3aNVT4hvrQf4Wf4VfrioZkKAghvtRLwxtntaPQ==',key_name='tempest-MultiPortVlanTransparencyTest-405827926',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9680eb9addb04171b834f7dff8ec602d',ramdisk_id='',reservation_id='r-9nxovnxb',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MultiPortVlanTransparencyTest-179273341',owner_user_name='tempest-MultiPortVlanTransparencyTest-179273341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:48:55Z,user_data=None,user_id='10286fb05759476d8b51274239727a28',uuid=92bac713-364f-4aa1-a1e7-27829b917cf0,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ff47496b-1060-4aba-9e1f-c22f684463e9", "address": "fa:16:3e:b0:47:83", "network": {"id": "5e2c9a2e-ae57-4c1b-a305-7111e9f126ac", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.132", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff47496b-10", "ovs_interfaceid": "ff47496b-1060-4aba-9e1f-c22f684463e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.167 183087 DEBUG nova.network.os_vif_util [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Converting VIF {"id": "ff47496b-1060-4aba-9e1f-c22f684463e9", "address": "fa:16:3e:b0:47:83", "network": {"id": "5e2c9a2e-ae57-4c1b-a305-7111e9f126ac", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.132", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff47496b-10", "ovs_interfaceid": "ff47496b-1060-4aba-9e1f-c22f684463e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.168 183087 DEBUG nova.network.os_vif_util [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b0:47:83,bridge_name='br-int',has_traffic_filtering=True,id=ff47496b-1060-4aba-9e1f-c22f684463e9,network=Network(5e2c9a2e-ae57-4c1b-a305-7111e9f126ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapff47496b-10') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.168 183087 DEBUG os_vif [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b0:47:83,bridge_name='br-int',has_traffic_filtering=True,id=ff47496b-1060-4aba-9e1f-c22f684463e9,network=Network(5e2c9a2e-ae57-4c1b-a305-7111e9f126ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapff47496b-10') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.169 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.170 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapff47496b-10, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.170 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.172 183087 INFO os_vif [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b0:47:83,bridge_name='br-int',has_traffic_filtering=True,id=ff47496b-1060-4aba-9e1f-c22f684463e9,network=Network(5e2c9a2e-ae57-4c1b-a305-7111e9f126ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapff47496b-10')
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.173 183087 INFO nova.virt.libvirt.driver [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Deleting instance files /var/lib/nova/instances/92bac713-364f-4aa1-a1e7-27829b917cf0_del
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.173 183087 INFO nova.virt.libvirt.driver [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Deletion of /var/lib/nova/instances/92bac713-364f-4aa1-a1e7-27829b917cf0_del complete
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.350 183087 INFO nova.compute.manager [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Took 0.20 seconds to destroy the instance on the hypervisor.
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.351 183087 DEBUG nova.compute.claims [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Aborting claim: <nova.compute.claims.Claim object at 0x7f6cc406ecd0> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.351 183087 DEBUG oslo_concurrency.lockutils [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.351 183087 DEBUG oslo_concurrency.lockutils [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.513 183087 DEBUG nova.compute.provider_tree [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.538 183087 DEBUG nova.scheduler.client.report [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.578 183087 DEBUG oslo_concurrency.lockutils [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.226s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.579 183087 DEBUG nova.compute.utils [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.580 183087 ERROR nova.compute.manager [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Build of instance 92bac713-364f-4aa1-a1e7-27829b917cf0 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format: nova.exception.BuildAbortException: Build of instance 92bac713-364f-4aa1-a1e7-27829b917cf0 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.581 183087 DEBUG nova.compute.manager [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.582 183087 DEBUG nova.virt.libvirt.vif [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-26T08:48:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='server-tempest-MultiPortVlanTransparencyTest-405827926-0',display_name='server-tempest-MultiPortVlanTransparencyTest-405827926-0',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host=None,hostname='server-tempest-multiportvlantransparencytest-405827926-0',id=35,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLNC0niFIeKCfmMP15K31NI0Yt2/3arIz4qkOmJrDjDcT91Y+Z8t7cONdYZJsdWgcENTB4qxVqhqRFnFV/+0btrVE9vH3aNVT4hvrQf4Wf4VfrioZkKAghvtRLwxtntaPQ==',key_name='tempest-MultiPortVlanTransparencyTest-405827926',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node=None,numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9680eb9addb04171b834f7dff8ec602d',ramdisk_id='',reservation_id='r-9nxovnxb',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MultiPortVlanTransparencyTest-179273341',owner_user_name='tempest-MultiPortVlanTransparencyTest-179273341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:49:07Z,user_data=None,user_id='10286fb05759476d8b51274239727a28',uuid=92bac713-364f-4aa1-a1e7-27829b917cf0,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "994fa7a8-2db9-494e-abe7-000d619a08bb", "address": "fa:16:3e:0a:67:cf", "network": {"id": "65753891-6165-49db-aedc-996760bb86fa", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap994fa7a8-2d", "ovs_interfaceid": "994fa7a8-2db9-494e-abe7-000d619a08bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.582 183087 DEBUG nova.network.os_vif_util [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Converting VIF {"id": "994fa7a8-2db9-494e-abe7-000d619a08bb", "address": "fa:16:3e:0a:67:cf", "network": {"id": "65753891-6165-49db-aedc-996760bb86fa", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap994fa7a8-2d", "ovs_interfaceid": "994fa7a8-2db9-494e-abe7-000d619a08bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.583 183087 DEBUG nova.network.os_vif_util [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0a:67:cf,bridge_name='br-int',has_traffic_filtering=True,id=994fa7a8-2db9-494e-abe7-000d619a08bb,network=Network(65753891-6165-49db-aedc-996760bb86fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap994fa7a8-2d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.583 183087 DEBUG os_vif [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:67:cf,bridge_name='br-int',has_traffic_filtering=True,id=994fa7a8-2db9-494e-abe7-000d619a08bb,network=Network(65753891-6165-49db-aedc-996760bb86fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap994fa7a8-2d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.585 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.585 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap994fa7a8-2d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.585 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.588 183087 INFO os_vif [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:67:cf,bridge_name='br-int',has_traffic_filtering=True,id=994fa7a8-2db9-494e-abe7-000d619a08bb,network=Network(65753891-6165-49db-aedc-996760bb86fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap994fa7a8-2d')
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.589 183087 DEBUG nova.virt.libvirt.vif [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-26T08:48:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='server-tempest-MultiPortVlanTransparencyTest-405827926-0',display_name='server-tempest-MultiPortVlanTransparencyTest-405827926-0',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host=None,hostname='server-tempest-multiportvlantransparencytest-405827926-0',id=35,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLNC0niFIeKCfmMP15K31NI0Yt2/3arIz4qkOmJrDjDcT91Y+Z8t7cONdYZJsdWgcENTB4qxVqhqRFnFV/+0btrVE9vH3aNVT4hvrQf4Wf4VfrioZkKAghvtRLwxtntaPQ==',key_name='tempest-MultiPortVlanTransparencyTest-405827926',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node=None,numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9680eb9addb04171b834f7dff8ec602d',ramdisk_id='',reservation_id='r-9nxovnxb',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MultiPortVlanTransparencyTest-179273341',owner_user_name='tempest-MultiPortVlanTransparencyTest-179273341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:49:07Z,user_data=None,user_id='10286fb05759476d8b51274239727a28',uuid=92bac713-364f-4aa1-a1e7-27829b917cf0,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ff47496b-1060-4aba-9e1f-c22f684463e9", "address": "fa:16:3e:b0:47:83", "network": {"id": "5e2c9a2e-ae57-4c1b-a305-7111e9f126ac", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.132", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff47496b-10", "ovs_interfaceid": "ff47496b-1060-4aba-9e1f-c22f684463e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.589 183087 DEBUG nova.network.os_vif_util [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Converting VIF {"id": "ff47496b-1060-4aba-9e1f-c22f684463e9", "address": "fa:16:3e:b0:47:83", "network": {"id": "5e2c9a2e-ae57-4c1b-a305-7111e9f126ac", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.132", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff47496b-10", "ovs_interfaceid": "ff47496b-1060-4aba-9e1f-c22f684463e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.590 183087 DEBUG nova.network.os_vif_util [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b0:47:83,bridge_name='br-int',has_traffic_filtering=True,id=ff47496b-1060-4aba-9e1f-c22f684463e9,network=Network(5e2c9a2e-ae57-4c1b-a305-7111e9f126ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapff47496b-10') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.590 183087 DEBUG os_vif [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b0:47:83,bridge_name='br-int',has_traffic_filtering=True,id=ff47496b-1060-4aba-9e1f-c22f684463e9,network=Network(5e2c9a2e-ae57-4c1b-a305-7111e9f126ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapff47496b-10') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.591 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.592 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapff47496b-10, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.592 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.594 183087 INFO os_vif [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b0:47:83,bridge_name='br-int',has_traffic_filtering=True,id=ff47496b-1060-4aba-9e1f-c22f684463e9,network=Network(5e2c9a2e-ae57-4c1b-a305-7111e9f126ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapff47496b-10')
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.594 183087 DEBUG nova.compute.manager [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.594 183087 DEBUG nova.compute.manager [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:49:07 compute-1 nova_compute[183083]: 2026-01-26 08:49:07.595 183087 DEBUG nova.network.neutron [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:49:07 compute-1 sshd-session[215514]: Accepted publickey for zuul from 38.102.83.66 port 34678 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 08:49:08 compute-1 systemd-logind[788]: New session 38 of user zuul.
Jan 26 08:49:08 compute-1 systemd[1]: Started Session 38 of User zuul.
Jan 26 08:49:08 compute-1 sshd-session[215514]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 08:49:08 compute-1 podman[215517]: 2026-01-26 08:49:08.143168581 +0000 UTC m=+0.087579208 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 26 08:49:08 compute-1 sshd-session[215518]: Connection closed by 38.102.83.66 port 34678
Jan 26 08:49:08 compute-1 sshd-session[215514]: pam_unix(sshd:session): session closed for user zuul
Jan 26 08:49:08 compute-1 systemd[1]: session-38.scope: Deactivated successfully.
Jan 26 08:49:08 compute-1 systemd-logind[788]: Session 38 logged out. Waiting for processes to exit.
Jan 26 08:49:08 compute-1 systemd-logind[788]: Removed session 38.
Jan 26 08:49:09 compute-1 nova_compute[183083]: 2026-01-26 08:49:09.652 183087 DEBUG nova.network.neutron [req-3526cfef-4890-48af-afb4-9775e5cc432c req-1380dd55-d242-4c2a-addc-0f5c5d7e77dc 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Updated VIF entry in instance network info cache for port ff47496b-1060-4aba-9e1f-c22f684463e9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:49:09 compute-1 nova_compute[183083]: 2026-01-26 08:49:09.653 183087 DEBUG nova.network.neutron [req-3526cfef-4890-48af-afb4-9775e5cc432c req-1380dd55-d242-4c2a-addc-0f5c5d7e77dc 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Updating instance_info_cache with network_info: [{"id": "994fa7a8-2db9-494e-abe7-000d619a08bb", "address": "fa:16:3e:0a:67:cf", "network": {"id": "65753891-6165-49db-aedc-996760bb86fa", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap994fa7a8-2d", "ovs_interfaceid": "994fa7a8-2db9-494e-abe7-000d619a08bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "ff47496b-1060-4aba-9e1f-c22f684463e9", "address": "fa:16:3e:b0:47:83", "network": {"id": "5e2c9a2e-ae57-4c1b-a305-7111e9f126ac", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-405827926", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.132", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": 
"9680eb9addb04171b834f7dff8ec602d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff47496b-10", "ovs_interfaceid": "ff47496b-1060-4aba-9e1f-c22f684463e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:49:09 compute-1 nova_compute[183083]: 2026-01-26 08:49:09.672 183087 DEBUG oslo_concurrency.lockutils [req-3526cfef-4890-48af-afb4-9775e5cc432c req-1380dd55-d242-4c2a-addc-0f5c5d7e77dc 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-92bac713-364f-4aa1-a1e7-27829b917cf0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:49:10 compute-1 nova_compute[183083]: 2026-01-26 08:49:10.297 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:49:11 compute-1 nova_compute[183083]: 2026-01-26 08:49:11.048 183087 DEBUG nova.network.neutron [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:49:11 compute-1 nova_compute[183083]: 2026-01-26 08:49:11.072 183087 INFO nova.compute.manager [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] [instance: 92bac713-364f-4aa1-a1e7-27829b917cf0] Took 3.48 seconds to deallocate network for instance.
Jan 26 08:49:11 compute-1 nova_compute[183083]: 2026-01-26 08:49:11.161 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:49:11 compute-1 nova_compute[183083]: 2026-01-26 08:49:11.491 183087 INFO nova.scheduler.client.report [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Deleted allocations for instance 92bac713-364f-4aa1-a1e7-27829b917cf0
Jan 26 08:49:11 compute-1 nova_compute[183083]: 2026-01-26 08:49:11.491 183087 DEBUG oslo_concurrency.lockutils [None req-2fae9dbc-b6b7-4f11-aefd-f621ccb7a910 10286fb05759476d8b51274239727a28 9680eb9addb04171b834f7dff8ec602d - - default default] Lock "92bac713-364f-4aa1-a1e7-27829b917cf0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.956s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:49:12 compute-1 nova_compute[183083]: 2026-01-26 08:49:12.223 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:49:12 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:49:12.224 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:49:12 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:49:12.226 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 08:49:13 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:49:13.230 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:49:15 compute-1 nova_compute[183083]: 2026-01-26 08:49:15.299 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:49:16 compute-1 nova_compute[183083]: 2026-01-26 08:49:16.164 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:49:20 compute-1 nova_compute[183083]: 2026-01-26 08:49:20.301 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:49:21 compute-1 nova_compute[183083]: 2026-01-26 08:49:21.165 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:49:22 compute-1 podman[215566]: 2026-01-26 08:49:22.814474497 +0000 UTC m=+0.076582859 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-type=git, architecture=x86_64, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 08:49:22 compute-1 podman[215565]: 2026-01-26 08:49:22.816045351 +0000 UTC m=+0.076102225 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 26 08:49:25 compute-1 nova_compute[183083]: 2026-01-26 08:49:25.347 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:49:26 compute-1 nova_compute[183083]: 2026-01-26 08:49:26.168 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:49:28 compute-1 podman[215603]: 2026-01-26 08:49:28.796356009 +0000 UTC m=+0.053797450 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 26 08:49:28 compute-1 podman[215604]: 2026-01-26 08:49:28.826876695 +0000 UTC m=+0.084082310 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 08:49:28 compute-1 podman[215602]: 2026-01-26 08:49:28.861898447 +0000 UTC m=+0.122454506 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 08:49:30 compute-1 nova_compute[183083]: 2026-01-26 08:49:30.388 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:49:31 compute-1 nova_compute[183083]: 2026-01-26 08:49:31.170 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:49:35 compute-1 nova_compute[183083]: 2026-01-26 08:49:35.435 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:49:36 compute-1 nova_compute[183083]: 2026-01-26 08:49:36.171 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:49:38 compute-1 podman[215664]: 2026-01-26 08:49:38.793731722 +0000 UTC m=+0.061066944 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 26 08:49:40 compute-1 nova_compute[183083]: 2026-01-26 08:49:40.436 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:49:41 compute-1 nova_compute[183083]: 2026-01-26 08:49:41.173 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:49:45 compute-1 nova_compute[183083]: 2026-01-26 08:49:45.439 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:49:46 compute-1 nova_compute[183083]: 2026-01-26 08:49:46.209 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:49:47 compute-1 sshd-session[215688]: Connection closed by authenticating user root 159.223.236.81 port 36088 [preauth]
Jan 26 08:49:50 compute-1 nova_compute[183083]: 2026-01-26 08:49:50.496 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:49:50 compute-1 nova_compute[183083]: 2026-01-26 08:49:50.676 183087 DEBUG oslo_concurrency.lockutils [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] Acquiring lock "67099b59-9ad1-4727-9740-5e77bf568713" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:49:50 compute-1 nova_compute[183083]: 2026-01-26 08:49:50.676 183087 DEBUG oslo_concurrency.lockutils [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] Lock "67099b59-9ad1-4727-9740-5e77bf568713" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:49:50 compute-1 nova_compute[183083]: 2026-01-26 08:49:50.694 183087 DEBUG nova.compute.manager [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:49:50 compute-1 nova_compute[183083]: 2026-01-26 08:49:50.771 183087 DEBUG oslo_concurrency.lockutils [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:49:50 compute-1 nova_compute[183083]: 2026-01-26 08:49:50.772 183087 DEBUG oslo_concurrency.lockutils [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:49:50 compute-1 nova_compute[183083]: 2026-01-26 08:49:50.784 183087 DEBUG nova.virt.hardware [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:49:50 compute-1 nova_compute[183083]: 2026-01-26 08:49:50.785 183087 INFO nova.compute.claims [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:49:50 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:49:50.909 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:49:50 compute-1 nova_compute[183083]: 2026-01-26 08:49:50.910 183087 DEBUG nova.compute.provider_tree [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:49:50 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:49:50.911 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 08:49:50 compute-1 nova_compute[183083]: 2026-01-26 08:49:50.913 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:49:50 compute-1 nova_compute[183083]: 2026-01-26 08:49:50.928 183087 DEBUG nova.scheduler.client.report [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:49:50 compute-1 nova_compute[183083]: 2026-01-26 08:49:50.953 183087 DEBUG oslo_concurrency.lockutils [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:49:50 compute-1 nova_compute[183083]: 2026-01-26 08:49:50.953 183087 DEBUG nova.compute.manager [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:49:51 compute-1 nova_compute[183083]: 2026-01-26 08:49:50.999 183087 DEBUG nova.compute.manager [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:49:51 compute-1 nova_compute[183083]: 2026-01-26 08:49:51.000 183087 DEBUG nova.network.neutron [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:49:51 compute-1 nova_compute[183083]: 2026-01-26 08:49:51.026 183087 INFO nova.virt.libvirt.driver [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:49:51 compute-1 nova_compute[183083]: 2026-01-26 08:49:51.051 183087 DEBUG nova.compute.manager [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:49:51 compute-1 nova_compute[183083]: 2026-01-26 08:49:51.136 183087 DEBUG nova.compute.manager [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:49:51 compute-1 nova_compute[183083]: 2026-01-26 08:49:51.137 183087 DEBUG nova.virt.libvirt.driver [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:49:51 compute-1 nova_compute[183083]: 2026-01-26 08:49:51.137 183087 INFO nova.virt.libvirt.driver [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Creating image(s)
Jan 26 08:49:51 compute-1 nova_compute[183083]: 2026-01-26 08:49:51.138 183087 DEBUG oslo_concurrency.lockutils [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] Acquiring lock "/var/lib/nova/instances/67099b59-9ad1-4727-9740-5e77bf568713/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:49:51 compute-1 nova_compute[183083]: 2026-01-26 08:49:51.138 183087 DEBUG oslo_concurrency.lockutils [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] Lock "/var/lib/nova/instances/67099b59-9ad1-4727-9740-5e77bf568713/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:49:51 compute-1 nova_compute[183083]: 2026-01-26 08:49:51.138 183087 DEBUG oslo_concurrency.lockutils [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] Lock "/var/lib/nova/instances/67099b59-9ad1-4727-9740-5e77bf568713/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:49:51 compute-1 nova_compute[183083]: 2026-01-26 08:49:51.139 183087 DEBUG oslo_concurrency.lockutils [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:49:51 compute-1 nova_compute[183083]: 2026-01-26 08:49:51.139 183087 DEBUG oslo_concurrency.lockutils [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:49:51 compute-1 nova_compute[183083]: 2026-01-26 08:49:51.211 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.047 183087 DEBUG nova.policy [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '340c37bdf64a4d1d9b456d4cc9ee67f1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c7cffe7bd6aa4f77a47fbcb1a2ff7ac8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.672 183087 DEBUG oslo_concurrency.lockutils [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.533s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Instance failed to spawn: nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Traceback (most recent call last):
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 170, in do_image_deep_inspection
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713]     raise exception.ImageUnacceptable(
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image content does not match disk_format
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713] 
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713] During handling of the above exception, another exception occurred:
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713] 
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Traceback (most recent call last):
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713]     yield resources
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713]     self.driver.spawn(context, instance, image_meta,
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4384, in spawn
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713]     created_instance_dir, created_disks = self._create_image(
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4786, in _create_image
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713]     created_disks = self._create_and_inject_local_root(
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4917, in _create_and_inject_local_root
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713]     self._try_fetch_image_cache(backend, fetch_func, context,
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 10963, in _try_fetch_image_cache
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713]     image.cache(fetch_func=fetch_func,
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 288, in cache
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713]     self.create_image(fetch_func_sync, base, size,
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 663, in create_image
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713]     prepare_template(target=base, *args, **kwargs)
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713]   File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713]     return f(*args, **kwargs)
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 285, in fetch_func_sync
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713]     fetch_func(target=target, *args, **kwargs)
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/utils.py", line 482, in fetch_image
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713]     images.fetch_to_raw(context, image_id, target, trusted_certs)
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 199, in fetch_to_raw
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713]     force_format = do_image_deep_inspection(img, image_href, path_tmp)
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 180, in do_image_deep_inspection
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713]     raise exception.ImageUnacceptable(
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:49:52 compute-1 nova_compute[183083]: 2026-01-26 08:49:52.673 183087 ERROR nova.compute.manager [instance: 67099b59-9ad1-4727-9740-5e77bf568713] 
Jan 26 08:49:53 compute-1 podman[215690]: 2026-01-26 08:49:53.821942221 +0000 UTC m=+0.081430235 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 08:49:53 compute-1 podman[215691]: 2026-01-26 08:49:53.838955188 +0000 UTC m=+0.091293262 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.buildah.version=1.33.7, vcs-type=git, release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, architecture=x86_64, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter)
Jan 26 08:49:55 compute-1 nova_compute[183083]: 2026-01-26 08:49:55.498 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:49:55 compute-1 nova_compute[183083]: 2026-01-26 08:49:55.740 183087 DEBUG nova.network.neutron [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Successfully updated port: 57658b15-8c84-4ffc-9fe8-85ab62317c09 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:49:55 compute-1 nova_compute[183083]: 2026-01-26 08:49:55.754 183087 DEBUG oslo_concurrency.lockutils [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] Acquiring lock "refresh_cache-67099b59-9ad1-4727-9740-5e77bf568713" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:49:55 compute-1 nova_compute[183083]: 2026-01-26 08:49:55.755 183087 DEBUG oslo_concurrency.lockutils [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] Acquired lock "refresh_cache-67099b59-9ad1-4727-9740-5e77bf568713" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:49:55 compute-1 nova_compute[183083]: 2026-01-26 08:49:55.755 183087 DEBUG nova.network.neutron [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:49:55 compute-1 nova_compute[183083]: 2026-01-26 08:49:55.952 183087 DEBUG nova.compute.manager [req-4e93008b-1f74-4e3d-9110-54af281c5040 req-4122670d-ecb1-44b1-b0bd-5fae1eae3335 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Received event network-changed-57658b15-8c84-4ffc-9fe8-85ab62317c09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:49:55 compute-1 nova_compute[183083]: 2026-01-26 08:49:55.952 183087 DEBUG nova.compute.manager [req-4e93008b-1f74-4e3d-9110-54af281c5040 req-4122670d-ecb1-44b1-b0bd-5fae1eae3335 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Refreshing instance network info cache due to event network-changed-57658b15-8c84-4ffc-9fe8-85ab62317c09. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:49:55 compute-1 nova_compute[183083]: 2026-01-26 08:49:55.953 183087 DEBUG oslo_concurrency.lockutils [req-4e93008b-1f74-4e3d-9110-54af281c5040 req-4122670d-ecb1-44b1-b0bd-5fae1eae3335 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-67099b59-9ad1-4727-9740-5e77bf568713" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:49:56 compute-1 nova_compute[183083]: 2026-01-26 08:49:56.250 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:49:56 compute-1 nova_compute[183083]: 2026-01-26 08:49:56.417 183087 DEBUG nova.network.neutron [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.676 183087 DEBUG nova.network.neutron [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Updating instance_info_cache with network_info: [{"id": "57658b15-8c84-4ffc-9fe8-85ab62317c09", "address": "fa:16:3e:4a:85:29", "network": {"id": "77252c98-ab3c-49c9-bdb9-d2063f751d4d", "bridge": "br-int", "label": "tempest-test-network--375174067", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.86", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7cffe7bd6aa4f77a47fbcb1a2ff7ac8", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57658b15-8c", "ovs_interfaceid": "57658b15-8c84-4ffc-9fe8-85ab62317c09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.698 183087 DEBUG oslo_concurrency.lockutils [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] Releasing lock "refresh_cache-67099b59-9ad1-4727-9740-5e77bf568713" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.699 183087 DEBUG nova.compute.manager [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Instance network_info: |[{"id": "57658b15-8c84-4ffc-9fe8-85ab62317c09", "address": "fa:16:3e:4a:85:29", "network": {"id": "77252c98-ab3c-49c9-bdb9-d2063f751d4d", "bridge": "br-int", "label": "tempest-test-network--375174067", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.86", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7cffe7bd6aa4f77a47fbcb1a2ff7ac8", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57658b15-8c", "ovs_interfaceid": "57658b15-8c84-4ffc-9fe8-85ab62317c09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.699 183087 DEBUG oslo_concurrency.lockutils [req-4e93008b-1f74-4e3d-9110-54af281c5040 req-4122670d-ecb1-44b1-b0bd-5fae1eae3335 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-67099b59-9ad1-4727-9740-5e77bf568713" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.700 183087 DEBUG nova.network.neutron [req-4e93008b-1f74-4e3d-9110-54af281c5040 req-4122670d-ecb1-44b1-b0bd-5fae1eae3335 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Refreshing network info cache for port 57658b15-8c84-4ffc-9fe8-85ab62317c09 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.701 183087 INFO nova.compute.manager [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Terminating instance
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.703 183087 DEBUG nova.compute.manager [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.707 183087 DEBUG nova.virt.libvirt.driver [-] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.707 183087 INFO nova.virt.libvirt.driver [-] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Instance destroyed successfully.
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.708 183087 DEBUG nova.virt.libvirt.vif [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:49:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='vm1',display_name='vm1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='vm1',id=36,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMMetvTa7QluDHKvqmZef9/cWIFtddyFsPSAEtPFHT7XUlqXbjGCY4eig6GLH0R/j6fx2uu4KfcsSCJdPuD9KoGKQIQi8JkIeebw/TH+NvGSQSLgTlW9E/Yv3lHGM6kFQg==',key_name='tempest-keypair-test-1329149192',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c7cffe7bd6aa4f77a47fbcb1a2ff7ac8',ramdisk_id='',reservation_id='r-dprh36ms',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_speci
fied.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VrrpTest-1965160934',owner_user_name='tempest-VrrpTest-1965160934-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:49:51Z,user_data=None,user_id='340c37bdf64a4d1d9b456d4cc9ee67f1',uuid=67099b59-9ad1-4727-9740-5e77bf568713,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "57658b15-8c84-4ffc-9fe8-85ab62317c09", "address": "fa:16:3e:4a:85:29", "network": {"id": "77252c98-ab3c-49c9-bdb9-d2063f751d4d", "bridge": "br-int", "label": "tempest-test-network--375174067", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.86", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7cffe7bd6aa4f77a47fbcb1a2ff7ac8", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57658b15-8c", "ovs_interfaceid": "57658b15-8c84-4ffc-9fe8-85ab62317c09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.708 183087 DEBUG nova.network.os_vif_util [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] Converting VIF {"id": "57658b15-8c84-4ffc-9fe8-85ab62317c09", "address": "fa:16:3e:4a:85:29", "network": {"id": "77252c98-ab3c-49c9-bdb9-d2063f751d4d", "bridge": "br-int", "label": "tempest-test-network--375174067", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.86", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7cffe7bd6aa4f77a47fbcb1a2ff7ac8", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57658b15-8c", "ovs_interfaceid": "57658b15-8c84-4ffc-9fe8-85ab62317c09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.709 183087 DEBUG nova.network.os_vif_util [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:85:29,bridge_name='br-int',has_traffic_filtering=True,id=57658b15-8c84-4ffc-9fe8-85ab62317c09,network=Network(77252c98-ab3c-49c9-bdb9-d2063f751d4d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap57658b15-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.709 183087 DEBUG os_vif [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:85:29,bridge_name='br-int',has_traffic_filtering=True,id=57658b15-8c84-4ffc-9fe8-85ab62317c09,network=Network(77252c98-ab3c-49c9-bdb9-d2063f751d4d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap57658b15-8c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.711 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.711 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap57658b15-8c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.711 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.717 183087 INFO os_vif [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:85:29,bridge_name='br-int',has_traffic_filtering=True,id=57658b15-8c84-4ffc-9fe8-85ab62317c09,network=Network(77252c98-ab3c-49c9-bdb9-d2063f751d4d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap57658b15-8c')
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.718 183087 INFO nova.virt.libvirt.driver [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Deleting instance files /var/lib/nova/instances/67099b59-9ad1-4727-9740-5e77bf568713_del
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.718 183087 INFO nova.virt.libvirt.driver [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Deletion of /var/lib/nova/instances/67099b59-9ad1-4727-9740-5e77bf568713_del complete
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.768 183087 INFO nova.compute.manager [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Took 0.07 seconds to destroy the instance on the hypervisor.
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.770 183087 DEBUG nova.compute.claims [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Aborting claim: <nova.compute.claims.Claim object at 0x7f6cb8716d60> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.770 183087 DEBUG oslo_concurrency.lockutils [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.771 183087 DEBUG oslo_concurrency.lockutils [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.882 183087 DEBUG nova.compute.provider_tree [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.902 183087 DEBUG nova.scheduler.client.report [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.933 183087 DEBUG oslo_concurrency.lockutils [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.934 183087 DEBUG nova.compute.utils [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.935 183087 ERROR nova.compute.manager [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Build of instance 67099b59-9ad1-4727-9740-5e77bf568713 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format: nova.exception.BuildAbortException: Build of instance 67099b59-9ad1-4727-9740-5e77bf568713 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.936 183087 DEBUG nova.compute.manager [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.937 183087 DEBUG nova.virt.libvirt.vif [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-26T08:49:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='vm1',display_name='vm1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host=None,hostname='vm1',id=36,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMMetvTa7QluDHKvqmZef9/cWIFtddyFsPSAEtPFHT7XUlqXbjGCY4eig6GLH0R/j6fx2uu4KfcsSCJdPuD9KoGKQIQi8JkIeebw/TH+NvGSQSLgTlW9E/Yv3lHGM6kFQg==',key_name='tempest-keypair-test-1329149192',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node=None,numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c7cffe7bd6aa4f77a47fbcb1a2ff7ac8',ramdisk_id='',reservation_id='r-dprh36ms',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_spec
ified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VrrpTest-1965160934',owner_user_name='tempest-VrrpTest-1965160934-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:49:58Z,user_data=None,user_id='340c37bdf64a4d1d9b456d4cc9ee67f1',uuid=67099b59-9ad1-4727-9740-5e77bf568713,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "57658b15-8c84-4ffc-9fe8-85ab62317c09", "address": "fa:16:3e:4a:85:29", "network": {"id": "77252c98-ab3c-49c9-bdb9-d2063f751d4d", "bridge": "br-int", "label": "tempest-test-network--375174067", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.86", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7cffe7bd6aa4f77a47fbcb1a2ff7ac8", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57658b15-8c", "ovs_interfaceid": "57658b15-8c84-4ffc-9fe8-85ab62317c09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.938 183087 DEBUG nova.network.os_vif_util [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] Converting VIF {"id": "57658b15-8c84-4ffc-9fe8-85ab62317c09", "address": "fa:16:3e:4a:85:29", "network": {"id": "77252c98-ab3c-49c9-bdb9-d2063f751d4d", "bridge": "br-int", "label": "tempest-test-network--375174067", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.86", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7cffe7bd6aa4f77a47fbcb1a2ff7ac8", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57658b15-8c", "ovs_interfaceid": "57658b15-8c84-4ffc-9fe8-85ab62317c09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.939 183087 DEBUG nova.network.os_vif_util [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:85:29,bridge_name='br-int',has_traffic_filtering=True,id=57658b15-8c84-4ffc-9fe8-85ab62317c09,network=Network(77252c98-ab3c-49c9-bdb9-d2063f751d4d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap57658b15-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.940 183087 DEBUG os_vif [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:85:29,bridge_name='br-int',has_traffic_filtering=True,id=57658b15-8c84-4ffc-9fe8-85ab62317c09,network=Network(77252c98-ab3c-49c9-bdb9-d2063f751d4d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap57658b15-8c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.942 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.942 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap57658b15-8c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.943 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.946 183087 INFO os_vif [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:85:29,bridge_name='br-int',has_traffic_filtering=True,id=57658b15-8c84-4ffc-9fe8-85ab62317c09,network=Network(77252c98-ab3c-49c9-bdb9-d2063f751d4d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap57658b15-8c')
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.947 183087 DEBUG nova.compute.manager [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.947 183087 DEBUG nova.compute.manager [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:49:58 compute-1 nova_compute[183083]: 2026-01-26 08:49:58.948 183087 DEBUG nova.network.neutron [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:49:59 compute-1 podman[215730]: 2026-01-26 08:49:59.801046787 +0000 UTC m=+0.057957146 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 08:49:59 compute-1 podman[215729]: 2026-01-26 08:49:59.825119682 +0000 UTC m=+0.078495222 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 08:49:59 compute-1 podman[215728]: 2026-01-26 08:49:59.854504997 +0000 UTC m=+0.114590166 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 08:50:00 compute-1 nova_compute[183083]: 2026-01-26 08:50:00.184 183087 DEBUG nova.network.neutron [req-4e93008b-1f74-4e3d-9110-54af281c5040 req-4122670d-ecb1-44b1-b0bd-5fae1eae3335 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Updated VIF entry in instance network info cache for port 57658b15-8c84-4ffc-9fe8-85ab62317c09. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:50:00 compute-1 nova_compute[183083]: 2026-01-26 08:50:00.185 183087 DEBUG nova.network.neutron [req-4e93008b-1f74-4e3d-9110-54af281c5040 req-4122670d-ecb1-44b1-b0bd-5fae1eae3335 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Updating instance_info_cache with network_info: [{"id": "57658b15-8c84-4ffc-9fe8-85ab62317c09", "address": "fa:16:3e:4a:85:29", "network": {"id": "77252c98-ab3c-49c9-bdb9-d2063f751d4d", "bridge": "br-int", "label": "tempest-test-network--375174067", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.86", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7cffe7bd6aa4f77a47fbcb1a2ff7ac8", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57658b15-8c", "ovs_interfaceid": "57658b15-8c84-4ffc-9fe8-85ab62317c09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:50:00 compute-1 nova_compute[183083]: 2026-01-26 08:50:00.209 183087 DEBUG oslo_concurrency.lockutils [req-4e93008b-1f74-4e3d-9110-54af281c5040 req-4122670d-ecb1-44b1-b0bd-5fae1eae3335 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-67099b59-9ad1-4727-9740-5e77bf568713" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:50:00 compute-1 nova_compute[183083]: 2026-01-26 08:50:00.423 183087 DEBUG nova.network.neutron [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:50:00 compute-1 nova_compute[183083]: 2026-01-26 08:50:00.450 183087 INFO nova.compute.manager [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] [instance: 67099b59-9ad1-4727-9740-5e77bf568713] Took 1.50 seconds to deallocate network for instance.
Jan 26 08:50:00 compute-1 nova_compute[183083]: 2026-01-26 08:50:00.551 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:00.913 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:50:01 compute-1 nova_compute[183083]: 2026-01-26 08:50:01.252 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:01 compute-1 nova_compute[183083]: 2026-01-26 08:50:01.363 183087 INFO nova.scheduler.client.report [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] Deleted allocations for instance 67099b59-9ad1-4727-9740-5e77bf568713
Jan 26 08:50:01 compute-1 nova_compute[183083]: 2026-01-26 08:50:01.363 183087 DEBUG oslo_concurrency.lockutils [None req-9ceb071f-416f-4426-b90a-117fb4359742 340c37bdf64a4d1d9b456d4cc9ee67f1 c7cffe7bd6aa4f77a47fbcb1a2ff7ac8 - - default default] Lock "67099b59-9ad1-4727-9740-5e77bf568713" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.687s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:50:01 compute-1 nova_compute[183083]: 2026-01-26 08:50:01.482 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:50:01 compute-1 nova_compute[183083]: 2026-01-26 08:50:01.482 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 08:50:01 compute-1 nova_compute[183083]: 2026-01-26 08:50:01.483 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 08:50:01 compute-1 nova_compute[183083]: 2026-01-26 08:50:01.497 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 08:50:01 compute-1 nova_compute[183083]: 2026-01-26 08:50:01.499 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:50:01 compute-1 nova_compute[183083]: 2026-01-26 08:50:01.499 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:50:02 compute-1 nova_compute[183083]: 2026-01-26 08:50:02.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:50:02 compute-1 nova_compute[183083]: 2026-01-26 08:50:02.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:50:03 compute-1 ovn_controller[95352]: 2026-01-26T08:50:03Z|00173|pinctrl|WARN|Dropped 4167 log messages in last 60 seconds (most recently, 0 seconds ago) due to excessive rate
Jan 26 08:50:03 compute-1 ovn_controller[95352]: 2026-01-26T08:50:03Z|00174|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 08:50:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:50:03.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:50:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:50:03.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:50:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:50:03.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:50:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:50:03.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:50:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:50:03.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:50:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:50:03.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:50:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:50:03.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:50:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:50:03.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:50:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:50:03.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:50:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:50:03.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:50:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:50:03.742 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:50:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:50:03.742 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:50:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:50:03.742 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:50:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:50:03.742 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:50:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:50:03.742 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:50:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:50:03.742 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:50:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:50:03.743 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:50:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:50:03.743 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:50:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:50:03.743 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:50:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:50:03.743 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:50:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:50:03.743 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:50:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:50:03.743 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:50:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:50:03.744 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:50:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:50:03.744 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:50:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:50:03.744 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:50:03 compute-1 ovn_controller[95352]: 2026-01-26T08:50:03Z|00175|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 26 08:50:04 compute-1 nova_compute[183083]: 2026-01-26 08:50:04.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:50:04 compute-1 nova_compute[183083]: 2026-01-26 08:50:04.975 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:50:04 compute-1 nova_compute[183083]: 2026-01-26 08:50:04.977 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:50:04 compute-1 nova_compute[183083]: 2026-01-26 08:50:04.977 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:50:04 compute-1 nova_compute[183083]: 2026-01-26 08:50:04.977 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 08:50:05 compute-1 nova_compute[183083]: 2026-01-26 08:50:05.193 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:50:05 compute-1 nova_compute[183083]: 2026-01-26 08:50:05.195 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13784MB free_disk=113.09871673583984GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 08:50:05 compute-1 nova_compute[183083]: 2026-01-26 08:50:05.195 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:50:05 compute-1 nova_compute[183083]: 2026-01-26 08:50:05.196 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:50:05 compute-1 nova_compute[183083]: 2026-01-26 08:50:05.263 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 08:50:05 compute-1 nova_compute[183083]: 2026-01-26 08:50:05.263 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 08:50:05 compute-1 nova_compute[183083]: 2026-01-26 08:50:05.278 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Refreshing inventories for resource provider 5203935e-446c-4e03-93fa-4c60d651e045 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 26 08:50:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:05.301 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:50:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:05.302 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:50:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:05.302 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:50:05 compute-1 nova_compute[183083]: 2026-01-26 08:50:05.314 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Updating ProviderTree inventory for provider 5203935e-446c-4e03-93fa-4c60d651e045 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 26 08:50:05 compute-1 nova_compute[183083]: 2026-01-26 08:50:05.314 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Updating inventory in ProviderTree for provider 5203935e-446c-4e03-93fa-4c60d651e045 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 26 08:50:05 compute-1 nova_compute[183083]: 2026-01-26 08:50:05.328 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Refreshing aggregate associations for resource provider 5203935e-446c-4e03-93fa-4c60d651e045, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 26 08:50:05 compute-1 nova_compute[183083]: 2026-01-26 08:50:05.367 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Refreshing trait associations for resource provider 5203935e-446c-4e03-93fa-4c60d651e045, traits: COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE4A,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SVM,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AVX,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX2,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_FMA3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,HW_CPU_X86_SSE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE41,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_BMI,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AESNI,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 26 08:50:05 compute-1 nova_compute[183083]: 2026-01-26 08:50:05.391 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:50:05 compute-1 nova_compute[183083]: 2026-01-26 08:50:05.415 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:50:05 compute-1 nova_compute[183083]: 2026-01-26 08:50:05.446 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 08:50:05 compute-1 nova_compute[183083]: 2026-01-26 08:50:05.446 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.251s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:50:05 compute-1 nova_compute[183083]: 2026-01-26 08:50:05.606 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:06 compute-1 nova_compute[183083]: 2026-01-26 08:50:06.254 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:06 compute-1 nova_compute[183083]: 2026-01-26 08:50:06.447 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:50:06 compute-1 nova_compute[183083]: 2026-01-26 08:50:06.448 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:50:06 compute-1 nova_compute[183083]: 2026-01-26 08:50:06.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:50:06 compute-1 nova_compute[183083]: 2026-01-26 08:50:06.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 08:50:07 compute-1 nova_compute[183083]: 2026-01-26 08:50:07.974 183087 DEBUG oslo_concurrency.lockutils [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "b78b8713-97b3-4628-bfbc-a1069866c0b1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:50:07 compute-1 nova_compute[183083]: 2026-01-26 08:50:07.975 183087 DEBUG oslo_concurrency.lockutils [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "b78b8713-97b3-4628-bfbc-a1069866c0b1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.001 183087 DEBUG nova.compute.manager [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.084 183087 DEBUG oslo_concurrency.lockutils [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.084 183087 DEBUG oslo_concurrency.lockutils [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.093 183087 DEBUG nova.virt.hardware [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.093 183087 INFO nova.compute.claims [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.206 183087 DEBUG nova.compute.provider_tree [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.220 183087 DEBUG nova.scheduler.client.report [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.245 183087 DEBUG oslo_concurrency.lockutils [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.246 183087 DEBUG nova.compute.manager [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.294 183087 DEBUG nova.compute.manager [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.295 183087 DEBUG nova.network.neutron [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.315 183087 INFO nova.virt.libvirt.driver [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.334 183087 DEBUG nova.compute.manager [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.470 183087 DEBUG nova.compute.manager [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.472 183087 DEBUG nova.virt.libvirt.driver [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.473 183087 INFO nova.virt.libvirt.driver [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Creating image(s)
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.474 183087 DEBUG oslo_concurrency.lockutils [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "/var/lib/nova/instances/b78b8713-97b3-4628-bfbc-a1069866c0b1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.474 183087 DEBUG oslo_concurrency.lockutils [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "/var/lib/nova/instances/b78b8713-97b3-4628-bfbc-a1069866c0b1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.475 183087 DEBUG oslo_concurrency.lockutils [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "/var/lib/nova/instances/b78b8713-97b3-4628-bfbc-a1069866c0b1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.499 183087 DEBUG oslo_concurrency.processutils [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.588 183087 DEBUG oslo_concurrency.processutils [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.591 183087 DEBUG oslo_concurrency.lockutils [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.592 183087 DEBUG oslo_concurrency.lockutils [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.614 183087 DEBUG oslo_concurrency.processutils [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.682 183087 DEBUG oslo_concurrency.processutils [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.684 183087 DEBUG oslo_concurrency.processutils [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/b78b8713-97b3-4628-bfbc-a1069866c0b1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.720 183087 DEBUG oslo_concurrency.processutils [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/b78b8713-97b3-4628-bfbc-a1069866c0b1/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.722 183087 DEBUG oslo_concurrency.lockutils [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.723 183087 DEBUG oslo_concurrency.processutils [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.783 183087 DEBUG oslo_concurrency.processutils [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.784 183087 DEBUG nova.virt.disk.api [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Checking if we can resize image /var/lib/nova/instances/b78b8713-97b3-4628-bfbc-a1069866c0b1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.785 183087 DEBUG oslo_concurrency.processutils [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b78b8713-97b3-4628-bfbc-a1069866c0b1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.840 183087 DEBUG oslo_concurrency.processutils [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b78b8713-97b3-4628-bfbc-a1069866c0b1/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.841 183087 DEBUG nova.virt.disk.api [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Cannot resize image /var/lib/nova/instances/b78b8713-97b3-4628-bfbc-a1069866c0b1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.842 183087 DEBUG nova.objects.instance [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lazy-loading 'migration_context' on Instance uuid b78b8713-97b3-4628-bfbc-a1069866c0b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.856 183087 DEBUG nova.virt.libvirt.driver [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.856 183087 DEBUG nova.virt.libvirt.driver [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Ensure instance console log exists: /var/lib/nova/instances/b78b8713-97b3-4628-bfbc-a1069866c0b1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.857 183087 DEBUG oslo_concurrency.lockutils [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.858 183087 DEBUG oslo_concurrency.lockutils [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:50:08 compute-1 nova_compute[183083]: 2026-01-26 08:50:08.858 183087 DEBUG oslo_concurrency.lockutils [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:50:09 compute-1 nova_compute[183083]: 2026-01-26 08:50:09.252 183087 DEBUG nova.policy [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:50:09 compute-1 podman[215810]: 2026-01-26 08:50:09.806304591 +0000 UTC m=+0.070544350 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 26 08:50:10 compute-1 nova_compute[183083]: 2026-01-26 08:50:10.604 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:11 compute-1 nova_compute[183083]: 2026-01-26 08:50:11.002 183087 DEBUG nova.network.neutron [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Successfully created port: a23d1307-6eca-4c0d-8eb1-a3e2095f0af2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 08:50:11 compute-1 nova_compute[183083]: 2026-01-26 08:50:11.256 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:14 compute-1 nova_compute[183083]: 2026-01-26 08:50:14.439 183087 DEBUG nova.network.neutron [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Successfully updated port: a23d1307-6eca-4c0d-8eb1-a3e2095f0af2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:50:14 compute-1 nova_compute[183083]: 2026-01-26 08:50:14.459 183087 DEBUG oslo_concurrency.lockutils [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "refresh_cache-b78b8713-97b3-4628-bfbc-a1069866c0b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:50:14 compute-1 nova_compute[183083]: 2026-01-26 08:50:14.460 183087 DEBUG oslo_concurrency.lockutils [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquired lock "refresh_cache-b78b8713-97b3-4628-bfbc-a1069866c0b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:50:14 compute-1 nova_compute[183083]: 2026-01-26 08:50:14.460 183087 DEBUG nova.network.neutron [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:50:14 compute-1 nova_compute[183083]: 2026-01-26 08:50:14.557 183087 DEBUG nova.compute.manager [req-4c123a4a-dcd0-4fad-b64a-026ce4e61fa7 req-69b2cf24-3689-4196-a5ef-6bd9c3e275ba 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Received event network-changed-a23d1307-6eca-4c0d-8eb1-a3e2095f0af2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:50:14 compute-1 nova_compute[183083]: 2026-01-26 08:50:14.557 183087 DEBUG nova.compute.manager [req-4c123a4a-dcd0-4fad-b64a-026ce4e61fa7 req-69b2cf24-3689-4196-a5ef-6bd9c3e275ba 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Refreshing instance network info cache due to event network-changed-a23d1307-6eca-4c0d-8eb1-a3e2095f0af2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:50:14 compute-1 nova_compute[183083]: 2026-01-26 08:50:14.558 183087 DEBUG oslo_concurrency.lockutils [req-4c123a4a-dcd0-4fad-b64a-026ce4e61fa7 req-69b2cf24-3689-4196-a5ef-6bd9c3e275ba 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-b78b8713-97b3-4628-bfbc-a1069866c0b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:50:14 compute-1 nova_compute[183083]: 2026-01-26 08:50:14.805 183087 DEBUG nova.network.neutron [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:50:15 compute-1 nova_compute[183083]: 2026-01-26 08:50:15.666 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:16 compute-1 nova_compute[183083]: 2026-01-26 08:50:16.257 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:17 compute-1 nova_compute[183083]: 2026-01-26 08:50:17.578 183087 DEBUG nova.network.neutron [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Updating instance_info_cache with network_info: [{"id": "a23d1307-6eca-4c0d-8eb1-a3e2095f0af2", "address": "fa:16:3e:d0:36:b4", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa23d1307-6e", "ovs_interfaceid": "a23d1307-6eca-4c0d-8eb1-a3e2095f0af2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:50:17 compute-1 nova_compute[183083]: 2026-01-26 08:50:17.839 183087 DEBUG oslo_concurrency.lockutils [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Releasing lock "refresh_cache-b78b8713-97b3-4628-bfbc-a1069866c0b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:50:17 compute-1 nova_compute[183083]: 2026-01-26 08:50:17.840 183087 DEBUG nova.compute.manager [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Instance network_info: |[{"id": "a23d1307-6eca-4c0d-8eb1-a3e2095f0af2", "address": "fa:16:3e:d0:36:b4", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa23d1307-6e", "ovs_interfaceid": "a23d1307-6eca-4c0d-8eb1-a3e2095f0af2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:50:17 compute-1 nova_compute[183083]: 2026-01-26 08:50:17.841 183087 DEBUG oslo_concurrency.lockutils [req-4c123a4a-dcd0-4fad-b64a-026ce4e61fa7 req-69b2cf24-3689-4196-a5ef-6bd9c3e275ba 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-b78b8713-97b3-4628-bfbc-a1069866c0b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:50:17 compute-1 nova_compute[183083]: 2026-01-26 08:50:17.841 183087 DEBUG nova.network.neutron [req-4c123a4a-dcd0-4fad-b64a-026ce4e61fa7 req-69b2cf24-3689-4196-a5ef-6bd9c3e275ba 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Refreshing network info cache for port a23d1307-6eca-4c0d-8eb1-a3e2095f0af2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:50:17 compute-1 nova_compute[183083]: 2026-01-26 08:50:17.846 183087 DEBUG nova.virt.libvirt.driver [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Start _get_guest_xml network_info=[{"id": "a23d1307-6eca-4c0d-8eb1-a3e2095f0af2", "address": "fa:16:3e:d0:36:b4", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa23d1307-6e", "ovs_interfaceid": "a23d1307-6eca-4c0d-8eb1-a3e2095f0af2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T08:43:05Z,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aa88264c487a43b2855a58eb0dd042c9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T08:43:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encryption_options': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 26 08:50:17 compute-1 nova_compute[183083]: 2026-01-26 08:50:17.853 183087 WARNING nova.virt.libvirt.driver [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:50:17 compute-1 nova_compute[183083]: 2026-01-26 08:50:17.868 183087 DEBUG nova.virt.libvirt.host [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 26 08:50:17 compute-1 nova_compute[183083]: 2026-01-26 08:50:17.869 183087 DEBUG nova.virt.libvirt.host [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 26 08:50:17 compute-1 nova_compute[183083]: 2026-01-26 08:50:17.873 183087 DEBUG nova.virt.libvirt.host [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 26 08:50:17 compute-1 nova_compute[183083]: 2026-01-26 08:50:17.874 183087 DEBUG nova.virt.libvirt.host [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 26 08:50:17 compute-1 nova_compute[183083]: 2026-01-26 08:50:17.874 183087 DEBUG nova.virt.libvirt.driver [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 26 08:50:17 compute-1 nova_compute[183083]: 2026-01-26 08:50:17.874 183087 DEBUG nova.virt.hardware [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T08:43:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a7017323-cd35-470d-a718-faa6c6e97277',id=2,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T08:43:05Z,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aa88264c487a43b2855a58eb0dd042c9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T08:43:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 26 08:50:17 compute-1 nova_compute[183083]: 2026-01-26 08:50:17.875 183087 DEBUG nova.virt.hardware [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 26 08:50:17 compute-1 nova_compute[183083]: 2026-01-26 08:50:17.875 183087 DEBUG nova.virt.hardware [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 26 08:50:17 compute-1 nova_compute[183083]: 2026-01-26 08:50:17.875 183087 DEBUG nova.virt.hardware [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 26 08:50:17 compute-1 nova_compute[183083]: 2026-01-26 08:50:17.876 183087 DEBUG nova.virt.hardware [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 26 08:50:17 compute-1 nova_compute[183083]: 2026-01-26 08:50:17.876 183087 DEBUG nova.virt.hardware [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 26 08:50:17 compute-1 nova_compute[183083]: 2026-01-26 08:50:17.876 183087 DEBUG nova.virt.hardware [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 26 08:50:17 compute-1 nova_compute[183083]: 2026-01-26 08:50:17.876 183087 DEBUG nova.virt.hardware [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 26 08:50:17 compute-1 nova_compute[183083]: 2026-01-26 08:50:17.876 183087 DEBUG nova.virt.hardware [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 26 08:50:17 compute-1 nova_compute[183083]: 2026-01-26 08:50:17.877 183087 DEBUG nova.virt.hardware [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 26 08:50:17 compute-1 nova_compute[183083]: 2026-01-26 08:50:17.877 183087 DEBUG nova.virt.hardware [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 26 08:50:17 compute-1 nova_compute[183083]: 2026-01-26 08:50:17.881 183087 DEBUG nova.virt.libvirt.vif [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:50:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-889574907',display_name='tempest-server-test-889574907',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-889574907',id=37,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDY4V6+HgiJjlBp+sismnH//xIhvaHIyZ883PC4cnliN7XgVilpp1kBxLjUHPWqW57B2esyV5Ub1B2t4CQ7Ool6UJEN5MlISleL6j0D4sAsLrzbr7gNivUWbODLUMAl3Sw==',key_name='tempest-keypair-test-1986375941',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5d0c78b7cd584e4a90592d8ea01ce4ad',ramdisk_id='',reservation_id='r-4ev95600',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NetworkDefaultSecGroupTest-1876093813',owner_user_name='tempest-NetworkDefaultSecGroupTest-1876093813-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:50:08Z,user_data=None,user_id='988ebc31182f4c94813f94306e399a2d',uuid=b78b8713-97b3-4628-bfbc-a1069866c0b1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a23d1307-6eca-4c0d-8eb1-a3e2095f0af2", "address": "fa:16:3e:d0:36:b4", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa23d1307-6e", "ovs_interfaceid": "a23d1307-6eca-4c0d-8eb1-a3e2095f0af2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 26 08:50:17 compute-1 nova_compute[183083]: 2026-01-26 08:50:17.881 183087 DEBUG nova.network.os_vif_util [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Converting VIF {"id": "a23d1307-6eca-4c0d-8eb1-a3e2095f0af2", "address": "fa:16:3e:d0:36:b4", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa23d1307-6e", "ovs_interfaceid": "a23d1307-6eca-4c0d-8eb1-a3e2095f0af2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:50:17 compute-1 nova_compute[183083]: 2026-01-26 08:50:17.882 183087 DEBUG nova.network.os_vif_util [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:36:b4,bridge_name='br-int',has_traffic_filtering=True,id=a23d1307-6eca-4c0d-8eb1-a3e2095f0af2,network=Network(ce3cd186-bdaf-40d4-a276-e9139fe3dfec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa23d1307-6e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:50:17 compute-1 nova_compute[183083]: 2026-01-26 08:50:17.883 183087 DEBUG nova.objects.instance [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lazy-loading 'pci_devices' on Instance uuid b78b8713-97b3-4628-bfbc-a1069866c0b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:50:18 compute-1 nova_compute[183083]: 2026-01-26 08:50:18.293 183087 DEBUG nova.virt.libvirt.driver [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] End _get_guest_xml xml=<domain type="kvm">
Jan 26 08:50:18 compute-1 nova_compute[183083]:   <uuid>b78b8713-97b3-4628-bfbc-a1069866c0b1</uuid>
Jan 26 08:50:18 compute-1 nova_compute[183083]:   <name>instance-00000025</name>
Jan 26 08:50:18 compute-1 nova_compute[183083]:   <memory>131072</memory>
Jan 26 08:50:18 compute-1 nova_compute[183083]:   <vcpu>1</vcpu>
Jan 26 08:50:18 compute-1 nova_compute[183083]:   <metadata>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 08:50:18 compute-1 nova_compute[183083]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:       <nova:name>tempest-server-test-889574907</nova:name>
Jan 26 08:50:18 compute-1 nova_compute[183083]:       <nova:creationTime>2026-01-26 08:50:17</nova:creationTime>
Jan 26 08:50:18 compute-1 nova_compute[183083]:       <nova:flavor name="m1.nano">
Jan 26 08:50:18 compute-1 nova_compute[183083]:         <nova:memory>128</nova:memory>
Jan 26 08:50:18 compute-1 nova_compute[183083]:         <nova:disk>1</nova:disk>
Jan 26 08:50:18 compute-1 nova_compute[183083]:         <nova:swap>0</nova:swap>
Jan 26 08:50:18 compute-1 nova_compute[183083]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 08:50:18 compute-1 nova_compute[183083]:         <nova:vcpus>1</nova:vcpus>
Jan 26 08:50:18 compute-1 nova_compute[183083]:       </nova:flavor>
Jan 26 08:50:18 compute-1 nova_compute[183083]:       <nova:owner>
Jan 26 08:50:18 compute-1 nova_compute[183083]:         <nova:user uuid="988ebc31182f4c94813f94306e399a2d">tempest-NetworkDefaultSecGroupTest-1876093813-project-member</nova:user>
Jan 26 08:50:18 compute-1 nova_compute[183083]:         <nova:project uuid="5d0c78b7cd584e4a90592d8ea01ce4ad">tempest-NetworkDefaultSecGroupTest-1876093813</nova:project>
Jan 26 08:50:18 compute-1 nova_compute[183083]:       </nova:owner>
Jan 26 08:50:18 compute-1 nova_compute[183083]:       <nova:root type="image" uuid="13d1a20a-8003-4f19-aba7-ccbd9eff9b82"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:       <nova:ports>
Jan 26 08:50:18 compute-1 nova_compute[183083]:         <nova:port uuid="a23d1307-6eca-4c0d-8eb1-a3e2095f0af2">
Jan 26 08:50:18 compute-1 nova_compute[183083]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:         </nova:port>
Jan 26 08:50:18 compute-1 nova_compute[183083]:       </nova:ports>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     </nova:instance>
Jan 26 08:50:18 compute-1 nova_compute[183083]:   </metadata>
Jan 26 08:50:18 compute-1 nova_compute[183083]:   <sysinfo type="smbios">
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <system>
Jan 26 08:50:18 compute-1 nova_compute[183083]:       <entry name="manufacturer">RDO</entry>
Jan 26 08:50:18 compute-1 nova_compute[183083]:       <entry name="product">OpenStack Compute</entry>
Jan 26 08:50:18 compute-1 nova_compute[183083]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 08:50:18 compute-1 nova_compute[183083]:       <entry name="serial">b78b8713-97b3-4628-bfbc-a1069866c0b1</entry>
Jan 26 08:50:18 compute-1 nova_compute[183083]:       <entry name="uuid">b78b8713-97b3-4628-bfbc-a1069866c0b1</entry>
Jan 26 08:50:18 compute-1 nova_compute[183083]:       <entry name="family">Virtual Machine</entry>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     </system>
Jan 26 08:50:18 compute-1 nova_compute[183083]:   </sysinfo>
Jan 26 08:50:18 compute-1 nova_compute[183083]:   <os>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <boot dev="hd"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <smbios mode="sysinfo"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:   </os>
Jan 26 08:50:18 compute-1 nova_compute[183083]:   <features>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <acpi/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <apic/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <vmcoreinfo/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:   </features>
Jan 26 08:50:18 compute-1 nova_compute[183083]:   <clock offset="utc">
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <timer name="hpet" present="no"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:   </clock>
Jan 26 08:50:18 compute-1 nova_compute[183083]:   <cpu mode="host-model" match="exact">
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:   </cpu>
Jan 26 08:50:18 compute-1 nova_compute[183083]:   <devices>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <disk type="file" device="disk">
Jan 26 08:50:18 compute-1 nova_compute[183083]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/b78b8713-97b3-4628-bfbc-a1069866c0b1/disk"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:       <target dev="vda" bus="virtio"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     </disk>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <disk type="file" device="cdrom">
Jan 26 08:50:18 compute-1 nova_compute[183083]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/b78b8713-97b3-4628-bfbc-a1069866c0b1/disk.config"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:       <target dev="sda" bus="sata"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     </disk>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <interface type="ethernet">
Jan 26 08:50:18 compute-1 nova_compute[183083]:       <mac address="fa:16:3e:d0:36:b4"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:       <mtu size="1342"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:       <target dev="tapa23d1307-6e"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     </interface>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <serial type="pty">
Jan 26 08:50:18 compute-1 nova_compute[183083]:       <log file="/var/lib/nova/instances/b78b8713-97b3-4628-bfbc-a1069866c0b1/console.log" append="off"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     </serial>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <video>
Jan 26 08:50:18 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     </video>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <input type="tablet" bus="usb"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <rng model="virtio">
Jan 26 08:50:18 compute-1 nova_compute[183083]:       <backend model="random">/dev/urandom</backend>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     </rng>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <controller type="usb" index="0"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     <memballoon model="virtio">
Jan 26 08:50:18 compute-1 nova_compute[183083]:       <stats period="10"/>
Jan 26 08:50:18 compute-1 nova_compute[183083]:     </memballoon>
Jan 26 08:50:18 compute-1 nova_compute[183083]:   </devices>
Jan 26 08:50:18 compute-1 nova_compute[183083]: </domain>
Jan 26 08:50:18 compute-1 nova_compute[183083]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 26 08:50:18 compute-1 nova_compute[183083]: 2026-01-26 08:50:18.295 183087 DEBUG nova.compute.manager [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Preparing to wait for external event network-vif-plugged-a23d1307-6eca-4c0d-8eb1-a3e2095f0af2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 26 08:50:18 compute-1 nova_compute[183083]: 2026-01-26 08:50:18.296 183087 DEBUG oslo_concurrency.lockutils [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "b78b8713-97b3-4628-bfbc-a1069866c0b1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:50:18 compute-1 nova_compute[183083]: 2026-01-26 08:50:18.296 183087 DEBUG oslo_concurrency.lockutils [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "b78b8713-97b3-4628-bfbc-a1069866c0b1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:50:18 compute-1 nova_compute[183083]: 2026-01-26 08:50:18.296 183087 DEBUG oslo_concurrency.lockutils [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "b78b8713-97b3-4628-bfbc-a1069866c0b1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:50:18 compute-1 nova_compute[183083]: 2026-01-26 08:50:18.298 183087 DEBUG nova.virt.libvirt.vif [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:50:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-889574907',display_name='tempest-server-test-889574907',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-889574907',id=37,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDY4V6+HgiJjlBp+sismnH//xIhvaHIyZ883PC4cnliN7XgVilpp1kBxLjUHPWqW57B2esyV5Ub1B2t4CQ7Ool6UJEN5MlISleL6j0D4sAsLrzbr7gNivUWbODLUMAl3Sw==',key_name='tempest-keypair-test-1986375941',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5d0c78b7cd584e4a90592d8ea01ce4ad',ramdisk_id='',reservation_id='r-4ev95600',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NetworkDefaultSecGroupTest-1876093813',owner_user_name='tempest-NetworkDefaultSecGroupTest-1876093813-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:50:08Z,user_data=None,user_id='988ebc31182f4c94813f94306e399a2d',uuid=b78b8713-97b3-4628-bfbc-a1069866c0b1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a23d1307-6eca-4c0d-8eb1-a3e2095f0af2", "address": "fa:16:3e:d0:36:b4", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa23d1307-6e", "ovs_interfaceid": "a23d1307-6eca-4c0d-8eb1-a3e2095f0af2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 26 08:50:18 compute-1 nova_compute[183083]: 2026-01-26 08:50:18.298 183087 DEBUG nova.network.os_vif_util [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Converting VIF {"id": "a23d1307-6eca-4c0d-8eb1-a3e2095f0af2", "address": "fa:16:3e:d0:36:b4", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa23d1307-6e", "ovs_interfaceid": "a23d1307-6eca-4c0d-8eb1-a3e2095f0af2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:50:18 compute-1 nova_compute[183083]: 2026-01-26 08:50:18.299 183087 DEBUG nova.network.os_vif_util [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:36:b4,bridge_name='br-int',has_traffic_filtering=True,id=a23d1307-6eca-4c0d-8eb1-a3e2095f0af2,network=Network(ce3cd186-bdaf-40d4-a276-e9139fe3dfec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa23d1307-6e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:50:18 compute-1 nova_compute[183083]: 2026-01-26 08:50:18.300 183087 DEBUG os_vif [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:36:b4,bridge_name='br-int',has_traffic_filtering=True,id=a23d1307-6eca-4c0d-8eb1-a3e2095f0af2,network=Network(ce3cd186-bdaf-40d4-a276-e9139fe3dfec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa23d1307-6e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 26 08:50:18 compute-1 nova_compute[183083]: 2026-01-26 08:50:18.301 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:18 compute-1 nova_compute[183083]: 2026-01-26 08:50:18.301 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:50:18 compute-1 nova_compute[183083]: 2026-01-26 08:50:18.302 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:50:18 compute-1 nova_compute[183083]: 2026-01-26 08:50:18.306 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:18 compute-1 nova_compute[183083]: 2026-01-26 08:50:18.306 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa23d1307-6e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:50:18 compute-1 nova_compute[183083]: 2026-01-26 08:50:18.307 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa23d1307-6e, col_values=(('external_ids', {'iface-id': 'a23d1307-6eca-4c0d-8eb1-a3e2095f0af2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d0:36:b4', 'vm-uuid': 'b78b8713-97b3-4628-bfbc-a1069866c0b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:50:18 compute-1 nova_compute[183083]: 2026-01-26 08:50:18.309 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:18 compute-1 NetworkManager[55451]: <info>  [1769417418.3100] manager: (tapa23d1307-6e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/63)
Jan 26 08:50:18 compute-1 nova_compute[183083]: 2026-01-26 08:50:18.311 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 08:50:18 compute-1 nova_compute[183083]: 2026-01-26 08:50:18.318 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:18 compute-1 nova_compute[183083]: 2026-01-26 08:50:18.320 183087 INFO os_vif [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:36:b4,bridge_name='br-int',has_traffic_filtering=True,id=a23d1307-6eca-4c0d-8eb1-a3e2095f0af2,network=Network(ce3cd186-bdaf-40d4-a276-e9139fe3dfec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa23d1307-6e')
Jan 26 08:50:18 compute-1 nova_compute[183083]: 2026-01-26 08:50:18.609 183087 DEBUG nova.virt.libvirt.driver [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 08:50:18 compute-1 nova_compute[183083]: 2026-01-26 08:50:18.609 183087 DEBUG nova.virt.libvirt.driver [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 08:50:18 compute-1 nova_compute[183083]: 2026-01-26 08:50:18.610 183087 DEBUG nova.virt.libvirt.driver [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] No VIF found with MAC fa:16:3e:d0:36:b4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 26 08:50:18 compute-1 nova_compute[183083]: 2026-01-26 08:50:18.610 183087 INFO nova.virt.libvirt.driver [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Using config drive
Jan 26 08:50:19 compute-1 nova_compute[183083]: 2026-01-26 08:50:19.700 183087 INFO nova.virt.libvirt.driver [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Creating config drive at /var/lib/nova/instances/b78b8713-97b3-4628-bfbc-a1069866c0b1/disk.config
Jan 26 08:50:19 compute-1 nova_compute[183083]: 2026-01-26 08:50:19.708 183087 DEBUG oslo_concurrency.processutils [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b78b8713-97b3-4628-bfbc-a1069866c0b1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7ypn9gpw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:50:19 compute-1 nova_compute[183083]: 2026-01-26 08:50:19.839 183087 DEBUG oslo_concurrency.processutils [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b78b8713-97b3-4628-bfbc-a1069866c0b1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7ypn9gpw" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:50:19 compute-1 kernel: tapa23d1307-6e: entered promiscuous mode
Jan 26 08:50:19 compute-1 NetworkManager[55451]: <info>  [1769417419.9079] manager: (tapa23d1307-6e): new Tun device (/org/freedesktop/NetworkManager/Devices/64)
Jan 26 08:50:19 compute-1 nova_compute[183083]: 2026-01-26 08:50:19.908 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:19 compute-1 ovn_controller[95352]: 2026-01-26T08:50:19Z|00176|binding|INFO|Claiming lport a23d1307-6eca-4c0d-8eb1-a3e2095f0af2 for this chassis.
Jan 26 08:50:19 compute-1 ovn_controller[95352]: 2026-01-26T08:50:19Z|00177|binding|INFO|a23d1307-6eca-4c0d-8eb1-a3e2095f0af2: Claiming fa:16:3e:d0:36:b4 10.100.0.12
Jan 26 08:50:19 compute-1 nova_compute[183083]: 2026-01-26 08:50:19.912 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:19 compute-1 nova_compute[183083]: 2026-01-26 08:50:19.915 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:19 compute-1 systemd-udevd[215852]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 08:50:19 compute-1 systemd-machined[154360]: New machine qemu-9-instance-00000025.
Jan 26 08:50:19 compute-1 NetworkManager[55451]: <info>  [1769417419.9485] device (tapa23d1307-6e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 08:50:19 compute-1 NetworkManager[55451]: <info>  [1769417419.9493] device (tapa23d1307-6e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 08:50:19 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:19.958 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d0:36:b4 10.100.0.12'], port_security=['fa:16:3e:d0:36:b4 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'b78b8713-97b3-4628-bfbc-a1069866c0b1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ce3cd186-bdaf-40d4-a276-e9139fe3dfec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd2dbe868-e71c-4f41-b88e-0fe68163ca46', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=97b37269-bb63-498a-80e2-0f1154aea97c, chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=a23d1307-6eca-4c0d-8eb1-a3e2095f0af2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:50:19 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:19.959 104632 INFO neutron.agent.ovn.metadata.agent [-] Port a23d1307-6eca-4c0d-8eb1-a3e2095f0af2 in datapath ce3cd186-bdaf-40d4-a276-e9139fe3dfec bound to our chassis
Jan 26 08:50:19 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:19.960 104632 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ce3cd186-bdaf-40d4-a276-e9139fe3dfec
Jan 26 08:50:19 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:19.971 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[74ce8f1b-fa09-4a03-b668-c0c54d95585d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:19 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:19.971 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapce3cd186-b1 in ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 26 08:50:19 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:19.973 212483 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapce3cd186-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 26 08:50:19 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:19.973 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[3616093a-99ab-4a64-85cc-989de7bf9f5b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:19 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:19.973 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[21988cef-76a0-4987-9b53-885ce26ffb97]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:19 compute-1 nova_compute[183083]: 2026-01-26 08:50:19.980 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:19 compute-1 ovn_controller[95352]: 2026-01-26T08:50:19Z|00178|binding|INFO|Setting lport a23d1307-6eca-4c0d-8eb1-a3e2095f0af2 ovn-installed in OVS
Jan 26 08:50:19 compute-1 ovn_controller[95352]: 2026-01-26T08:50:19Z|00179|binding|INFO|Setting lport a23d1307-6eca-4c0d-8eb1-a3e2095f0af2 up in Southbound
Jan 26 08:50:19 compute-1 nova_compute[183083]: 2026-01-26 08:50:19.984 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:19 compute-1 systemd[1]: Started Virtual Machine qemu-9-instance-00000025.
Jan 26 08:50:19 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:19.987 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[eb84030a-c36d-4cb4-9ff4-7b35b896e0f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:20.001 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[8347f591-c965-4c95-bc20-3709da615096]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:20.032 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[093c14d0-1119-4b32-a4ed-cfc3b153fa52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:20 compute-1 systemd-udevd[215855]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 08:50:20 compute-1 NetworkManager[55451]: <info>  [1769417420.0395] manager: (tapce3cd186-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/65)
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:20.038 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[ca3c816b-1e1d-4a23-be8e-4848d0e50edf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:20.066 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[55c7a5da-a38f-41d3-a721-9404a5b3e2ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:20.069 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[44b97e80-32e4-4122-8a4f-6cc93d40dca1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:20 compute-1 NetworkManager[55451]: <info>  [1769417420.0953] device (tapce3cd186-b0): carrier: link connected
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:20.101 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[507383f7-132a-4357-9355-79ce1c976ae3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:20.114 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[1bb35ad5-60f2-4ccd-9f31-1b90484431cd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapce3cd186-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:90:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376070, 'reachable_time': 26456, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215887, 'error': None, 'target': 'ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:20.136 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[cb12f8dc-9169-41c5-8783-acf1a11c1052]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb5:9011'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376070, 'tstamp': 376070}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215888, 'error': None, 'target': 'ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:20.149 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[32cee584-f1fe-4907-b4f1-7e6a6b1662cd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapce3cd186-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:90:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376070, 'reachable_time': 26456, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215889, 'error': None, 'target': 'ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:20.195 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[43f19706-6efb-4754-be7a-4b603f3497d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:20.272 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[41724a4a-eb8f-4e45-95d8-b706626641c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:20.274 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapce3cd186-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:20.274 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:20.275 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapce3cd186-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:50:20 compute-1 nova_compute[183083]: 2026-01-26 08:50:20.316 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:20 compute-1 kernel: tapce3cd186-b0: entered promiscuous mode
Jan 26 08:50:20 compute-1 NetworkManager[55451]: <info>  [1769417420.3193] manager: (tapce3cd186-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Jan 26 08:50:20 compute-1 nova_compute[183083]: 2026-01-26 08:50:20.320 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:20.320 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapce3cd186-b0, col_values=(('external_ids', {'iface-id': '597f7eff-a379-4a5d-bffc-0e294ae89e61'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:50:20 compute-1 ovn_controller[95352]: 2026-01-26T08:50:20Z|00180|binding|INFO|Releasing lport 597f7eff-a379-4a5d-bffc-0e294ae89e61 from this chassis (sb_readonly=0)
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:20.323 104632 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ce3cd186-bdaf-40d4-a276-e9139fe3dfec.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ce3cd186-bdaf-40d4-a276-e9139fe3dfec.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:20.324 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[ca81d5f9-fdba-4fac-a0fa-8f5f556bdb1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:20.325 104632 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]: global
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]:     log         /dev/log local0 debug
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]:     log-tag     haproxy-metadata-proxy-ce3cd186-bdaf-40d4-a276-e9139fe3dfec
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]:     user        root
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]:     group       root
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]:     maxconn     1024
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]:     pidfile     /var/lib/neutron/external/pids/ce3cd186-bdaf-40d4-a276-e9139fe3dfec.pid.haproxy
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]:     daemon
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]: defaults
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]:     log global
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]:     mode http
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]:     option httplog
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]:     option dontlognull
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]:     option http-server-close
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]:     option forwardfor
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]:     retries                 3
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]:     timeout http-request    30s
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]:     timeout connect         30s
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]:     timeout client          32s
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]:     timeout server          32s
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]:     timeout http-keep-alive 30s
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]: listen listener
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]:     bind 169.254.169.254:80
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]:     http-request add-header X-OVN-Network-ID ce3cd186-bdaf-40d4-a276-e9139fe3dfec
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 26 08:50:20 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:20.325 104632 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec', 'env', 'PROCESS_TAG=haproxy-ce3cd186-bdaf-40d4-a276-e9139fe3dfec', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ce3cd186-bdaf-40d4-a276-e9139fe3dfec.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 26 08:50:20 compute-1 nova_compute[183083]: 2026-01-26 08:50:20.334 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:20 compute-1 nova_compute[183083]: 2026-01-26 08:50:20.355 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417420.3549986, b78b8713-97b3-4628-bfbc-a1069866c0b1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:50:20 compute-1 nova_compute[183083]: 2026-01-26 08:50:20.355 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] VM Started (Lifecycle Event)
Jan 26 08:50:20 compute-1 nova_compute[183083]: 2026-01-26 08:50:20.550 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:50:20 compute-1 nova_compute[183083]: 2026-01-26 08:50:20.556 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417420.3560984, b78b8713-97b3-4628-bfbc-a1069866c0b1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:50:20 compute-1 nova_compute[183083]: 2026-01-26 08:50:20.557 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] VM Paused (Lifecycle Event)
Jan 26 08:50:20 compute-1 nova_compute[183083]: 2026-01-26 08:50:20.582 183087 DEBUG nova.network.neutron [req-4c123a4a-dcd0-4fad-b64a-026ce4e61fa7 req-69b2cf24-3689-4196-a5ef-6bd9c3e275ba 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Updated VIF entry in instance network info cache for port a23d1307-6eca-4c0d-8eb1-a3e2095f0af2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:50:20 compute-1 nova_compute[183083]: 2026-01-26 08:50:20.583 183087 DEBUG nova.network.neutron [req-4c123a4a-dcd0-4fad-b64a-026ce4e61fa7 req-69b2cf24-3689-4196-a5ef-6bd9c3e275ba 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Updating instance_info_cache with network_info: [{"id": "a23d1307-6eca-4c0d-8eb1-a3e2095f0af2", "address": "fa:16:3e:d0:36:b4", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa23d1307-6e", "ovs_interfaceid": "a23d1307-6eca-4c0d-8eb1-a3e2095f0af2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:50:20 compute-1 nova_compute[183083]: 2026-01-26 08:50:20.669 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:20 compute-1 podman[215926]: 2026-01-26 08:50:20.764440329 +0000 UTC m=+0.045284581 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 08:50:20 compute-1 nova_compute[183083]: 2026-01-26 08:50:20.902 183087 DEBUG oslo_concurrency.lockutils [req-4c123a4a-dcd0-4fad-b64a-026ce4e61fa7 req-69b2cf24-3689-4196-a5ef-6bd9c3e275ba 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-b78b8713-97b3-4628-bfbc-a1069866c0b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:50:20 compute-1 nova_compute[183083]: 2026-01-26 08:50:20.905 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:50:20 compute-1 nova_compute[183083]: 2026-01-26 08:50:20.909 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 08:50:20 compute-1 podman[215926]: 2026-01-26 08:50:20.926934797 +0000 UTC m=+0.207779009 container create bd3aef42eab9a6f7134923b619328b44e998fc1892eb75759740b3d0e503199a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 08:50:20 compute-1 nova_compute[183083]: 2026-01-26 08:50:20.949 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 08:50:20 compute-1 systemd[1]: Started libpod-conmon-bd3aef42eab9a6f7134923b619328b44e998fc1892eb75759740b3d0e503199a.scope.
Jan 26 08:50:20 compute-1 systemd[1]: Started libcrun container.
Jan 26 08:50:21 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f74617d93f879125201a1be399173b60509b8b39ed122369669ed16773642fa9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 08:50:21 compute-1 podman[215926]: 2026-01-26 08:50:21.051374098 +0000 UTC m=+0.332218370 container init bd3aef42eab9a6f7134923b619328b44e998fc1892eb75759740b3d0e503199a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 26 08:50:21 compute-1 podman[215926]: 2026-01-26 08:50:21.058762265 +0000 UTC m=+0.339606447 container start bd3aef42eab9a6f7134923b619328b44e998fc1892eb75759740b3d0e503199a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 26 08:50:21 compute-1 neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec[215942]: [NOTICE]   (215946) : New worker (215948) forked
Jan 26 08:50:21 compute-1 neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec[215942]: [NOTICE]   (215946) : Loading success.
Jan 26 08:50:21 compute-1 nova_compute[183083]: 2026-01-26 08:50:21.152 183087 DEBUG nova.compute.manager [req-fcc70fb2-c373-4b6c-bd1e-af543a3cca11 req-12aec8f2-2517-49c1-ad25-af591bfd8cca 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Received event network-vif-plugged-a23d1307-6eca-4c0d-8eb1-a3e2095f0af2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:50:21 compute-1 nova_compute[183083]: 2026-01-26 08:50:21.153 183087 DEBUG oslo_concurrency.lockutils [req-fcc70fb2-c373-4b6c-bd1e-af543a3cca11 req-12aec8f2-2517-49c1-ad25-af591bfd8cca 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "b78b8713-97b3-4628-bfbc-a1069866c0b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:50:21 compute-1 nova_compute[183083]: 2026-01-26 08:50:21.154 183087 DEBUG oslo_concurrency.lockutils [req-fcc70fb2-c373-4b6c-bd1e-af543a3cca11 req-12aec8f2-2517-49c1-ad25-af591bfd8cca 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "b78b8713-97b3-4628-bfbc-a1069866c0b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:50:21 compute-1 nova_compute[183083]: 2026-01-26 08:50:21.155 183087 DEBUG oslo_concurrency.lockutils [req-fcc70fb2-c373-4b6c-bd1e-af543a3cca11 req-12aec8f2-2517-49c1-ad25-af591bfd8cca 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "b78b8713-97b3-4628-bfbc-a1069866c0b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:50:21 compute-1 nova_compute[183083]: 2026-01-26 08:50:21.156 183087 DEBUG nova.compute.manager [req-fcc70fb2-c373-4b6c-bd1e-af543a3cca11 req-12aec8f2-2517-49c1-ad25-af591bfd8cca 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Processing event network-vif-plugged-a23d1307-6eca-4c0d-8eb1-a3e2095f0af2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 26 08:50:21 compute-1 nova_compute[183083]: 2026-01-26 08:50:21.158 183087 DEBUG nova.compute.manager [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 26 08:50:21 compute-1 nova_compute[183083]: 2026-01-26 08:50:21.162 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417421.1618624, b78b8713-97b3-4628-bfbc-a1069866c0b1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:50:21 compute-1 nova_compute[183083]: 2026-01-26 08:50:21.162 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] VM Resumed (Lifecycle Event)
Jan 26 08:50:21 compute-1 nova_compute[183083]: 2026-01-26 08:50:21.165 183087 DEBUG nova.virt.libvirt.driver [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 26 08:50:21 compute-1 nova_compute[183083]: 2026-01-26 08:50:21.169 183087 INFO nova.virt.libvirt.driver [-] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Instance spawned successfully.
Jan 26 08:50:21 compute-1 nova_compute[183083]: 2026-01-26 08:50:21.170 183087 DEBUG nova.virt.libvirt.driver [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 26 08:50:21 compute-1 nova_compute[183083]: 2026-01-26 08:50:21.208 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:50:21 compute-1 nova_compute[183083]: 2026-01-26 08:50:21.212 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 08:50:21 compute-1 nova_compute[183083]: 2026-01-26 08:50:21.248 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 08:50:21 compute-1 nova_compute[183083]: 2026-01-26 08:50:21.251 183087 DEBUG nova.virt.libvirt.driver [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:50:21 compute-1 nova_compute[183083]: 2026-01-26 08:50:21.251 183087 DEBUG nova.virt.libvirt.driver [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:50:21 compute-1 nova_compute[183083]: 2026-01-26 08:50:21.252 183087 DEBUG nova.virt.libvirt.driver [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:50:21 compute-1 nova_compute[183083]: 2026-01-26 08:50:21.253 183087 DEBUG nova.virt.libvirt.driver [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:50:21 compute-1 nova_compute[183083]: 2026-01-26 08:50:21.254 183087 DEBUG nova.virt.libvirt.driver [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:50:21 compute-1 nova_compute[183083]: 2026-01-26 08:50:21.254 183087 DEBUG nova.virt.libvirt.driver [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:50:21 compute-1 nova_compute[183083]: 2026-01-26 08:50:21.471 183087 INFO nova.compute.manager [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Took 13.00 seconds to spawn the instance on the hypervisor.
Jan 26 08:50:21 compute-1 nova_compute[183083]: 2026-01-26 08:50:21.471 183087 DEBUG nova.compute.manager [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:50:21 compute-1 nova_compute[183083]: 2026-01-26 08:50:21.743 183087 INFO nova.compute.manager [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Took 13.68 seconds to build instance.
Jan 26 08:50:21 compute-1 sshd-session[215957]: Connection closed by 2.57.122.238 port 48156
Jan 26 08:50:21 compute-1 nova_compute[183083]: 2026-01-26 08:50:21.987 183087 DEBUG oslo_concurrency.lockutils [None req-ba11bf61-b901-46cf-9083-e78b63b4c582 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "b78b8713-97b3-4628-bfbc-a1069866c0b1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.012s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:50:23 compute-1 nova_compute[183083]: 2026-01-26 08:50:23.309 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:23 compute-1 nova_compute[183083]: 2026-01-26 08:50:23.368 183087 DEBUG nova.compute.manager [req-b0dd95ef-88f1-4e1d-9938-c73a2439f98a req-2da7befc-842c-443d-83e8-865a0cbda1d7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Received event network-vif-plugged-a23d1307-6eca-4c0d-8eb1-a3e2095f0af2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:50:23 compute-1 nova_compute[183083]: 2026-01-26 08:50:23.368 183087 DEBUG oslo_concurrency.lockutils [req-b0dd95ef-88f1-4e1d-9938-c73a2439f98a req-2da7befc-842c-443d-83e8-865a0cbda1d7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "b78b8713-97b3-4628-bfbc-a1069866c0b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:50:23 compute-1 nova_compute[183083]: 2026-01-26 08:50:23.369 183087 DEBUG oslo_concurrency.lockutils [req-b0dd95ef-88f1-4e1d-9938-c73a2439f98a req-2da7befc-842c-443d-83e8-865a0cbda1d7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "b78b8713-97b3-4628-bfbc-a1069866c0b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:50:23 compute-1 nova_compute[183083]: 2026-01-26 08:50:23.369 183087 DEBUG oslo_concurrency.lockutils [req-b0dd95ef-88f1-4e1d-9938-c73a2439f98a req-2da7befc-842c-443d-83e8-865a0cbda1d7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "b78b8713-97b3-4628-bfbc-a1069866c0b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:50:23 compute-1 nova_compute[183083]: 2026-01-26 08:50:23.369 183087 DEBUG nova.compute.manager [req-b0dd95ef-88f1-4e1d-9938-c73a2439f98a req-2da7befc-842c-443d-83e8-865a0cbda1d7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] No waiting events found dispatching network-vif-plugged-a23d1307-6eca-4c0d-8eb1-a3e2095f0af2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:50:23 compute-1 nova_compute[183083]: 2026-01-26 08:50:23.369 183087 WARNING nova.compute.manager [req-b0dd95ef-88f1-4e1d-9938-c73a2439f98a req-2da7befc-842c-443d-83e8-865a0cbda1d7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Received unexpected event network-vif-plugged-a23d1307-6eca-4c0d-8eb1-a3e2095f0af2 for instance with vm_state active and task_state None.
Jan 26 08:50:23 compute-1 nova_compute[183083]: 2026-01-26 08:50:23.582 183087 INFO nova.compute.manager [None req-81707ae1-073d-49c1-b774-ed9f46968255 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Get console output
Jan 26 08:50:23 compute-1 nova_compute[183083]: 2026-01-26 08:50:23.589 212375 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 26 08:50:24 compute-1 podman[215958]: 2026-01-26 08:50:24.819989012 +0000 UTC m=+0.086616761 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 26 08:50:24 compute-1 podman[215959]: 2026-01-26 08:50:24.845487207 +0000 UTC m=+0.107026193 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., release=1755695350, config_id=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 26 08:50:25 compute-1 nova_compute[183083]: 2026-01-26 08:50:25.155 183087 DEBUG oslo_concurrency.lockutils [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] Acquiring lock "311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:50:25 compute-1 nova_compute[183083]: 2026-01-26 08:50:25.156 183087 DEBUG oslo_concurrency.lockutils [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] Lock "311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:50:25 compute-1 nova_compute[183083]: 2026-01-26 08:50:25.182 183087 DEBUG nova.compute.manager [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:50:25 compute-1 nova_compute[183083]: 2026-01-26 08:50:25.461 183087 DEBUG oslo_concurrency.lockutils [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:50:25 compute-1 nova_compute[183083]: 2026-01-26 08:50:25.462 183087 DEBUG oslo_concurrency.lockutils [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:50:25 compute-1 nova_compute[183083]: 2026-01-26 08:50:25.467 183087 DEBUG nova.virt.hardware [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:50:25 compute-1 nova_compute[183083]: 2026-01-26 08:50:25.468 183087 INFO nova.compute.claims [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:50:25 compute-1 nova_compute[183083]: 2026-01-26 08:50:25.671 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:25 compute-1 nova_compute[183083]: 2026-01-26 08:50:25.779 183087 DEBUG nova.compute.provider_tree [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:50:25 compute-1 nova_compute[183083]: 2026-01-26 08:50:25.816 183087 DEBUG nova.scheduler.client.report [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:50:25 compute-1 nova_compute[183083]: 2026-01-26 08:50:25.947 183087 DEBUG oslo_concurrency.lockutils [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.486s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:50:25 compute-1 nova_compute[183083]: 2026-01-26 08:50:25.949 183087 DEBUG nova.compute.manager [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:50:26 compute-1 nova_compute[183083]: 2026-01-26 08:50:26.118 183087 DEBUG nova.compute.manager [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:50:26 compute-1 nova_compute[183083]: 2026-01-26 08:50:26.120 183087 DEBUG nova.network.neutron [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:50:26 compute-1 nova_compute[183083]: 2026-01-26 08:50:26.231 183087 INFO nova.virt.libvirt.driver [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:50:26 compute-1 nova_compute[183083]: 2026-01-26 08:50:26.346 183087 DEBUG nova.compute.manager [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:50:26 compute-1 nova_compute[183083]: 2026-01-26 08:50:26.985 183087 DEBUG nova.compute.manager [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:50:26 compute-1 nova_compute[183083]: 2026-01-26 08:50:26.987 183087 DEBUG nova.virt.libvirt.driver [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:50:26 compute-1 nova_compute[183083]: 2026-01-26 08:50:26.988 183087 INFO nova.virt.libvirt.driver [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Creating image(s)
Jan 26 08:50:26 compute-1 nova_compute[183083]: 2026-01-26 08:50:26.989 183087 DEBUG oslo_concurrency.lockutils [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] Acquiring lock "/var/lib/nova/instances/311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:50:26 compute-1 nova_compute[183083]: 2026-01-26 08:50:26.990 183087 DEBUG oslo_concurrency.lockutils [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] Lock "/var/lib/nova/instances/311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:50:26 compute-1 nova_compute[183083]: 2026-01-26 08:50:26.991 183087 DEBUG oslo_concurrency.lockutils [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] Lock "/var/lib/nova/instances/311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:50:26 compute-1 nova_compute[183083]: 2026-01-26 08:50:26.992 183087 DEBUG oslo_concurrency.lockutils [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:50:26 compute-1 nova_compute[183083]: 2026-01-26 08:50:26.992 183087 DEBUG oslo_concurrency.lockutils [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:50:27 compute-1 nova_compute[183083]: 2026-01-26 08:50:27.937 183087 DEBUG nova.policy [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '48b2be9734a34b56adf088cd89958861', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '76aa13cf72c74a0a94d4c07195fc44ac', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.310 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.756 183087 INFO nova.compute.manager [None req-c617d0c9-c520-4773-aa97-560ed4c0ca39 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Get console output
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.762 212375 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.948 183087 DEBUG nova.network.neutron [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Successfully updated port: 118b0f12-d7bc-4445-b42b-3d68d0a4ef01 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.974 183087 DEBUG oslo_concurrency.lockutils [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.982s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Instance failed to spawn: nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Traceback (most recent call last):
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 170, in do_image_deep_inspection
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6]     raise exception.ImageUnacceptable(
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image content does not match disk_format
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] 
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] During handling of the above exception, another exception occurred:
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] 
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Traceback (most recent call last):
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6]     yield resources
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6]     self.driver.spawn(context, instance, image_meta,
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4384, in spawn
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6]     created_instance_dir, created_disks = self._create_image(
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4786, in _create_image
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6]     created_disks = self._create_and_inject_local_root(
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4917, in _create_and_inject_local_root
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6]     self._try_fetch_image_cache(backend, fetch_func, context,
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 10963, in _try_fetch_image_cache
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6]     image.cache(fetch_func=fetch_func,
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 288, in cache
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6]     self.create_image(fetch_func_sync, base, size,
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 663, in create_image
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6]     prepare_template(target=base, *args, **kwargs)
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6]   File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6]     return f(*args, **kwargs)
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 285, in fetch_func_sync
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6]     fetch_func(target=target, *args, **kwargs)
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/utils.py", line 482, in fetch_image
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6]     images.fetch_to_raw(context, image_id, target, trusted_certs)
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 199, in fetch_to_raw
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6]     force_format = do_image_deep_inspection(img, image_href, path_tmp)
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 180, in do_image_deep_inspection
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6]     raise exception.ImageUnacceptable(
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.975 183087 ERROR nova.compute.manager [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] 
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.982 183087 DEBUG oslo_concurrency.lockutils [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] Acquiring lock "refresh_cache-311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.983 183087 DEBUG oslo_concurrency.lockutils [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] Acquired lock "refresh_cache-311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:50:28 compute-1 nova_compute[183083]: 2026-01-26 08:50:28.983 183087 DEBUG nova.network.neutron [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:50:29 compute-1 nova_compute[183083]: 2026-01-26 08:50:29.073 183087 DEBUG nova.compute.manager [req-6ef68dce-1378-4143-b4bc-2b56935256aa req-997e2ef1-9905-4056-b8ab-f8ea9146bce7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Received event network-changed-118b0f12-d7bc-4445-b42b-3d68d0a4ef01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:50:29 compute-1 nova_compute[183083]: 2026-01-26 08:50:29.074 183087 DEBUG nova.compute.manager [req-6ef68dce-1378-4143-b4bc-2b56935256aa req-997e2ef1-9905-4056-b8ab-f8ea9146bce7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Refreshing instance network info cache due to event network-changed-118b0f12-d7bc-4445-b42b-3d68d0a4ef01. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:50:29 compute-1 nova_compute[183083]: 2026-01-26 08:50:29.075 183087 DEBUG oslo_concurrency.lockutils [req-6ef68dce-1378-4143-b4bc-2b56935256aa req-997e2ef1-9905-4056-b8ab-f8ea9146bce7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:50:29 compute-1 nova_compute[183083]: 2026-01-26 08:50:29.288 183087 DEBUG nova.network.neutron [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:50:30 compute-1 nova_compute[183083]: 2026-01-26 08:50:30.672 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:30 compute-1 podman[215996]: 2026-01-26 08:50:30.830026725 +0000 UTC m=+0.081772445 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 26 08:50:30 compute-1 podman[215995]: 2026-01-26 08:50:30.85020106 +0000 UTC m=+0.107676361 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 26 08:50:30 compute-1 podman[215997]: 2026-01-26 08:50:30.85020104 +0000 UTC m=+0.099714587 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.305 183087 DEBUG nova.network.neutron [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Updating instance_info_cache with network_info: [{"id": "118b0f12-d7bc-4445-b42b-3d68d0a4ef01", "address": "fa:16:3e:b6:5e:6b", "network": {"id": "f8ddc4c2-2dc7-4739-a079-d336f292e6ff", "bridge": "br-int", "label": "tempest-MultiVlanTransparencyTest-1193143188", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76aa13cf72c74a0a94d4c07195fc44ac", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap118b0f12-d7", "ovs_interfaceid": "118b0f12-d7bc-4445-b42b-3d68d0a4ef01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.379 183087 DEBUG oslo_concurrency.lockutils [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] Releasing lock "refresh_cache-311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.380 183087 DEBUG nova.compute.manager [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Instance network_info: |[{"id": "118b0f12-d7bc-4445-b42b-3d68d0a4ef01", "address": "fa:16:3e:b6:5e:6b", "network": {"id": "f8ddc4c2-2dc7-4739-a079-d336f292e6ff", "bridge": "br-int", "label": "tempest-MultiVlanTransparencyTest-1193143188", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76aa13cf72c74a0a94d4c07195fc44ac", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap118b0f12-d7", "ovs_interfaceid": "118b0f12-d7bc-4445-b42b-3d68d0a4ef01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.380 183087 DEBUG oslo_concurrency.lockutils [req-6ef68dce-1378-4143-b4bc-2b56935256aa req-997e2ef1-9905-4056-b8ab-f8ea9146bce7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.380 183087 DEBUG nova.network.neutron [req-6ef68dce-1378-4143-b4bc-2b56935256aa req-997e2ef1-9905-4056-b8ab-f8ea9146bce7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Refreshing network info cache for port 118b0f12-d7bc-4445-b42b-3d68d0a4ef01 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.381 183087 INFO nova.compute.manager [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Terminating instance
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.382 183087 DEBUG nova.compute.manager [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.385 183087 DEBUG nova.virt.libvirt.driver [-] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.386 183087 INFO nova.virt.libvirt.driver [-] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Instance destroyed successfully.
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.386 183087 DEBUG nova.virt.libvirt.vif [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:50:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='server-tempest-MultiVlanTransparencyTest-1193143188-0',display_name='server-tempest-MultiVlanTransparencyTest-1193143188-0',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='server-tempest-multivlantransparencytest-1193143188-0',id=38,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKyNrP2fRnCBdiTs3hZjtBF5k2Vq6kXVPRft3pAtU9song+IlsxTQYoSJlVrQKcztPijuOPNxQHNIZu499qKy2Y+WuMdTEWYA3c1TUqA5rVmYQFNtnqghKnSLTA4VP+p+g==',key_name='tempest-MultiVlanTransparencyTest-1193143188',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76aa13cf72c74a0a94d4c07195fc44ac',ramdisk_id='',reservation_id='r-larrzzqt',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MultiVlanTransparencyTest-416382724',owner_user_name='tempest-MultiVlanTransparencyTest-416382724-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:50:26Z,user_data=None,user_id='48b2be9734a34b56adf088cd89958861',uuid=311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "118b0f12-d7bc-4445-b42b-3d68d0a4ef01", "address": "fa:16:3e:b6:5e:6b", "network": {"id": "f8ddc4c2-2dc7-4739-a079-d336f292e6ff", "bridge": "br-int", "label": "tempest-MultiVlanTransparencyTest-1193143188", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76aa13cf72c74a0a94d4c07195fc44ac", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap118b0f12-d7", "ovs_interfaceid": "118b0f12-d7bc-4445-b42b-3d68d0a4ef01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.387 183087 DEBUG nova.network.os_vif_util [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] Converting VIF {"id": "118b0f12-d7bc-4445-b42b-3d68d0a4ef01", "address": "fa:16:3e:b6:5e:6b", "network": {"id": "f8ddc4c2-2dc7-4739-a079-d336f292e6ff", "bridge": "br-int", "label": "tempest-MultiVlanTransparencyTest-1193143188", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76aa13cf72c74a0a94d4c07195fc44ac", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap118b0f12-d7", "ovs_interfaceid": "118b0f12-d7bc-4445-b42b-3d68d0a4ef01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.387 183087 DEBUG nova.network.os_vif_util [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b6:5e:6b,bridge_name='br-int',has_traffic_filtering=True,id=118b0f12-d7bc-4445-b42b-3d68d0a4ef01,network=Network(f8ddc4c2-2dc7-4739-a079-d336f292e6ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap118b0f12-d7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.387 183087 DEBUG os_vif [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b6:5e:6b,bridge_name='br-int',has_traffic_filtering=True,id=118b0f12-d7bc-4445-b42b-3d68d0a4ef01,network=Network(f8ddc4c2-2dc7-4739-a079-d336f292e6ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap118b0f12-d7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.389 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.389 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap118b0f12-d7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.389 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.391 183087 INFO os_vif [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b6:5e:6b,bridge_name='br-int',has_traffic_filtering=True,id=118b0f12-d7bc-4445-b42b-3d68d0a4ef01,network=Network(f8ddc4c2-2dc7-4739-a079-d336f292e6ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap118b0f12-d7')
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.392 183087 INFO nova.virt.libvirt.driver [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Deleting instance files /var/lib/nova/instances/311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6_del
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.392 183087 INFO nova.virt.libvirt.driver [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Deletion of /var/lib/nova/instances/311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6_del complete
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.442 183087 INFO nova.compute.manager [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Took 0.06 seconds to destroy the instance on the hypervisor.
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.443 183087 DEBUG nova.compute.claims [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Aborting claim: <nova.compute.claims.Claim object at 0x7f6c9821bfd0> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.443 183087 DEBUG oslo_concurrency.lockutils [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.444 183087 DEBUG oslo_concurrency.lockutils [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.578 183087 DEBUG nova.compute.provider_tree [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.593 183087 DEBUG nova.scheduler.client.report [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.654 183087 DEBUG oslo_concurrency.lockutils [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.210s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.655 183087 DEBUG nova.compute.utils [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.656 183087 ERROR nova.compute.manager [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Build of instance 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format: nova.exception.BuildAbortException: Build of instance 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.656 183087 DEBUG nova.compute.manager [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.657 183087 DEBUG nova.virt.libvirt.vif [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-26T08:50:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='server-tempest-MultiVlanTransparencyTest-1193143188-0',display_name='server-tempest-MultiVlanTransparencyTest-1193143188-0',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host=None,hostname='server-tempest-multivlantransparencytest-1193143188-0',id=38,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKyNrP2fRnCBdiTs3hZjtBF5k2Vq6kXVPRft3pAtU9song+IlsxTQYoSJlVrQKcztPijuOPNxQHNIZu499qKy2Y+WuMdTEWYA3c1TUqA5rVmYQFNtnqghKnSLTA4VP+p+g==',key_name='tempest-MultiVlanTransparencyTest-1193143188',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node=None,numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76aa13cf72c74a0a94d4c07195fc44ac',ramdisk_id='',reservation_id='r-larrzzqt',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MultiVlanTransparencyTest-416382724',owner_user_name='tempest-MultiVlanTransparencyTest-416382724-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:50:31Z,user_data=None,user_id='48b2be9734a34b56adf088cd89958861',uuid=311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "118b0f12-d7bc-4445-b42b-3d68d0a4ef01", "address": "fa:16:3e:b6:5e:6b", "network": {"id": "f8ddc4c2-2dc7-4739-a079-d336f292e6ff", "bridge": "br-int", "label": "tempest-MultiVlanTransparencyTest-1193143188", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76aa13cf72c74a0a94d4c07195fc44ac", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap118b0f12-d7", "ovs_interfaceid": "118b0f12-d7bc-4445-b42b-3d68d0a4ef01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.658 183087 DEBUG nova.network.os_vif_util [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] Converting VIF {"id": "118b0f12-d7bc-4445-b42b-3d68d0a4ef01", "address": "fa:16:3e:b6:5e:6b", "network": {"id": "f8ddc4c2-2dc7-4739-a079-d336f292e6ff", "bridge": "br-int", "label": "tempest-MultiVlanTransparencyTest-1193143188", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76aa13cf72c74a0a94d4c07195fc44ac", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap118b0f12-d7", "ovs_interfaceid": "118b0f12-d7bc-4445-b42b-3d68d0a4ef01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.658 183087 DEBUG nova.network.os_vif_util [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b6:5e:6b,bridge_name='br-int',has_traffic_filtering=True,id=118b0f12-d7bc-4445-b42b-3d68d0a4ef01,network=Network(f8ddc4c2-2dc7-4739-a079-d336f292e6ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap118b0f12-d7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.659 183087 DEBUG os_vif [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b6:5e:6b,bridge_name='br-int',has_traffic_filtering=True,id=118b0f12-d7bc-4445-b42b-3d68d0a4ef01,network=Network(f8ddc4c2-2dc7-4739-a079-d336f292e6ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap118b0f12-d7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.660 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.660 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap118b0f12-d7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.661 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.664 183087 INFO os_vif [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b6:5e:6b,bridge_name='br-int',has_traffic_filtering=True,id=118b0f12-d7bc-4445-b42b-3d68d0a4ef01,network=Network(f8ddc4c2-2dc7-4739-a079-d336f292e6ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap118b0f12-d7')
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.665 183087 DEBUG nova.compute.manager [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.665 183087 DEBUG nova.compute.manager [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:50:31 compute-1 nova_compute[183083]: 2026-01-26 08:50:31.665 183087 DEBUG nova.network.neutron [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:50:31 compute-1 ovn_controller[95352]: 2026-01-26T08:50:31Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d0:36:b4 10.100.0.12
Jan 26 08:50:31 compute-1 ovn_controller[95352]: 2026-01-26T08:50:31Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d0:36:b4 10.100.0.12
Jan 26 08:50:33 compute-1 nova_compute[183083]: 2026-01-26 08:50:33.312 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:33 compute-1 nova_compute[183083]: 2026-01-26 08:50:33.783 183087 DEBUG nova.network.neutron [req-6ef68dce-1378-4143-b4bc-2b56935256aa req-997e2ef1-9905-4056-b8ab-f8ea9146bce7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Updated VIF entry in instance network info cache for port 118b0f12-d7bc-4445-b42b-3d68d0a4ef01. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:50:33 compute-1 nova_compute[183083]: 2026-01-26 08:50:33.783 183087 DEBUG nova.network.neutron [req-6ef68dce-1378-4143-b4bc-2b56935256aa req-997e2ef1-9905-4056-b8ab-f8ea9146bce7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Updating instance_info_cache with network_info: [{"id": "118b0f12-d7bc-4445-b42b-3d68d0a4ef01", "address": "fa:16:3e:b6:5e:6b", "network": {"id": "f8ddc4c2-2dc7-4739-a079-d336f292e6ff", "bridge": "br-int", "label": "tempest-MultiVlanTransparencyTest-1193143188", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76aa13cf72c74a0a94d4c07195fc44ac", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap118b0f12-d7", "ovs_interfaceid": "118b0f12-d7bc-4445-b42b-3d68d0a4ef01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:50:33 compute-1 nova_compute[183083]: 2026-01-26 08:50:33.811 183087 DEBUG oslo_concurrency.lockutils [req-6ef68dce-1378-4143-b4bc-2b56935256aa req-997e2ef1-9905-4056-b8ab-f8ea9146bce7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:50:33 compute-1 nova_compute[183083]: 2026-01-26 08:50:33.968 183087 INFO nova.compute.manager [None req-a3ad1539-00a8-4330-abce-52f78fd48254 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Get console output
Jan 26 08:50:33 compute-1 nova_compute[183083]: 2026-01-26 08:50:33.973 212375 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 26 08:50:34 compute-1 nova_compute[183083]: 2026-01-26 08:50:34.121 183087 DEBUG nova.network.neutron [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:50:34 compute-1 nova_compute[183083]: 2026-01-26 08:50:34.137 183087 INFO nova.compute.manager [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] [instance: 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6] Took 2.47 seconds to deallocate network for instance.
Jan 26 08:50:34 compute-1 nova_compute[183083]: 2026-01-26 08:50:34.314 183087 INFO nova.scheduler.client.report [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] Deleted allocations for instance 311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6
Jan 26 08:50:34 compute-1 nova_compute[183083]: 2026-01-26 08:50:34.315 183087 DEBUG oslo_concurrency.lockutils [None req-832dc688-1513-4ddb-8872-1d0ee8f81193 48b2be9734a34b56adf088cd89958861 76aa13cf72c74a0a94d4c07195fc44ac - - default default] Lock "311ef0ba-1e4b-4ac2-b607-5b2bdaa468a6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:50:34 compute-1 nova_compute[183083]: 2026-01-26 08:50:34.994 183087 DEBUG oslo_concurrency.lockutils [None req-5e8fd9e2-0c39-4e15-91ba-50f5420b4c4f 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "b78b8713-97b3-4628-bfbc-a1069866c0b1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:50:34 compute-1 nova_compute[183083]: 2026-01-26 08:50:34.994 183087 DEBUG oslo_concurrency.lockutils [None req-5e8fd9e2-0c39-4e15-91ba-50f5420b4c4f 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "b78b8713-97b3-4628-bfbc-a1069866c0b1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:50:34 compute-1 nova_compute[183083]: 2026-01-26 08:50:34.995 183087 DEBUG oslo_concurrency.lockutils [None req-5e8fd9e2-0c39-4e15-91ba-50f5420b4c4f 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "b78b8713-97b3-4628-bfbc-a1069866c0b1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:50:34 compute-1 nova_compute[183083]: 2026-01-26 08:50:34.995 183087 DEBUG oslo_concurrency.lockutils [None req-5e8fd9e2-0c39-4e15-91ba-50f5420b4c4f 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "b78b8713-97b3-4628-bfbc-a1069866c0b1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:50:34 compute-1 nova_compute[183083]: 2026-01-26 08:50:34.996 183087 DEBUG oslo_concurrency.lockutils [None req-5e8fd9e2-0c39-4e15-91ba-50f5420b4c4f 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "b78b8713-97b3-4628-bfbc-a1069866c0b1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:50:34 compute-1 nova_compute[183083]: 2026-01-26 08:50:34.998 183087 INFO nova.compute.manager [None req-5e8fd9e2-0c39-4e15-91ba-50f5420b4c4f 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Terminating instance
Jan 26 08:50:35 compute-1 nova_compute[183083]: 2026-01-26 08:50:34.999 183087 DEBUG nova.compute.manager [None req-5e8fd9e2-0c39-4e15-91ba-50f5420b4c4f 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:50:35 compute-1 kernel: tapa23d1307-6e (unregistering): left promiscuous mode
Jan 26 08:50:35 compute-1 NetworkManager[55451]: <info>  [1769417435.0327] device (tapa23d1307-6e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 08:50:35 compute-1 ovn_controller[95352]: 2026-01-26T08:50:35Z|00181|binding|INFO|Releasing lport a23d1307-6eca-4c0d-8eb1-a3e2095f0af2 from this chassis (sb_readonly=0)
Jan 26 08:50:35 compute-1 ovn_controller[95352]: 2026-01-26T08:50:35Z|00182|binding|INFO|Setting lport a23d1307-6eca-4c0d-8eb1-a3e2095f0af2 down in Southbound
Jan 26 08:50:35 compute-1 ovn_controller[95352]: 2026-01-26T08:50:35Z|00183|binding|INFO|Removing iface tapa23d1307-6e ovn-installed in OVS
Jan 26 08:50:35 compute-1 nova_compute[183083]: 2026-01-26 08:50:35.044 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:35.055 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d0:36:b4 10.100.0.12'], port_security=['fa:16:3e:d0:36:b4 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'b78b8713-97b3-4628-bfbc-a1069866c0b1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ce3cd186-bdaf-40d4-a276-e9139fe3dfec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd2dbe868-e71c-4f41-b88e-0fe68163ca46', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=97b37269-bb63-498a-80e2-0f1154aea97c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=a23d1307-6eca-4c0d-8eb1-a3e2095f0af2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:50:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:35.058 104632 INFO neutron.agent.ovn.metadata.agent [-] Port a23d1307-6eca-4c0d-8eb1-a3e2095f0af2 in datapath ce3cd186-bdaf-40d4-a276-e9139fe3dfec unbound from our chassis
Jan 26 08:50:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:35.061 104632 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ce3cd186-bdaf-40d4-a276-e9139fe3dfec, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 26 08:50:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:35.063 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[e674b826-b287-44ad-a78d-69853a63287c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:35.064 104632 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec namespace which is not needed anymore
Jan 26 08:50:35 compute-1 nova_compute[183083]: 2026-01-26 08:50:35.073 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:35 compute-1 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000025.scope: Deactivated successfully.
Jan 26 08:50:35 compute-1 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000025.scope: Consumed 11.064s CPU time.
Jan 26 08:50:35 compute-1 systemd-machined[154360]: Machine qemu-9-instance-00000025 terminated.
Jan 26 08:50:35 compute-1 neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec[215942]: [NOTICE]   (215946) : haproxy version is 2.8.14-c23fe91
Jan 26 08:50:35 compute-1 neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec[215942]: [NOTICE]   (215946) : path to executable is /usr/sbin/haproxy
Jan 26 08:50:35 compute-1 neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec[215942]: [WARNING]  (215946) : Exiting Master process...
Jan 26 08:50:35 compute-1 neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec[215942]: [ALERT]    (215946) : Current worker (215948) exited with code 143 (Terminated)
Jan 26 08:50:35 compute-1 neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec[215942]: [WARNING]  (215946) : All workers exited. Exiting... (0)
Jan 26 08:50:35 compute-1 systemd[1]: libpod-bd3aef42eab9a6f7134923b619328b44e998fc1892eb75759740b3d0e503199a.scope: Deactivated successfully.
Jan 26 08:50:35 compute-1 podman[216111]: 2026-01-26 08:50:35.264277859 +0000 UTC m=+0.089965854 container died bd3aef42eab9a6f7134923b619328b44e998fc1892eb75759740b3d0e503199a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 26 08:50:35 compute-1 nova_compute[183083]: 2026-01-26 08:50:35.264 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:35 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bd3aef42eab9a6f7134923b619328b44e998fc1892eb75759740b3d0e503199a-userdata-shm.mount: Deactivated successfully.
Jan 26 08:50:35 compute-1 systemd[1]: var-lib-containers-storage-overlay-f74617d93f879125201a1be399173b60509b8b39ed122369669ed16773642fa9-merged.mount: Deactivated successfully.
Jan 26 08:50:35 compute-1 podman[216111]: 2026-01-26 08:50:35.300638579 +0000 UTC m=+0.126326564 container cleanup bd3aef42eab9a6f7134923b619328b44e998fc1892eb75759740b3d0e503199a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 08:50:35 compute-1 nova_compute[183083]: 2026-01-26 08:50:35.312 183087 INFO nova.virt.libvirt.driver [-] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Instance destroyed successfully.
Jan 26 08:50:35 compute-1 nova_compute[183083]: 2026-01-26 08:50:35.314 183087 DEBUG nova.objects.instance [None req-5e8fd9e2-0c39-4e15-91ba-50f5420b4c4f 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lazy-loading 'resources' on Instance uuid b78b8713-97b3-4628-bfbc-a1069866c0b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:50:35 compute-1 systemd[1]: libpod-conmon-bd3aef42eab9a6f7134923b619328b44e998fc1892eb75759740b3d0e503199a.scope: Deactivated successfully.
Jan 26 08:50:35 compute-1 nova_compute[183083]: 2026-01-26 08:50:35.330 183087 DEBUG nova.virt.libvirt.vif [None req-5e8fd9e2-0c39-4e15-91ba-50f5420b4c4f 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T08:50:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-889574907',display_name='tempest-server-test-889574907',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-889574907',id=37,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDY4V6+HgiJjlBp+sismnH//xIhvaHIyZ883PC4cnliN7XgVilpp1kBxLjUHPWqW57B2esyV5Ub1B2t4CQ7Ool6UJEN5MlISleL6j0D4sAsLrzbr7gNivUWbODLUMAl3Sw==',key_name='tempest-keypair-test-1986375941',keypairs=<?>,launch_index=0,launched_at=2026-01-26T08:50:21Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5d0c78b7cd584e4a90592d8ea01ce4ad',ramdisk_id='',reservation_id='r-4ev95600',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_
hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-NetworkDefaultSecGroupTest-1876093813',owner_user_name='tempest-NetworkDefaultSecGroupTest-1876093813-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T08:50:21Z,user_data=None,user_id='988ebc31182f4c94813f94306e399a2d',uuid=b78b8713-97b3-4628-bfbc-a1069866c0b1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a23d1307-6eca-4c0d-8eb1-a3e2095f0af2", "address": "fa:16:3e:d0:36:b4", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa23d1307-6e", "ovs_interfaceid": "a23d1307-6eca-4c0d-8eb1-a3e2095f0af2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:50:35 compute-1 nova_compute[183083]: 2026-01-26 08:50:35.330 183087 DEBUG nova.network.os_vif_util [None req-5e8fd9e2-0c39-4e15-91ba-50f5420b4c4f 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Converting VIF {"id": "a23d1307-6eca-4c0d-8eb1-a3e2095f0af2", "address": "fa:16:3e:d0:36:b4", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa23d1307-6e", "ovs_interfaceid": "a23d1307-6eca-4c0d-8eb1-a3e2095f0af2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:50:35 compute-1 nova_compute[183083]: 2026-01-26 08:50:35.331 183087 DEBUG nova.network.os_vif_util [None req-5e8fd9e2-0c39-4e15-91ba-50f5420b4c4f 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:36:b4,bridge_name='br-int',has_traffic_filtering=True,id=a23d1307-6eca-4c0d-8eb1-a3e2095f0af2,network=Network(ce3cd186-bdaf-40d4-a276-e9139fe3dfec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa23d1307-6e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:50:35 compute-1 nova_compute[183083]: 2026-01-26 08:50:35.331 183087 DEBUG os_vif [None req-5e8fd9e2-0c39-4e15-91ba-50f5420b4c4f 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:36:b4,bridge_name='br-int',has_traffic_filtering=True,id=a23d1307-6eca-4c0d-8eb1-a3e2095f0af2,network=Network(ce3cd186-bdaf-40d4-a276-e9139fe3dfec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa23d1307-6e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:50:35 compute-1 nova_compute[183083]: 2026-01-26 08:50:35.333 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:35 compute-1 nova_compute[183083]: 2026-01-26 08:50:35.333 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa23d1307-6e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:50:35 compute-1 nova_compute[183083]: 2026-01-26 08:50:35.335 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:35 compute-1 nova_compute[183083]: 2026-01-26 08:50:35.337 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:35 compute-1 nova_compute[183083]: 2026-01-26 08:50:35.340 183087 INFO os_vif [None req-5e8fd9e2-0c39-4e15-91ba-50f5420b4c4f 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:36:b4,bridge_name='br-int',has_traffic_filtering=True,id=a23d1307-6eca-4c0d-8eb1-a3e2095f0af2,network=Network(ce3cd186-bdaf-40d4-a276-e9139fe3dfec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa23d1307-6e')
Jan 26 08:50:35 compute-1 nova_compute[183083]: 2026-01-26 08:50:35.340 183087 INFO nova.virt.libvirt.driver [None req-5e8fd9e2-0c39-4e15-91ba-50f5420b4c4f 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Deleting instance files /var/lib/nova/instances/b78b8713-97b3-4628-bfbc-a1069866c0b1_del
Jan 26 08:50:35 compute-1 nova_compute[183083]: 2026-01-26 08:50:35.341 183087 INFO nova.virt.libvirt.driver [None req-5e8fd9e2-0c39-4e15-91ba-50f5420b4c4f 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Deletion of /var/lib/nova/instances/b78b8713-97b3-4628-bfbc-a1069866c0b1_del complete
Jan 26 08:50:35 compute-1 podman[216155]: 2026-01-26 08:50:35.372821154 +0000 UTC m=+0.050125017 container remove bd3aef42eab9a6f7134923b619328b44e998fc1892eb75759740b3d0e503199a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 26 08:50:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:35.377 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[6ec947b8-87ee-4db2-9f03-0f5c94aa78dd]: (4, ('Mon Jan 26 08:50:35 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec (bd3aef42eab9a6f7134923b619328b44e998fc1892eb75759740b3d0e503199a)\nbd3aef42eab9a6f7134923b619328b44e998fc1892eb75759740b3d0e503199a\nMon Jan 26 08:50:35 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec (bd3aef42eab9a6f7134923b619328b44e998fc1892eb75759740b3d0e503199a)\nbd3aef42eab9a6f7134923b619328b44e998fc1892eb75759740b3d0e503199a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:35.378 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[8c028b82-0afc-434e-805c-0b53bf3a4b7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:35.379 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapce3cd186-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:50:35 compute-1 kernel: tapce3cd186-b0: left promiscuous mode
Jan 26 08:50:35 compute-1 nova_compute[183083]: 2026-01-26 08:50:35.380 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:35 compute-1 nova_compute[183083]: 2026-01-26 08:50:35.395 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:35.396 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[0ad1aa6c-91da-4c90-9c78-85a27fc64c2c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:35.411 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[810cc85d-2001-4b4b-97ba-312e8f2e9f40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:35.412 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[d6d6db4f-ca8a-40e6-945c-eb09dcca2b6d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:35.427 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[5fa16ae1-0621-4107-be61-e7c71e1ffed9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376063, 'reachable_time': 44598, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216171, 'error': None, 'target': 'ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:35 compute-1 systemd[1]: run-netns-ovnmeta\x2dce3cd186\x2dbdaf\x2d40d4\x2da276\x2de9139fe3dfec.mount: Deactivated successfully.
Jan 26 08:50:35 compute-1 nova_compute[183083]: 2026-01-26 08:50:35.430 183087 INFO nova.compute.manager [None req-5e8fd9e2-0c39-4e15-91ba-50f5420b4c4f 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Took 0.43 seconds to destroy the instance on the hypervisor.
Jan 26 08:50:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:35.430 105024 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 26 08:50:35 compute-1 nova_compute[183083]: 2026-01-26 08:50:35.431 183087 DEBUG oslo.service.loopingcall [None req-5e8fd9e2-0c39-4e15-91ba-50f5420b4c4f 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 26 08:50:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:35.430 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[942918dc-be9e-4eef-b5bd-045f72d561ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:35 compute-1 nova_compute[183083]: 2026-01-26 08:50:35.432 183087 DEBUG nova.compute.manager [-] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:50:35 compute-1 nova_compute[183083]: 2026-01-26 08:50:35.432 183087 DEBUG nova.network.neutron [-] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:50:35 compute-1 nova_compute[183083]: 2026-01-26 08:50:35.622 183087 DEBUG nova.compute.manager [req-62c7763a-cedb-40c7-ae2b-d99376f61604 req-d616a470-116b-4f39-924e-3d79be059a26 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Received event network-vif-unplugged-a23d1307-6eca-4c0d-8eb1-a3e2095f0af2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:50:35 compute-1 nova_compute[183083]: 2026-01-26 08:50:35.623 183087 DEBUG oslo_concurrency.lockutils [req-62c7763a-cedb-40c7-ae2b-d99376f61604 req-d616a470-116b-4f39-924e-3d79be059a26 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "b78b8713-97b3-4628-bfbc-a1069866c0b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:50:35 compute-1 nova_compute[183083]: 2026-01-26 08:50:35.623 183087 DEBUG oslo_concurrency.lockutils [req-62c7763a-cedb-40c7-ae2b-d99376f61604 req-d616a470-116b-4f39-924e-3d79be059a26 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "b78b8713-97b3-4628-bfbc-a1069866c0b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:50:35 compute-1 nova_compute[183083]: 2026-01-26 08:50:35.624 183087 DEBUG oslo_concurrency.lockutils [req-62c7763a-cedb-40c7-ae2b-d99376f61604 req-d616a470-116b-4f39-924e-3d79be059a26 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "b78b8713-97b3-4628-bfbc-a1069866c0b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:50:35 compute-1 nova_compute[183083]: 2026-01-26 08:50:35.624 183087 DEBUG nova.compute.manager [req-62c7763a-cedb-40c7-ae2b-d99376f61604 req-d616a470-116b-4f39-924e-3d79be059a26 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] No waiting events found dispatching network-vif-unplugged-a23d1307-6eca-4c0d-8eb1-a3e2095f0af2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:50:35 compute-1 nova_compute[183083]: 2026-01-26 08:50:35.624 183087 DEBUG nova.compute.manager [req-62c7763a-cedb-40c7-ae2b-d99376f61604 req-d616a470-116b-4f39-924e-3d79be059a26 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Received event network-vif-unplugged-a23d1307-6eca-4c0d-8eb1-a3e2095f0af2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 26 08:50:35 compute-1 nova_compute[183083]: 2026-01-26 08:50:35.674 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:36 compute-1 nova_compute[183083]: 2026-01-26 08:50:36.255 183087 DEBUG nova.network.neutron [-] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:50:36 compute-1 nova_compute[183083]: 2026-01-26 08:50:36.276 183087 INFO nova.compute.manager [-] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Took 0.84 seconds to deallocate network for instance.
Jan 26 08:50:36 compute-1 nova_compute[183083]: 2026-01-26 08:50:36.328 183087 DEBUG oslo_concurrency.lockutils [None req-5e8fd9e2-0c39-4e15-91ba-50f5420b4c4f 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:50:36 compute-1 nova_compute[183083]: 2026-01-26 08:50:36.329 183087 DEBUG oslo_concurrency.lockutils [None req-5e8fd9e2-0c39-4e15-91ba-50f5420b4c4f 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:50:36 compute-1 nova_compute[183083]: 2026-01-26 08:50:36.378 183087 DEBUG nova.compute.provider_tree [None req-5e8fd9e2-0c39-4e15-91ba-50f5420b4c4f 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:50:36 compute-1 nova_compute[183083]: 2026-01-26 08:50:36.410 183087 DEBUG nova.scheduler.client.report [None req-5e8fd9e2-0c39-4e15-91ba-50f5420b4c4f 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:50:36 compute-1 nova_compute[183083]: 2026-01-26 08:50:36.435 183087 DEBUG oslo_concurrency.lockutils [None req-5e8fd9e2-0c39-4e15-91ba-50f5420b4c4f 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:50:36 compute-1 nova_compute[183083]: 2026-01-26 08:50:36.469 183087 INFO nova.scheduler.client.report [None req-5e8fd9e2-0c39-4e15-91ba-50f5420b4c4f 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Deleted allocations for instance b78b8713-97b3-4628-bfbc-a1069866c0b1
Jan 26 08:50:36 compute-1 nova_compute[183083]: 2026-01-26 08:50:36.551 183087 DEBUG oslo_concurrency.lockutils [None req-5e8fd9e2-0c39-4e15-91ba-50f5420b4c4f 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "b78b8713-97b3-4628-bfbc-a1069866c0b1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.557s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:50:37 compute-1 nova_compute[183083]: 2026-01-26 08:50:37.779 183087 DEBUG nova.compute.manager [req-fcfe025d-9bed-4b2b-b17d-f2315ab02749 req-51fdf152-8a8b-45cc-a344-cceb7e721bea 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Received event network-vif-plugged-a23d1307-6eca-4c0d-8eb1-a3e2095f0af2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:50:37 compute-1 nova_compute[183083]: 2026-01-26 08:50:37.780 183087 DEBUG oslo_concurrency.lockutils [req-fcfe025d-9bed-4b2b-b17d-f2315ab02749 req-51fdf152-8a8b-45cc-a344-cceb7e721bea 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "b78b8713-97b3-4628-bfbc-a1069866c0b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:50:37 compute-1 nova_compute[183083]: 2026-01-26 08:50:37.780 183087 DEBUG oslo_concurrency.lockutils [req-fcfe025d-9bed-4b2b-b17d-f2315ab02749 req-51fdf152-8a8b-45cc-a344-cceb7e721bea 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "b78b8713-97b3-4628-bfbc-a1069866c0b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:50:37 compute-1 nova_compute[183083]: 2026-01-26 08:50:37.781 183087 DEBUG oslo_concurrency.lockutils [req-fcfe025d-9bed-4b2b-b17d-f2315ab02749 req-51fdf152-8a8b-45cc-a344-cceb7e721bea 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "b78b8713-97b3-4628-bfbc-a1069866c0b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:50:37 compute-1 nova_compute[183083]: 2026-01-26 08:50:37.781 183087 DEBUG nova.compute.manager [req-fcfe025d-9bed-4b2b-b17d-f2315ab02749 req-51fdf152-8a8b-45cc-a344-cceb7e721bea 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] No waiting events found dispatching network-vif-plugged-a23d1307-6eca-4c0d-8eb1-a3e2095f0af2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:50:37 compute-1 nova_compute[183083]: 2026-01-26 08:50:37.782 183087 WARNING nova.compute.manager [req-fcfe025d-9bed-4b2b-b17d-f2315ab02749 req-51fdf152-8a8b-45cc-a344-cceb7e721bea 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Received unexpected event network-vif-plugged-a23d1307-6eca-4c0d-8eb1-a3e2095f0af2 for instance with vm_state deleted and task_state None.
Jan 26 08:50:37 compute-1 nova_compute[183083]: 2026-01-26 08:50:37.782 183087 DEBUG nova.compute.manager [req-fcfe025d-9bed-4b2b-b17d-f2315ab02749 req-51fdf152-8a8b-45cc-a344-cceb7e721bea 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Received event network-vif-deleted-a23d1307-6eca-4c0d-8eb1-a3e2095f0af2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:50:40 compute-1 nova_compute[183083]: 2026-01-26 08:50:40.337 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:40 compute-1 nova_compute[183083]: 2026-01-26 08:50:40.706 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:40 compute-1 podman[216172]: 2026-01-26 08:50:40.811233774 +0000 UTC m=+0.069183281 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 26 08:50:44 compute-1 nova_compute[183083]: 2026-01-26 08:50:44.647 183087 DEBUG oslo_concurrency.lockutils [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "70d16f06-3d0a-454f-a1dd-87ce77ed8582" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:50:44 compute-1 nova_compute[183083]: 2026-01-26 08:50:44.647 183087 DEBUG oslo_concurrency.lockutils [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "70d16f06-3d0a-454f-a1dd-87ce77ed8582" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:50:44 compute-1 nova_compute[183083]: 2026-01-26 08:50:44.671 183087 DEBUG nova.compute.manager [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:50:44 compute-1 nova_compute[183083]: 2026-01-26 08:50:44.793 183087 DEBUG oslo_concurrency.lockutils [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:50:44 compute-1 nova_compute[183083]: 2026-01-26 08:50:44.793 183087 DEBUG oslo_concurrency.lockutils [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:50:44 compute-1 nova_compute[183083]: 2026-01-26 08:50:44.803 183087 DEBUG nova.virt.hardware [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:50:44 compute-1 nova_compute[183083]: 2026-01-26 08:50:44.804 183087 INFO nova.compute.claims [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:50:44 compute-1 nova_compute[183083]: 2026-01-26 08:50:44.918 183087 DEBUG nova.compute.provider_tree [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:50:44 compute-1 nova_compute[183083]: 2026-01-26 08:50:44.932 183087 DEBUG nova.scheduler.client.report [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:50:44 compute-1 nova_compute[183083]: 2026-01-26 08:50:44.958 183087 DEBUG oslo_concurrency.lockutils [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:50:44 compute-1 nova_compute[183083]: 2026-01-26 08:50:44.959 183087 DEBUG nova.compute.manager [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:50:45 compute-1 nova_compute[183083]: 2026-01-26 08:50:45.008 183087 DEBUG nova.compute.manager [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:50:45 compute-1 nova_compute[183083]: 2026-01-26 08:50:45.009 183087 DEBUG nova.network.neutron [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:50:45 compute-1 nova_compute[183083]: 2026-01-26 08:50:45.031 183087 INFO nova.virt.libvirt.driver [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:50:45 compute-1 nova_compute[183083]: 2026-01-26 08:50:45.065 183087 DEBUG nova.compute.manager [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:50:45 compute-1 nova_compute[183083]: 2026-01-26 08:50:45.169 183087 DEBUG nova.compute.manager [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:50:45 compute-1 nova_compute[183083]: 2026-01-26 08:50:45.171 183087 DEBUG nova.virt.libvirt.driver [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:50:45 compute-1 nova_compute[183083]: 2026-01-26 08:50:45.172 183087 INFO nova.virt.libvirt.driver [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Creating image(s)
Jan 26 08:50:45 compute-1 nova_compute[183083]: 2026-01-26 08:50:45.173 183087 DEBUG oslo_concurrency.lockutils [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "/var/lib/nova/instances/70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:50:45 compute-1 nova_compute[183083]: 2026-01-26 08:50:45.173 183087 DEBUG oslo_concurrency.lockutils [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "/var/lib/nova/instances/70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:50:45 compute-1 nova_compute[183083]: 2026-01-26 08:50:45.174 183087 DEBUG oslo_concurrency.lockutils [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "/var/lib/nova/instances/70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:50:45 compute-1 nova_compute[183083]: 2026-01-26 08:50:45.204 183087 DEBUG oslo_concurrency.processutils [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:50:45 compute-1 nova_compute[183083]: 2026-01-26 08:50:45.291 183087 DEBUG nova.policy [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:50:45 compute-1 nova_compute[183083]: 2026-01-26 08:50:45.298 183087 DEBUG oslo_concurrency.processutils [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:50:45 compute-1 nova_compute[183083]: 2026-01-26 08:50:45.300 183087 DEBUG oslo_concurrency.lockutils [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:50:45 compute-1 nova_compute[183083]: 2026-01-26 08:50:45.301 183087 DEBUG oslo_concurrency.lockutils [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:50:45 compute-1 nova_compute[183083]: 2026-01-26 08:50:45.326 183087 DEBUG oslo_concurrency.processutils [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:50:45 compute-1 nova_compute[183083]: 2026-01-26 08:50:45.389 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:45 compute-1 nova_compute[183083]: 2026-01-26 08:50:45.414 183087 DEBUG oslo_concurrency.processutils [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:50:45 compute-1 nova_compute[183083]: 2026-01-26 08:50:45.415 183087 DEBUG oslo_concurrency.processutils [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:50:45 compute-1 nova_compute[183083]: 2026-01-26 08:50:45.470 183087 DEBUG oslo_concurrency.processutils [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk 1073741824" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:50:45 compute-1 nova_compute[183083]: 2026-01-26 08:50:45.471 183087 DEBUG oslo_concurrency.lockutils [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:50:45 compute-1 nova_compute[183083]: 2026-01-26 08:50:45.472 183087 DEBUG oslo_concurrency.processutils [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:50:45 compute-1 nova_compute[183083]: 2026-01-26 08:50:45.558 183087 DEBUG oslo_concurrency.processutils [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:50:45 compute-1 nova_compute[183083]: 2026-01-26 08:50:45.560 183087 DEBUG nova.virt.disk.api [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Checking if we can resize image /var/lib/nova/instances/70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 26 08:50:45 compute-1 nova_compute[183083]: 2026-01-26 08:50:45.560 183087 DEBUG oslo_concurrency.processutils [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:50:45 compute-1 nova_compute[183083]: 2026-01-26 08:50:45.614 183087 DEBUG oslo_concurrency.processutils [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:50:45 compute-1 nova_compute[183083]: 2026-01-26 08:50:45.616 183087 DEBUG nova.virt.disk.api [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Cannot resize image /var/lib/nova/instances/70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 26 08:50:45 compute-1 nova_compute[183083]: 2026-01-26 08:50:45.617 183087 DEBUG nova.objects.instance [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lazy-loading 'migration_context' on Instance uuid 70d16f06-3d0a-454f-a1dd-87ce77ed8582 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:50:45 compute-1 nova_compute[183083]: 2026-01-26 08:50:45.643 183087 DEBUG nova.virt.libvirt.driver [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 26 08:50:45 compute-1 nova_compute[183083]: 2026-01-26 08:50:45.644 183087 DEBUG nova.virt.libvirt.driver [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Ensure instance console log exists: /var/lib/nova/instances/70d16f06-3d0a-454f-a1dd-87ce77ed8582/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 26 08:50:45 compute-1 nova_compute[183083]: 2026-01-26 08:50:45.645 183087 DEBUG oslo_concurrency.lockutils [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:50:45 compute-1 nova_compute[183083]: 2026-01-26 08:50:45.645 183087 DEBUG oslo_concurrency.lockutils [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:50:45 compute-1 nova_compute[183083]: 2026-01-26 08:50:45.646 183087 DEBUG oslo_concurrency.lockutils [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:50:45 compute-1 nova_compute[183083]: 2026-01-26 08:50:45.709 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:48 compute-1 nova_compute[183083]: 2026-01-26 08:50:48.071 183087 DEBUG nova.network.neutron [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Successfully created port: fb4faeb1-e827-462b-8f21-36892b978052 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 08:50:49 compute-1 nova_compute[183083]: 2026-01-26 08:50:49.083 183087 DEBUG nova.network.neutron [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Successfully updated port: fb4faeb1-e827-462b-8f21-36892b978052 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:50:49 compute-1 nova_compute[183083]: 2026-01-26 08:50:49.106 183087 DEBUG oslo_concurrency.lockutils [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "refresh_cache-70d16f06-3d0a-454f-a1dd-87ce77ed8582" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:50:49 compute-1 nova_compute[183083]: 2026-01-26 08:50:49.106 183087 DEBUG oslo_concurrency.lockutils [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquired lock "refresh_cache-70d16f06-3d0a-454f-a1dd-87ce77ed8582" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:50:49 compute-1 nova_compute[183083]: 2026-01-26 08:50:49.107 183087 DEBUG nova.network.neutron [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:50:49 compute-1 nova_compute[183083]: 2026-01-26 08:50:49.373 183087 DEBUG nova.compute.manager [req-09f64d1d-80d4-40c7-a0bc-f12f87868921 req-be730314-15b2-404d-ba32-3a3ae540eefe 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Received event network-changed-fb4faeb1-e827-462b-8f21-36892b978052 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:50:49 compute-1 nova_compute[183083]: 2026-01-26 08:50:49.374 183087 DEBUG nova.compute.manager [req-09f64d1d-80d4-40c7-a0bc-f12f87868921 req-be730314-15b2-404d-ba32-3a3ae540eefe 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Refreshing instance network info cache due to event network-changed-fb4faeb1-e827-462b-8f21-36892b978052. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:50:49 compute-1 nova_compute[183083]: 2026-01-26 08:50:49.375 183087 DEBUG oslo_concurrency.lockutils [req-09f64d1d-80d4-40c7-a0bc-f12f87868921 req-be730314-15b2-404d-ba32-3a3ae540eefe 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-70d16f06-3d0a-454f-a1dd-87ce77ed8582" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:50:49 compute-1 nova_compute[183083]: 2026-01-26 08:50:49.429 183087 DEBUG nova.network.neutron [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:50:50 compute-1 nova_compute[183083]: 2026-01-26 08:50:50.310 183087 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769417435.3102286, b78b8713-97b3-4628-bfbc-a1069866c0b1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:50:50 compute-1 nova_compute[183083]: 2026-01-26 08:50:50.311 183087 INFO nova.compute.manager [-] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] VM Stopped (Lifecycle Event)
Jan 26 08:50:50 compute-1 sshd-session[216211]: Connection closed by authenticating user root 159.223.236.81 port 40574 [preauth]
Jan 26 08:50:50 compute-1 nova_compute[183083]: 2026-01-26 08:50:50.371 183087 DEBUG nova.compute.manager [None req-645a76a2-d72b-400d-ae6b-df220d8cb642 - - - - - -] [instance: b78b8713-97b3-4628-bfbc-a1069866c0b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:50:50 compute-1 nova_compute[183083]: 2026-01-26 08:50:50.391 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:50 compute-1 nova_compute[183083]: 2026-01-26 08:50:50.711 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.221 183087 DEBUG nova.network.neutron [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Updating instance_info_cache with network_info: [{"id": "fb4faeb1-e827-462b-8f21-36892b978052", "address": "fa:16:3e:c5:58:7b", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb4faeb1-e8", "ovs_interfaceid": "fb4faeb1-e827-462b-8f21-36892b978052", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.253 183087 DEBUG oslo_concurrency.lockutils [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Releasing lock "refresh_cache-70d16f06-3d0a-454f-a1dd-87ce77ed8582" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.254 183087 DEBUG nova.compute.manager [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Instance network_info: |[{"id": "fb4faeb1-e827-462b-8f21-36892b978052", "address": "fa:16:3e:c5:58:7b", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb4faeb1-e8", "ovs_interfaceid": "fb4faeb1-e827-462b-8f21-36892b978052", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.254 183087 DEBUG oslo_concurrency.lockutils [req-09f64d1d-80d4-40c7-a0bc-f12f87868921 req-be730314-15b2-404d-ba32-3a3ae540eefe 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-70d16f06-3d0a-454f-a1dd-87ce77ed8582" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.254 183087 DEBUG nova.network.neutron [req-09f64d1d-80d4-40c7-a0bc-f12f87868921 req-be730314-15b2-404d-ba32-3a3ae540eefe 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Refreshing network info cache for port fb4faeb1-e827-462b-8f21-36892b978052 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.257 183087 DEBUG nova.virt.libvirt.driver [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Start _get_guest_xml network_info=[{"id": "fb4faeb1-e827-462b-8f21-36892b978052", "address": "fa:16:3e:c5:58:7b", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb4faeb1-e8", "ovs_interfaceid": "fb4faeb1-e827-462b-8f21-36892b978052", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T08:43:05Z,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aa88264c487a43b2855a58eb0dd042c9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T08:43:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encryption_options': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.268 183087 WARNING nova.virt.libvirt.driver [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.283 183087 DEBUG nova.virt.libvirt.host [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.284 183087 DEBUG nova.virt.libvirt.host [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.288 183087 DEBUG nova.virt.libvirt.host [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.289 183087 DEBUG nova.virt.libvirt.host [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.290 183087 DEBUG nova.virt.libvirt.driver [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.290 183087 DEBUG nova.virt.hardware [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T08:43:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a7017323-cd35-470d-a718-faa6c6e97277',id=2,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T08:43:05Z,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aa88264c487a43b2855a58eb0dd042c9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T08:43:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.291 183087 DEBUG nova.virt.hardware [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.291 183087 DEBUG nova.virt.hardware [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.291 183087 DEBUG nova.virt.hardware [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.291 183087 DEBUG nova.virt.hardware [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.291 183087 DEBUG nova.virt.hardware [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.292 183087 DEBUG nova.virt.hardware [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.292 183087 DEBUG nova.virt.hardware [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.292 183087 DEBUG nova.virt.hardware [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.292 183087 DEBUG nova.virt.hardware [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.292 183087 DEBUG nova.virt.hardware [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.299 183087 DEBUG nova.virt.libvirt.vif [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:50:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-2078215968',display_name='tempest-server-test-2078215968',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-2078215968',id=39,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDY4V6+HgiJjlBp+sismnH//xIhvaHIyZ883PC4cnliN7XgVilpp1kBxLjUHPWqW57B2esyV5Ub1B2t4CQ7Ool6UJEN5MlISleL6j0D4sAsLrzbr7gNivUWbODLUMAl3Sw==',key_name='tempest-keypair-test-1986375941',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5d0c78b7cd584e4a90592d8ea01ce4ad',ramdisk_id='',reservation_id='r-o7dkh8qr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NetworkDefaultSecGroupTest-1876093813',owner_user_name='tempest-NetworkDefaultSecGroupTest-1876093813-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:50:45Z,user_data=None,user_id='988ebc31182f4c94813f94306e399a2d',uuid=70d16f06-3d0a-454f-a1dd-87ce77ed8582,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fb4faeb1-e827-462b-8f21-36892b978052", "address": "fa:16:3e:c5:58:7b", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb4faeb1-e8", "ovs_interfaceid": "fb4faeb1-e827-462b-8f21-36892b978052", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.300 183087 DEBUG nova.network.os_vif_util [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Converting VIF {"id": "fb4faeb1-e827-462b-8f21-36892b978052", "address": "fa:16:3e:c5:58:7b", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb4faeb1-e8", "ovs_interfaceid": "fb4faeb1-e827-462b-8f21-36892b978052", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.301 183087 DEBUG nova.network.os_vif_util [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c5:58:7b,bridge_name='br-int',has_traffic_filtering=True,id=fb4faeb1-e827-462b-8f21-36892b978052,network=Network(ce3cd186-bdaf-40d4-a276-e9139fe3dfec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb4faeb1-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.302 183087 DEBUG nova.objects.instance [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lazy-loading 'pci_devices' on Instance uuid 70d16f06-3d0a-454f-a1dd-87ce77ed8582 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.553 183087 DEBUG nova.virt.libvirt.driver [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] End _get_guest_xml xml=<domain type="kvm">
Jan 26 08:50:52 compute-1 nova_compute[183083]:   <uuid>70d16f06-3d0a-454f-a1dd-87ce77ed8582</uuid>
Jan 26 08:50:52 compute-1 nova_compute[183083]:   <name>instance-00000027</name>
Jan 26 08:50:52 compute-1 nova_compute[183083]:   <memory>131072</memory>
Jan 26 08:50:52 compute-1 nova_compute[183083]:   <vcpu>1</vcpu>
Jan 26 08:50:52 compute-1 nova_compute[183083]:   <metadata>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 08:50:52 compute-1 nova_compute[183083]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:       <nova:name>tempest-server-test-2078215968</nova:name>
Jan 26 08:50:52 compute-1 nova_compute[183083]:       <nova:creationTime>2026-01-26 08:50:52</nova:creationTime>
Jan 26 08:50:52 compute-1 nova_compute[183083]:       <nova:flavor name="m1.nano">
Jan 26 08:50:52 compute-1 nova_compute[183083]:         <nova:memory>128</nova:memory>
Jan 26 08:50:52 compute-1 nova_compute[183083]:         <nova:disk>1</nova:disk>
Jan 26 08:50:52 compute-1 nova_compute[183083]:         <nova:swap>0</nova:swap>
Jan 26 08:50:52 compute-1 nova_compute[183083]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 08:50:52 compute-1 nova_compute[183083]:         <nova:vcpus>1</nova:vcpus>
Jan 26 08:50:52 compute-1 nova_compute[183083]:       </nova:flavor>
Jan 26 08:50:52 compute-1 nova_compute[183083]:       <nova:owner>
Jan 26 08:50:52 compute-1 nova_compute[183083]:         <nova:user uuid="988ebc31182f4c94813f94306e399a2d">tempest-NetworkDefaultSecGroupTest-1876093813-project-member</nova:user>
Jan 26 08:50:52 compute-1 nova_compute[183083]:         <nova:project uuid="5d0c78b7cd584e4a90592d8ea01ce4ad">tempest-NetworkDefaultSecGroupTest-1876093813</nova:project>
Jan 26 08:50:52 compute-1 nova_compute[183083]:       </nova:owner>
Jan 26 08:50:52 compute-1 nova_compute[183083]:       <nova:root type="image" uuid="13d1a20a-8003-4f19-aba7-ccbd9eff9b82"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:       <nova:ports>
Jan 26 08:50:52 compute-1 nova_compute[183083]:         <nova:port uuid="fb4faeb1-e827-462b-8f21-36892b978052">
Jan 26 08:50:52 compute-1 nova_compute[183083]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:         </nova:port>
Jan 26 08:50:52 compute-1 nova_compute[183083]:       </nova:ports>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     </nova:instance>
Jan 26 08:50:52 compute-1 nova_compute[183083]:   </metadata>
Jan 26 08:50:52 compute-1 nova_compute[183083]:   <sysinfo type="smbios">
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <system>
Jan 26 08:50:52 compute-1 nova_compute[183083]:       <entry name="manufacturer">RDO</entry>
Jan 26 08:50:52 compute-1 nova_compute[183083]:       <entry name="product">OpenStack Compute</entry>
Jan 26 08:50:52 compute-1 nova_compute[183083]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 08:50:52 compute-1 nova_compute[183083]:       <entry name="serial">70d16f06-3d0a-454f-a1dd-87ce77ed8582</entry>
Jan 26 08:50:52 compute-1 nova_compute[183083]:       <entry name="uuid">70d16f06-3d0a-454f-a1dd-87ce77ed8582</entry>
Jan 26 08:50:52 compute-1 nova_compute[183083]:       <entry name="family">Virtual Machine</entry>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     </system>
Jan 26 08:50:52 compute-1 nova_compute[183083]:   </sysinfo>
Jan 26 08:50:52 compute-1 nova_compute[183083]:   <os>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <boot dev="hd"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <smbios mode="sysinfo"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:   </os>
Jan 26 08:50:52 compute-1 nova_compute[183083]:   <features>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <acpi/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <apic/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <vmcoreinfo/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:   </features>
Jan 26 08:50:52 compute-1 nova_compute[183083]:   <clock offset="utc">
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <timer name="hpet" present="no"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:   </clock>
Jan 26 08:50:52 compute-1 nova_compute[183083]:   <cpu mode="host-model" match="exact">
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:   </cpu>
Jan 26 08:50:52 compute-1 nova_compute[183083]:   <devices>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <disk type="file" device="disk">
Jan 26 08:50:52 compute-1 nova_compute[183083]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:       <target dev="vda" bus="virtio"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     </disk>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <disk type="file" device="cdrom">
Jan 26 08:50:52 compute-1 nova_compute[183083]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk.config"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:       <target dev="sda" bus="sata"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     </disk>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <interface type="ethernet">
Jan 26 08:50:52 compute-1 nova_compute[183083]:       <mac address="fa:16:3e:c5:58:7b"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:       <mtu size="1342"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:       <target dev="tapfb4faeb1-e8"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     </interface>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <serial type="pty">
Jan 26 08:50:52 compute-1 nova_compute[183083]:       <log file="/var/lib/nova/instances/70d16f06-3d0a-454f-a1dd-87ce77ed8582/console.log" append="off"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     </serial>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <video>
Jan 26 08:50:52 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     </video>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <input type="tablet" bus="usb"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <rng model="virtio">
Jan 26 08:50:52 compute-1 nova_compute[183083]:       <backend model="random">/dev/urandom</backend>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     </rng>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <controller type="usb" index="0"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     <memballoon model="virtio">
Jan 26 08:50:52 compute-1 nova_compute[183083]:       <stats period="10"/>
Jan 26 08:50:52 compute-1 nova_compute[183083]:     </memballoon>
Jan 26 08:50:52 compute-1 nova_compute[183083]:   </devices>
Jan 26 08:50:52 compute-1 nova_compute[183083]: </domain>
Jan 26 08:50:52 compute-1 nova_compute[183083]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.555 183087 DEBUG nova.compute.manager [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Preparing to wait for external event network-vif-plugged-fb4faeb1-e827-462b-8f21-36892b978052 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.556 183087 DEBUG oslo_concurrency.lockutils [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "70d16f06-3d0a-454f-a1dd-87ce77ed8582-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.556 183087 DEBUG oslo_concurrency.lockutils [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "70d16f06-3d0a-454f-a1dd-87ce77ed8582-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.557 183087 DEBUG oslo_concurrency.lockutils [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "70d16f06-3d0a-454f-a1dd-87ce77ed8582-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.558 183087 DEBUG nova.virt.libvirt.vif [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:50:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-2078215968',display_name='tempest-server-test-2078215968',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-2078215968',id=39,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDY4V6+HgiJjlBp+sismnH//xIhvaHIyZ883PC4cnliN7XgVilpp1kBxLjUHPWqW57B2esyV5Ub1B2t4CQ7Ool6UJEN5MlISleL6j0D4sAsLrzbr7gNivUWbODLUMAl3Sw==',key_name='tempest-keypair-test-1986375941',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5d0c78b7cd584e4a90592d8ea01ce4ad',ramdisk_id='',reservation_id='r-o7dkh8qr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_
rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NetworkDefaultSecGroupTest-1876093813',owner_user_name='tempest-NetworkDefaultSecGroupTest-1876093813-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:50:45Z,user_data=None,user_id='988ebc31182f4c94813f94306e399a2d',uuid=70d16f06-3d0a-454f-a1dd-87ce77ed8582,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fb4faeb1-e827-462b-8f21-36892b978052", "address": "fa:16:3e:c5:58:7b", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb4faeb1-e8", "ovs_interfaceid": "fb4faeb1-e827-462b-8f21-36892b978052", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.559 183087 DEBUG nova.network.os_vif_util [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Converting VIF {"id": "fb4faeb1-e827-462b-8f21-36892b978052", "address": "fa:16:3e:c5:58:7b", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb4faeb1-e8", "ovs_interfaceid": "fb4faeb1-e827-462b-8f21-36892b978052", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.560 183087 DEBUG nova.network.os_vif_util [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c5:58:7b,bridge_name='br-int',has_traffic_filtering=True,id=fb4faeb1-e827-462b-8f21-36892b978052,network=Network(ce3cd186-bdaf-40d4-a276-e9139fe3dfec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb4faeb1-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.560 183087 DEBUG os_vif [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c5:58:7b,bridge_name='br-int',has_traffic_filtering=True,id=fb4faeb1-e827-462b-8f21-36892b978052,network=Network(ce3cd186-bdaf-40d4-a276-e9139fe3dfec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb4faeb1-e8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.561 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.562 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.563 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:50:52 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:52.564 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.565 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:52 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:52.567 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.567 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.567 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfb4faeb1-e8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.567 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfb4faeb1-e8, col_values=(('external_ids', {'iface-id': 'fb4faeb1-e827-462b-8f21-36892b978052', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c5:58:7b', 'vm-uuid': '70d16f06-3d0a-454f-a1dd-87ce77ed8582'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.569 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.571 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 08:50:52 compute-1 NetworkManager[55451]: <info>  [1769417452.5704] manager: (tapfb4faeb1-e8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.578 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.579 183087 INFO os_vif [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c5:58:7b,bridge_name='br-int',has_traffic_filtering=True,id=fb4faeb1-e827-462b-8f21-36892b978052,network=Network(ce3cd186-bdaf-40d4-a276-e9139fe3dfec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb4faeb1-e8')
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.636 183087 DEBUG nova.virt.libvirt.driver [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.637 183087 DEBUG nova.virt.libvirt.driver [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.637 183087 DEBUG nova.virt.libvirt.driver [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] No VIF found with MAC fa:16:3e:c5:58:7b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 26 08:50:52 compute-1 nova_compute[183083]: 2026-01-26 08:50:52.637 183087 INFO nova.virt.libvirt.driver [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Using config drive
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.173 183087 INFO nova.virt.libvirt.driver [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Creating config drive at /var/lib/nova/instances/70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk.config
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.179 183087 DEBUG oslo_concurrency.processutils [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprp9i31my execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.306 183087 DEBUG oslo_concurrency.processutils [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprp9i31my" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:50:53 compute-1 kernel: tapfb4faeb1-e8: entered promiscuous mode
Jan 26 08:50:53 compute-1 NetworkManager[55451]: <info>  [1769417453.3830] manager: (tapfb4faeb1-e8): new Tun device (/org/freedesktop/NetworkManager/Devices/68)
Jan 26 08:50:53 compute-1 ovn_controller[95352]: 2026-01-26T08:50:53Z|00184|binding|INFO|Claiming lport fb4faeb1-e827-462b-8f21-36892b978052 for this chassis.
Jan 26 08:50:53 compute-1 ovn_controller[95352]: 2026-01-26T08:50:53Z|00185|binding|INFO|fb4faeb1-e827-462b-8f21-36892b978052: Claiming fa:16:3e:c5:58:7b 10.100.0.6
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.385 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:53.402 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c5:58:7b 10.100.0.6'], port_security=['fa:16:3e:c5:58:7b 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ce3cd186-bdaf-40d4-a276-e9139fe3dfec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cf9da6cb-556c-4ee2-9313-239643d15cb1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=97b37269-bb63-498a-80e2-0f1154aea97c, chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=fb4faeb1-e827-462b-8f21-36892b978052) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:53.404 104632 INFO neutron.agent.ovn.metadata.agent [-] Port fb4faeb1-e827-462b-8f21-36892b978052 in datapath ce3cd186-bdaf-40d4-a276-e9139fe3dfec bound to our chassis
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:53.406 104632 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ce3cd186-bdaf-40d4-a276-e9139fe3dfec
Jan 26 08:50:53 compute-1 ovn_controller[95352]: 2026-01-26T08:50:53Z|00186|binding|INFO|Setting lport fb4faeb1-e827-462b-8f21-36892b978052 up in Southbound
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.410 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:53 compute-1 ovn_controller[95352]: 2026-01-26T08:50:53Z|00187|binding|INFO|Setting lport fb4faeb1-e827-462b-8f21-36892b978052 ovn-installed in OVS
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.411 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.416 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:53.421 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[6cf29983-99a6-45c0-81ec-26fb407efe2f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:53.422 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapce3cd186-b1 in ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:53.425 212483 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapce3cd186-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:53.425 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[b0213d62-13ce-49e3-be29-6fb866fb426a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:53.426 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[08c22c86-512f-474c-8b36-5e6d79095d56]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:53 compute-1 systemd-udevd[216233]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:53.444 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[b18b51b4-174a-421d-b01a-1899c2d277ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:53 compute-1 systemd-machined[154360]: New machine qemu-10-instance-00000027.
Jan 26 08:50:53 compute-1 NetworkManager[55451]: <info>  [1769417453.4531] device (tapfb4faeb1-e8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 08:50:53 compute-1 NetworkManager[55451]: <info>  [1769417453.4538] device (tapfb4faeb1-e8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:53.463 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[be964ff6-6230-4dda-9a6b-863a9065bb11]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:53 compute-1 systemd[1]: Started Virtual Machine qemu-10-instance-00000027.
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:53.502 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[b0e96f6d-a6bd-4a4f-bd7a-d32ca2ae6826]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:53 compute-1 NetworkManager[55451]: <info>  [1769417453.5111] manager: (tapce3cd186-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/69)
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:53.510 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[31a716a3-4dcb-4db6-99ec-5df902357aca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:53.548 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[429a72e0-5714-4ac8-a433-40a360d712f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:53.552 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[12c4727a-af88-41b7-93e6-42b48ab2fcbd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:53 compute-1 NetworkManager[55451]: <info>  [1769417453.5789] device (tapce3cd186-b0): carrier: link connected
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:53.588 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[ce19d460-5bd5-4d77-a6e8-f2da51471042]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:53.617 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[4f60fbb1-ca02-4642-bd5c-424c4780423a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapce3cd186-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:90:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 379418, 'reachable_time': 30482, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216266, 'error': None, 'target': 'ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:53.639 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[da8bf3d5-f3e3-493a-92cc-146ea190a275]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb5:9011'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 379418, 'tstamp': 379418}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216269, 'error': None, 'target': 'ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:53.666 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[2e146c60-569b-45ae-af0d-ee12e5a85005]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapce3cd186-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:90:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 379418, 'reachable_time': 30482, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216274, 'error': None, 'target': 'ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.676 183087 DEBUG nova.compute.manager [req-0acfb900-b1a2-45d0-b5e4-2d86453f483e req-96643853-f169-443c-a6a1-9efcfca48035 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Received event network-vif-plugged-fb4faeb1-e827-462b-8f21-36892b978052 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.676 183087 DEBUG oslo_concurrency.lockutils [req-0acfb900-b1a2-45d0-b5e4-2d86453f483e req-96643853-f169-443c-a6a1-9efcfca48035 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "70d16f06-3d0a-454f-a1dd-87ce77ed8582-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.676 183087 DEBUG oslo_concurrency.lockutils [req-0acfb900-b1a2-45d0-b5e4-2d86453f483e req-96643853-f169-443c-a6a1-9efcfca48035 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "70d16f06-3d0a-454f-a1dd-87ce77ed8582-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.677 183087 DEBUG oslo_concurrency.lockutils [req-0acfb900-b1a2-45d0-b5e4-2d86453f483e req-96643853-f169-443c-a6a1-9efcfca48035 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "70d16f06-3d0a-454f-a1dd-87ce77ed8582-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.677 183087 DEBUG nova.compute.manager [req-0acfb900-b1a2-45d0-b5e4-2d86453f483e req-96643853-f169-443c-a6a1-9efcfca48035 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Processing event network-vif-plugged-fb4faeb1-e827-462b-8f21-36892b978052 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:53.706 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[45464cd7-ed21-4bb7-a620-fc278cd9107c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.730 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417453.7302313, 70d16f06-3d0a-454f-a1dd-87ce77ed8582 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.731 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] VM Started (Lifecycle Event)
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.735 183087 DEBUG nova.compute.manager [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.739 183087 DEBUG nova.virt.libvirt.driver [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.745 183087 INFO nova.virt.libvirt.driver [-] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Instance spawned successfully.
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.746 183087 DEBUG nova.virt.libvirt.driver [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.766 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.770 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.780 183087 DEBUG nova.virt.libvirt.driver [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.780 183087 DEBUG nova.virt.libvirt.driver [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.781 183087 DEBUG nova.virt.libvirt.driver [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.781 183087 DEBUG nova.virt.libvirt.driver [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.781 183087 DEBUG nova.virt.libvirt.driver [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.782 183087 DEBUG nova.virt.libvirt.driver [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.791 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.792 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417453.731298, 70d16f06-3d0a-454f-a1dd-87ce77ed8582 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.792 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] VM Paused (Lifecycle Event)
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:53.794 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[e7aba8a7-e8f2-4c82-b289-4408e7a2b8bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:53.795 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapce3cd186-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:53.795 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:53.796 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapce3cd186-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:50:53 compute-1 NetworkManager[55451]: <info>  [1769417453.8308] manager: (tapce3cd186-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Jan 26 08:50:53 compute-1 kernel: tapce3cd186-b0: entered promiscuous mode
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.834 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.834 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:53.835 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapce3cd186-b0, col_values=(('external_ids', {'iface-id': '597f7eff-a379-4a5d-bffc-0e294ae89e61'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:50:53 compute-1 ovn_controller[95352]: 2026-01-26T08:50:53Z|00188|binding|INFO|Releasing lport 597f7eff-a379-4a5d-bffc-0e294ae89e61 from this chassis (sb_readonly=0)
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.841 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417453.7384806, 70d16f06-3d0a-454f-a1dd-87ce77ed8582 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.841 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] VM Resumed (Lifecycle Event)
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.854 183087 INFO nova.compute.manager [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Took 8.68 seconds to spawn the instance on the hypervisor.
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.855 183087 DEBUG nova.compute.manager [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.856 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:53.857 104632 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ce3cd186-bdaf-40d4-a276-e9139fe3dfec.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ce3cd186-bdaf-40d4-a276-e9139fe3dfec.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:53.858 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[653e0b9a-8811-40a8-bc7c-da142221bc73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:53.859 104632 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]: global
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]:     log         /dev/log local0 debug
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]:     log-tag     haproxy-metadata-proxy-ce3cd186-bdaf-40d4-a276-e9139fe3dfec
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]:     user        root
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]:     group       root
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]:     maxconn     1024
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]:     pidfile     /var/lib/neutron/external/pids/ce3cd186-bdaf-40d4-a276-e9139fe3dfec.pid.haproxy
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]:     daemon
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]: defaults
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]:     log global
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]:     mode http
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]:     option httplog
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]:     option dontlognull
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]:     option http-server-close
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]:     option forwardfor
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]:     retries                 3
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]:     timeout http-request    30s
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]:     timeout connect         30s
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]:     timeout client          32s
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]:     timeout server          32s
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]:     timeout http-keep-alive 30s
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]: listen listener
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]:     bind 169.254.169.254:80
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]:     http-request add-header X-OVN-Network-ID ce3cd186-bdaf-40d4-a276-e9139fe3dfec
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 26 08:50:53 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:53.859 104632 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec', 'env', 'PROCESS_TAG=haproxy-ce3cd186-bdaf-40d4-a276-e9139fe3dfec', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ce3cd186-bdaf-40d4-a276-e9139fe3dfec.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.864 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.867 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.920 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.939 183087 INFO nova.compute.manager [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Took 9.17 seconds to build instance.
Jan 26 08:50:53 compute-1 nova_compute[183083]: 2026-01-26 08:50:53.962 183087 DEBUG oslo_concurrency.lockutils [None req-baba6084-88f5-4162-b5d8-83d0f52e12b4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "70d16f06-3d0a-454f-a1dd-87ce77ed8582" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.315s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:50:54 compute-1 nova_compute[183083]: 2026-01-26 08:50:54.225 183087 DEBUG nova.network.neutron [req-09f64d1d-80d4-40c7-a0bc-f12f87868921 req-be730314-15b2-404d-ba32-3a3ae540eefe 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Updated VIF entry in instance network info cache for port fb4faeb1-e827-462b-8f21-36892b978052. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:50:54 compute-1 nova_compute[183083]: 2026-01-26 08:50:54.225 183087 DEBUG nova.network.neutron [req-09f64d1d-80d4-40c7-a0bc-f12f87868921 req-be730314-15b2-404d-ba32-3a3ae540eefe 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Updating instance_info_cache with network_info: [{"id": "fb4faeb1-e827-462b-8f21-36892b978052", "address": "fa:16:3e:c5:58:7b", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb4faeb1-e8", "ovs_interfaceid": "fb4faeb1-e827-462b-8f21-36892b978052", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:50:54 compute-1 nova_compute[183083]: 2026-01-26 08:50:54.243 183087 DEBUG oslo_concurrency.lockutils [req-09f64d1d-80d4-40c7-a0bc-f12f87868921 req-be730314-15b2-404d-ba32-3a3ae540eefe 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-70d16f06-3d0a-454f-a1dd-87ce77ed8582" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:50:54 compute-1 podman[216305]: 2026-01-26 08:50:54.309567181 +0000 UTC m=+0.088935435 container create 2adedd02399f28e01ae990c221c197c77873a6fca76921bc936e6a053ea62ffe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 26 08:50:54 compute-1 podman[216305]: 2026-01-26 08:50:54.266937605 +0000 UTC m=+0.046305909 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 08:50:54 compute-1 systemd[1]: Started libpod-conmon-2adedd02399f28e01ae990c221c197c77873a6fca76921bc936e6a053ea62ffe.scope.
Jan 26 08:50:54 compute-1 systemd[1]: Started libcrun container.
Jan 26 08:50:54 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b18b92d931024c0e41f28df4d171744df1d0481ca524aacd5261335c74f4e9d7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 08:50:54 compute-1 podman[216305]: 2026-01-26 08:50:54.421287025 +0000 UTC m=+0.200655269 container init 2adedd02399f28e01ae990c221c197c77873a6fca76921bc936e6a053ea62ffe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 26 08:50:54 compute-1 podman[216305]: 2026-01-26 08:50:54.435259077 +0000 UTC m=+0.214627291 container start 2adedd02399f28e01ae990c221c197c77873a6fca76921bc936e6a053ea62ffe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 08:50:54 compute-1 neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec[216320]: [NOTICE]   (216324) : New worker (216326) forked
Jan 26 08:50:54 compute-1 neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec[216320]: [NOTICE]   (216324) : Loading success.
Jan 26 08:50:55 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:50:55.569 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:50:55 compute-1 nova_compute[183083]: 2026-01-26 08:50:55.713 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:50:55 compute-1 nova_compute[183083]: 2026-01-26 08:50:55.769 183087 DEBUG nova.compute.manager [req-3177aa12-797e-4c99-b526-94c95d1dc861 req-5dfadd54-3156-4967-8728-3bbe7194d300 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Received event network-vif-plugged-fb4faeb1-e827-462b-8f21-36892b978052 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:50:55 compute-1 nova_compute[183083]: 2026-01-26 08:50:55.769 183087 DEBUG oslo_concurrency.lockutils [req-3177aa12-797e-4c99-b526-94c95d1dc861 req-5dfadd54-3156-4967-8728-3bbe7194d300 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "70d16f06-3d0a-454f-a1dd-87ce77ed8582-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:50:55 compute-1 nova_compute[183083]: 2026-01-26 08:50:55.770 183087 DEBUG oslo_concurrency.lockutils [req-3177aa12-797e-4c99-b526-94c95d1dc861 req-5dfadd54-3156-4967-8728-3bbe7194d300 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "70d16f06-3d0a-454f-a1dd-87ce77ed8582-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:50:55 compute-1 nova_compute[183083]: 2026-01-26 08:50:55.770 183087 DEBUG oslo_concurrency.lockutils [req-3177aa12-797e-4c99-b526-94c95d1dc861 req-5dfadd54-3156-4967-8728-3bbe7194d300 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "70d16f06-3d0a-454f-a1dd-87ce77ed8582-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:50:55 compute-1 nova_compute[183083]: 2026-01-26 08:50:55.770 183087 DEBUG nova.compute.manager [req-3177aa12-797e-4c99-b526-94c95d1dc861 req-5dfadd54-3156-4967-8728-3bbe7194d300 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] No waiting events found dispatching network-vif-plugged-fb4faeb1-e827-462b-8f21-36892b978052 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:50:55 compute-1 nova_compute[183083]: 2026-01-26 08:50:55.770 183087 WARNING nova.compute.manager [req-3177aa12-797e-4c99-b526-94c95d1dc861 req-5dfadd54-3156-4967-8728-3bbe7194d300 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Received unexpected event network-vif-plugged-fb4faeb1-e827-462b-8f21-36892b978052 for instance with vm_state active and task_state None.
Jan 26 08:50:55 compute-1 podman[216336]: 2026-01-26 08:50:55.82580649 +0000 UTC m=+0.081576549 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_id=openstack_network_exporter, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 26 08:50:55 compute-1 podman[216335]: 2026-01-26 08:50:55.844479474 +0000 UTC m=+0.100766128 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 26 08:50:56 compute-1 nova_compute[183083]: 2026-01-26 08:50:56.446 183087 INFO nova.compute.manager [None req-aec031d4-466a-4ad0-bb9f-421b1c2713eb 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Get console output
Jan 26 08:50:56 compute-1 nova_compute[183083]: 2026-01-26 08:50:56.454 212375 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 26 08:50:57 compute-1 nova_compute[183083]: 2026-01-26 08:50:57.571 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:00 compute-1 nova_compute[183083]: 2026-01-26 08:51:00.715 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:00 compute-1 nova_compute[183083]: 2026-01-26 08:51:00.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:51:01 compute-1 nova_compute[183083]: 2026-01-26 08:51:01.599 183087 INFO nova.compute.manager [None req-31025034-6ffc-49c4-a18c-747bedd118ea 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Get console output
Jan 26 08:51:01 compute-1 podman[216372]: 2026-01-26 08:51:01.817346993 +0000 UTC m=+0.076963810 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 26 08:51:01 compute-1 podman[216373]: 2026-01-26 08:51:01.82650747 +0000 UTC m=+0.081656541 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 08:51:01 compute-1 podman[216371]: 2026-01-26 08:51:01.851926643 +0000 UTC m=+0.115163921 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 08:51:01 compute-1 nova_compute[183083]: 2026-01-26 08:51:01.946 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:51:01 compute-1 nova_compute[183083]: 2026-01-26 08:51:01.969 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:51:02 compute-1 nova_compute[183083]: 2026-01-26 08:51:02.629 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:02 compute-1 nova_compute[183083]: 2026-01-26 08:51:02.958 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:51:02 compute-1 nova_compute[183083]: 2026-01-26 08:51:02.959 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:51:02 compute-1 nova_compute[183083]: 2026-01-26 08:51:02.959 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 08:51:02 compute-1 nova_compute[183083]: 2026-01-26 08:51:02.959 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 08:51:03 compute-1 nova_compute[183083]: 2026-01-26 08:51:03.160 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "refresh_cache-70d16f06-3d0a-454f-a1dd-87ce77ed8582" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:51:03 compute-1 nova_compute[183083]: 2026-01-26 08:51:03.161 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquired lock "refresh_cache-70d16f06-3d0a-454f-a1dd-87ce77ed8582" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:51:03 compute-1 nova_compute[183083]: 2026-01-26 08:51:03.161 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 26 08:51:03 compute-1 nova_compute[183083]: 2026-01-26 08:51:03.162 183087 DEBUG nova.objects.instance [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 70d16f06-3d0a-454f-a1dd-87ce77ed8582 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:51:04 compute-1 nova_compute[183083]: 2026-01-26 08:51:04.037 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Updating instance_info_cache with network_info: [{"id": "fb4faeb1-e827-462b-8f21-36892b978052", "address": "fa:16:3e:c5:58:7b", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb4faeb1-e8", "ovs_interfaceid": "fb4faeb1-e827-462b-8f21-36892b978052", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:51:04 compute-1 nova_compute[183083]: 2026-01-26 08:51:04.057 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Releasing lock "refresh_cache-70d16f06-3d0a-454f-a1dd-87ce77ed8582" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:51:04 compute-1 nova_compute[183083]: 2026-01-26 08:51:04.057 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 26 08:51:04 compute-1 nova_compute[183083]: 2026-01-26 08:51:04.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:51:04 compute-1 nova_compute[183083]: 2026-01-26 08:51:04.953 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:51:04 compute-1 nova_compute[183083]: 2026-01-26 08:51:04.980 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:51:04 compute-1 nova_compute[183083]: 2026-01-26 08:51:04.981 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:51:04 compute-1 nova_compute[183083]: 2026-01-26 08:51:04.981 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:51:04 compute-1 nova_compute[183083]: 2026-01-26 08:51:04.982 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 08:51:05 compute-1 nova_compute[183083]: 2026-01-26 08:51:05.066 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:51:05 compute-1 nova_compute[183083]: 2026-01-26 08:51:05.159 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:51:05 compute-1 nova_compute[183083]: 2026-01-26 08:51:05.161 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:51:05 compute-1 nova_compute[183083]: 2026-01-26 08:51:05.216 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:51:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:51:05.303 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:51:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:51:05.303 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:51:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:51:05.304 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:51:05 compute-1 nova_compute[183083]: 2026-01-26 08:51:05.366 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:51:05 compute-1 nova_compute[183083]: 2026-01-26 08:51:05.368 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13563MB free_disk=113.06998062133789GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 08:51:05 compute-1 nova_compute[183083]: 2026-01-26 08:51:05.368 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:51:05 compute-1 nova_compute[183083]: 2026-01-26 08:51:05.368 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:51:05 compute-1 nova_compute[183083]: 2026-01-26 08:51:05.431 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Instance 70d16f06-3d0a-454f-a1dd-87ce77ed8582 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 08:51:05 compute-1 nova_compute[183083]: 2026-01-26 08:51:05.432 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 08:51:05 compute-1 nova_compute[183083]: 2026-01-26 08:51:05.432 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=640MB phys_disk=119GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 08:51:05 compute-1 nova_compute[183083]: 2026-01-26 08:51:05.470 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:51:05 compute-1 nova_compute[183083]: 2026-01-26 08:51:05.482 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:51:05 compute-1 nova_compute[183083]: 2026-01-26 08:51:05.499 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 08:51:05 compute-1 nova_compute[183083]: 2026-01-26 08:51:05.499 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:51:05 compute-1 nova_compute[183083]: 2026-01-26 08:51:05.717 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:05 compute-1 ovn_controller[95352]: 2026-01-26T08:51:05Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c5:58:7b 10.100.0.6
Jan 26 08:51:05 compute-1 ovn_controller[95352]: 2026-01-26T08:51:05Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c5:58:7b 10.100.0.6
Jan 26 08:51:06 compute-1 nova_compute[183083]: 2026-01-26 08:51:06.499 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:51:06 compute-1 nova_compute[183083]: 2026-01-26 08:51:06.723 183087 INFO nova.compute.manager [None req-d248ff62-0229-4b65-bb2d-dc266a15ec90 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Get console output
Jan 26 08:51:06 compute-1 nova_compute[183083]: 2026-01-26 08:51:06.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:51:06 compute-1 nova_compute[183083]: 2026-01-26 08:51:06.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 08:51:07 compute-1 nova_compute[183083]: 2026-01-26 08:51:07.674 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:07 compute-1 nova_compute[183083]: 2026-01-26 08:51:07.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:51:10 compute-1 nova_compute[183083]: 2026-01-26 08:51:10.720 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:11 compute-1 nova_compute[183083]: 2026-01-26 08:51:11.397 183087 DEBUG oslo_concurrency.lockutils [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "eefbc638-e91a-432b-84c3-2448060d96db" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:51:11 compute-1 nova_compute[183083]: 2026-01-26 08:51:11.399 183087 DEBUG oslo_concurrency.lockutils [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "eefbc638-e91a-432b-84c3-2448060d96db" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:51:11 compute-1 nova_compute[183083]: 2026-01-26 08:51:11.414 183087 DEBUG nova.compute.manager [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:51:11 compute-1 nova_compute[183083]: 2026-01-26 08:51:11.481 183087 DEBUG oslo_concurrency.lockutils [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:51:11 compute-1 nova_compute[183083]: 2026-01-26 08:51:11.482 183087 DEBUG oslo_concurrency.lockutils [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:51:11 compute-1 nova_compute[183083]: 2026-01-26 08:51:11.491 183087 DEBUG nova.virt.hardware [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:51:11 compute-1 nova_compute[183083]: 2026-01-26 08:51:11.491 183087 INFO nova.compute.claims [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:51:11 compute-1 nova_compute[183083]: 2026-01-26 08:51:11.652 183087 DEBUG nova.compute.provider_tree [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:51:11 compute-1 nova_compute[183083]: 2026-01-26 08:51:11.667 183087 DEBUG nova.scheduler.client.report [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:51:11 compute-1 nova_compute[183083]: 2026-01-26 08:51:11.691 183087 DEBUG oslo_concurrency.lockutils [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.209s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:51:11 compute-1 nova_compute[183083]: 2026-01-26 08:51:11.692 183087 DEBUG nova.compute.manager [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:51:11 compute-1 nova_compute[183083]: 2026-01-26 08:51:11.735 183087 DEBUG nova.compute.manager [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:51:11 compute-1 nova_compute[183083]: 2026-01-26 08:51:11.736 183087 DEBUG nova.network.neutron [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:51:11 compute-1 nova_compute[183083]: 2026-01-26 08:51:11.755 183087 INFO nova.virt.libvirt.driver [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:51:11 compute-1 nova_compute[183083]: 2026-01-26 08:51:11.774 183087 DEBUG nova.compute.manager [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:51:11 compute-1 podman[216458]: 2026-01-26 08:51:11.824054071 +0000 UTC m=+0.079085144 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 26 08:51:11 compute-1 nova_compute[183083]: 2026-01-26 08:51:11.867 183087 DEBUG nova.compute.manager [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:51:11 compute-1 nova_compute[183083]: 2026-01-26 08:51:11.869 183087 DEBUG nova.virt.libvirt.driver [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:51:11 compute-1 nova_compute[183083]: 2026-01-26 08:51:11.870 183087 INFO nova.virt.libvirt.driver [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Creating image(s)
Jan 26 08:51:11 compute-1 nova_compute[183083]: 2026-01-26 08:51:11.871 183087 DEBUG oslo_concurrency.lockutils [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "/var/lib/nova/instances/eefbc638-e91a-432b-84c3-2448060d96db/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:51:11 compute-1 nova_compute[183083]: 2026-01-26 08:51:11.871 183087 DEBUG oslo_concurrency.lockutils [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "/var/lib/nova/instances/eefbc638-e91a-432b-84c3-2448060d96db/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:51:11 compute-1 nova_compute[183083]: 2026-01-26 08:51:11.872 183087 DEBUG oslo_concurrency.lockutils [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "/var/lib/nova/instances/eefbc638-e91a-432b-84c3-2448060d96db/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:51:11 compute-1 nova_compute[183083]: 2026-01-26 08:51:11.897 183087 DEBUG oslo_concurrency.processutils [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:51:11 compute-1 nova_compute[183083]: 2026-01-26 08:51:11.990 183087 DEBUG oslo_concurrency.processutils [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:51:11 compute-1 nova_compute[183083]: 2026-01-26 08:51:11.992 183087 DEBUG oslo_concurrency.lockutils [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:51:11 compute-1 nova_compute[183083]: 2026-01-26 08:51:11.993 183087 DEBUG oslo_concurrency.lockutils [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:51:12 compute-1 nova_compute[183083]: 2026-01-26 08:51:12.016 183087 DEBUG oslo_concurrency.processutils [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:51:12 compute-1 nova_compute[183083]: 2026-01-26 08:51:12.088 183087 DEBUG oslo_concurrency.processutils [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:51:12 compute-1 nova_compute[183083]: 2026-01-26 08:51:12.090 183087 DEBUG oslo_concurrency.processutils [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/eefbc638-e91a-432b-84c3-2448060d96db/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:51:12 compute-1 nova_compute[183083]: 2026-01-26 08:51:12.152 183087 DEBUG oslo_concurrency.processutils [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/eefbc638-e91a-432b-84c3-2448060d96db/disk 1073741824" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:51:12 compute-1 nova_compute[183083]: 2026-01-26 08:51:12.154 183087 DEBUG oslo_concurrency.lockutils [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:51:12 compute-1 nova_compute[183083]: 2026-01-26 08:51:12.155 183087 DEBUG oslo_concurrency.processutils [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:51:12 compute-1 nova_compute[183083]: 2026-01-26 08:51:12.185 183087 DEBUG nova.policy [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:51:12 compute-1 nova_compute[183083]: 2026-01-26 08:51:12.240 183087 DEBUG oslo_concurrency.processutils [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:51:12 compute-1 nova_compute[183083]: 2026-01-26 08:51:12.242 183087 DEBUG nova.virt.disk.api [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Checking if we can resize image /var/lib/nova/instances/eefbc638-e91a-432b-84c3-2448060d96db/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 26 08:51:12 compute-1 nova_compute[183083]: 2026-01-26 08:51:12.242 183087 DEBUG oslo_concurrency.processutils [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eefbc638-e91a-432b-84c3-2448060d96db/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:51:12 compute-1 nova_compute[183083]: 2026-01-26 08:51:12.307 183087 DEBUG oslo_concurrency.processutils [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eefbc638-e91a-432b-84c3-2448060d96db/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:51:12 compute-1 nova_compute[183083]: 2026-01-26 08:51:12.309 183087 DEBUG nova.virt.disk.api [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Cannot resize image /var/lib/nova/instances/eefbc638-e91a-432b-84c3-2448060d96db/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 26 08:51:12 compute-1 nova_compute[183083]: 2026-01-26 08:51:12.309 183087 DEBUG nova.objects.instance [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lazy-loading 'migration_context' on Instance uuid eefbc638-e91a-432b-84c3-2448060d96db obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:51:12 compute-1 nova_compute[183083]: 2026-01-26 08:51:12.333 183087 DEBUG nova.virt.libvirt.driver [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 26 08:51:12 compute-1 nova_compute[183083]: 2026-01-26 08:51:12.334 183087 DEBUG nova.virt.libvirt.driver [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Ensure instance console log exists: /var/lib/nova/instances/eefbc638-e91a-432b-84c3-2448060d96db/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 26 08:51:12 compute-1 nova_compute[183083]: 2026-01-26 08:51:12.334 183087 DEBUG oslo_concurrency.lockutils [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:51:12 compute-1 nova_compute[183083]: 2026-01-26 08:51:12.335 183087 DEBUG oslo_concurrency.lockutils [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:51:12 compute-1 nova_compute[183083]: 2026-01-26 08:51:12.336 183087 DEBUG oslo_concurrency.lockutils [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:51:12 compute-1 nova_compute[183083]: 2026-01-26 08:51:12.676 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:13 compute-1 ovn_controller[95352]: 2026-01-26T08:51:13Z|00189|pinctrl|WARN|Dropped 1253 log messages in last 69 seconds (most recently, 11 seconds ago) due to excessive rate
Jan 26 08:51:13 compute-1 ovn_controller[95352]: 2026-01-26T08:51:13Z|00190|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 08:51:13 compute-1 nova_compute[183083]: 2026-01-26 08:51:13.266 183087 DEBUG nova.network.neutron [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Successfully created port: 893a6c54-db8b-4ec4-93c3-99b76af58008 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 08:51:14 compute-1 nova_compute[183083]: 2026-01-26 08:51:14.288 183087 DEBUG nova.network.neutron [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Successfully updated port: 893a6c54-db8b-4ec4-93c3-99b76af58008 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:51:14 compute-1 nova_compute[183083]: 2026-01-26 08:51:14.303 183087 DEBUG oslo_concurrency.lockutils [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "refresh_cache-eefbc638-e91a-432b-84c3-2448060d96db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:51:14 compute-1 nova_compute[183083]: 2026-01-26 08:51:14.303 183087 DEBUG oslo_concurrency.lockutils [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquired lock "refresh_cache-eefbc638-e91a-432b-84c3-2448060d96db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:51:14 compute-1 nova_compute[183083]: 2026-01-26 08:51:14.304 183087 DEBUG nova.network.neutron [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:51:14 compute-1 nova_compute[183083]: 2026-01-26 08:51:14.383 183087 DEBUG nova.compute.manager [req-591bd406-61c8-4596-ab15-2cb3f1ecc5eb req-93f77741-ae0b-42fb-9ede-b2c58d843c8c 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Received event network-changed-893a6c54-db8b-4ec4-93c3-99b76af58008 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:51:14 compute-1 nova_compute[183083]: 2026-01-26 08:51:14.384 183087 DEBUG nova.compute.manager [req-591bd406-61c8-4596-ab15-2cb3f1ecc5eb req-93f77741-ae0b-42fb-9ede-b2c58d843c8c 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Refreshing instance network info cache due to event network-changed-893a6c54-db8b-4ec4-93c3-99b76af58008. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:51:14 compute-1 nova_compute[183083]: 2026-01-26 08:51:14.385 183087 DEBUG oslo_concurrency.lockutils [req-591bd406-61c8-4596-ab15-2cb3f1ecc5eb req-93f77741-ae0b-42fb-9ede-b2c58d843c8c 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-eefbc638-e91a-432b-84c3-2448060d96db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:51:14 compute-1 nova_compute[183083]: 2026-01-26 08:51:14.448 183087 DEBUG nova.network.neutron [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.355 183087 DEBUG nova.network.neutron [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Updating instance_info_cache with network_info: [{"id": "893a6c54-db8b-4ec4-93c3-99b76af58008", "address": "fa:16:3e:31:21:03", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap893a6c54-db", "ovs_interfaceid": "893a6c54-db8b-4ec4-93c3-99b76af58008", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.379 183087 DEBUG oslo_concurrency.lockutils [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Releasing lock "refresh_cache-eefbc638-e91a-432b-84c3-2448060d96db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.380 183087 DEBUG nova.compute.manager [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Instance network_info: |[{"id": "893a6c54-db8b-4ec4-93c3-99b76af58008", "address": "fa:16:3e:31:21:03", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap893a6c54-db", "ovs_interfaceid": "893a6c54-db8b-4ec4-93c3-99b76af58008", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.381 183087 DEBUG oslo_concurrency.lockutils [req-591bd406-61c8-4596-ab15-2cb3f1ecc5eb req-93f77741-ae0b-42fb-9ede-b2c58d843c8c 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-eefbc638-e91a-432b-84c3-2448060d96db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.382 183087 DEBUG nova.network.neutron [req-591bd406-61c8-4596-ab15-2cb3f1ecc5eb req-93f77741-ae0b-42fb-9ede-b2c58d843c8c 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Refreshing network info cache for port 893a6c54-db8b-4ec4-93c3-99b76af58008 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.387 183087 DEBUG nova.virt.libvirt.driver [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Start _get_guest_xml network_info=[{"id": "893a6c54-db8b-4ec4-93c3-99b76af58008", "address": "fa:16:3e:31:21:03", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap893a6c54-db", "ovs_interfaceid": "893a6c54-db8b-4ec4-93c3-99b76af58008", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T08:43:05Z,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aa88264c487a43b2855a58eb0dd042c9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T08:43:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encryption_options': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.395 183087 WARNING nova.virt.libvirt.driver [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.405 183087 DEBUG nova.virt.libvirt.host [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.405 183087 DEBUG nova.virt.libvirt.host [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.411 183087 DEBUG nova.virt.libvirt.host [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.412 183087 DEBUG nova.virt.libvirt.host [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.412 183087 DEBUG nova.virt.libvirt.driver [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.412 183087 DEBUG nova.virt.hardware [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T08:43:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a7017323-cd35-470d-a718-faa6c6e97277',id=2,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T08:43:05Z,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aa88264c487a43b2855a58eb0dd042c9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T08:43:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.413 183087 DEBUG nova.virt.hardware [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.413 183087 DEBUG nova.virt.hardware [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.414 183087 DEBUG nova.virt.hardware [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.414 183087 DEBUG nova.virt.hardware [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.414 183087 DEBUG nova.virt.hardware [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.415 183087 DEBUG nova.virt.hardware [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.415 183087 DEBUG nova.virt.hardware [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.415 183087 DEBUG nova.virt.hardware [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.416 183087 DEBUG nova.virt.hardware [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.416 183087 DEBUG nova.virt.hardware [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.420 183087 DEBUG nova.virt.libvirt.vif [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:51:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1408106849',display_name='tempest-server-test-1408106849',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1408106849',id=40,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDY4V6+HgiJjlBp+sismnH//xIhvaHIyZ883PC4cnliN7XgVilpp1kBxLjUHPWqW57B2esyV5Ub1B2t4CQ7Ool6UJEN5MlISleL6j0D4sAsLrzbr7gNivUWbODLUMAl3Sw==',key_name='tempest-keypair-test-1986375941',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5d0c78b7cd584e4a90592d8ea01ce4ad',ramdisk_id='',reservation_id='r-tno0birl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NetworkDefaultSecGroupTest-1876093813',owner_user_name='tempest-NetworkDefaultSecGroupTest-1876093813-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:51:11Z,user_data=None,user_id='988ebc31182f4c94813f94306e399a2d',uuid=eefbc638-e91a-432b-84c3-2448060d96db,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "893a6c54-db8b-4ec4-93c3-99b76af58008", "address": "fa:16:3e:31:21:03", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap893a6c54-db", "ovs_interfaceid": "893a6c54-db8b-4ec4-93c3-99b76af58008", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.421 183087 DEBUG nova.network.os_vif_util [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Converting VIF {"id": "893a6c54-db8b-4ec4-93c3-99b76af58008", "address": "fa:16:3e:31:21:03", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap893a6c54-db", "ovs_interfaceid": "893a6c54-db8b-4ec4-93c3-99b76af58008", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.422 183087 DEBUG nova.network.os_vif_util [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:21:03,bridge_name='br-int',has_traffic_filtering=True,id=893a6c54-db8b-4ec4-93c3-99b76af58008,network=Network(ce3cd186-bdaf-40d4-a276-e9139fe3dfec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap893a6c54-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.423 183087 DEBUG nova.objects.instance [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lazy-loading 'pci_devices' on Instance uuid eefbc638-e91a-432b-84c3-2448060d96db obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.439 183087 DEBUG nova.virt.libvirt.driver [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] End _get_guest_xml xml=<domain type="kvm">
Jan 26 08:51:15 compute-1 nova_compute[183083]:   <uuid>eefbc638-e91a-432b-84c3-2448060d96db</uuid>
Jan 26 08:51:15 compute-1 nova_compute[183083]:   <name>instance-00000028</name>
Jan 26 08:51:15 compute-1 nova_compute[183083]:   <memory>131072</memory>
Jan 26 08:51:15 compute-1 nova_compute[183083]:   <vcpu>1</vcpu>
Jan 26 08:51:15 compute-1 nova_compute[183083]:   <metadata>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 08:51:15 compute-1 nova_compute[183083]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:       <nova:name>tempest-server-test-1408106849</nova:name>
Jan 26 08:51:15 compute-1 nova_compute[183083]:       <nova:creationTime>2026-01-26 08:51:15</nova:creationTime>
Jan 26 08:51:15 compute-1 nova_compute[183083]:       <nova:flavor name="m1.nano">
Jan 26 08:51:15 compute-1 nova_compute[183083]:         <nova:memory>128</nova:memory>
Jan 26 08:51:15 compute-1 nova_compute[183083]:         <nova:disk>1</nova:disk>
Jan 26 08:51:15 compute-1 nova_compute[183083]:         <nova:swap>0</nova:swap>
Jan 26 08:51:15 compute-1 nova_compute[183083]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 08:51:15 compute-1 nova_compute[183083]:         <nova:vcpus>1</nova:vcpus>
Jan 26 08:51:15 compute-1 nova_compute[183083]:       </nova:flavor>
Jan 26 08:51:15 compute-1 nova_compute[183083]:       <nova:owner>
Jan 26 08:51:15 compute-1 nova_compute[183083]:         <nova:user uuid="988ebc31182f4c94813f94306e399a2d">tempest-NetworkDefaultSecGroupTest-1876093813-project-member</nova:user>
Jan 26 08:51:15 compute-1 nova_compute[183083]:         <nova:project uuid="5d0c78b7cd584e4a90592d8ea01ce4ad">tempest-NetworkDefaultSecGroupTest-1876093813</nova:project>
Jan 26 08:51:15 compute-1 nova_compute[183083]:       </nova:owner>
Jan 26 08:51:15 compute-1 nova_compute[183083]:       <nova:root type="image" uuid="13d1a20a-8003-4f19-aba7-ccbd9eff9b82"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:       <nova:ports>
Jan 26 08:51:15 compute-1 nova_compute[183083]:         <nova:port uuid="893a6c54-db8b-4ec4-93c3-99b76af58008">
Jan 26 08:51:15 compute-1 nova_compute[183083]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:         </nova:port>
Jan 26 08:51:15 compute-1 nova_compute[183083]:       </nova:ports>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     </nova:instance>
Jan 26 08:51:15 compute-1 nova_compute[183083]:   </metadata>
Jan 26 08:51:15 compute-1 nova_compute[183083]:   <sysinfo type="smbios">
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <system>
Jan 26 08:51:15 compute-1 nova_compute[183083]:       <entry name="manufacturer">RDO</entry>
Jan 26 08:51:15 compute-1 nova_compute[183083]:       <entry name="product">OpenStack Compute</entry>
Jan 26 08:51:15 compute-1 nova_compute[183083]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 08:51:15 compute-1 nova_compute[183083]:       <entry name="serial">eefbc638-e91a-432b-84c3-2448060d96db</entry>
Jan 26 08:51:15 compute-1 nova_compute[183083]:       <entry name="uuid">eefbc638-e91a-432b-84c3-2448060d96db</entry>
Jan 26 08:51:15 compute-1 nova_compute[183083]:       <entry name="family">Virtual Machine</entry>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     </system>
Jan 26 08:51:15 compute-1 nova_compute[183083]:   </sysinfo>
Jan 26 08:51:15 compute-1 nova_compute[183083]:   <os>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <boot dev="hd"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <smbios mode="sysinfo"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:   </os>
Jan 26 08:51:15 compute-1 nova_compute[183083]:   <features>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <acpi/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <apic/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <vmcoreinfo/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:   </features>
Jan 26 08:51:15 compute-1 nova_compute[183083]:   <clock offset="utc">
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <timer name="hpet" present="no"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:   </clock>
Jan 26 08:51:15 compute-1 nova_compute[183083]:   <cpu mode="host-model" match="exact">
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:   </cpu>
Jan 26 08:51:15 compute-1 nova_compute[183083]:   <devices>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <disk type="file" device="disk">
Jan 26 08:51:15 compute-1 nova_compute[183083]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/eefbc638-e91a-432b-84c3-2448060d96db/disk"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:       <target dev="vda" bus="virtio"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     </disk>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <disk type="file" device="cdrom">
Jan 26 08:51:15 compute-1 nova_compute[183083]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/eefbc638-e91a-432b-84c3-2448060d96db/disk.config"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:       <target dev="sda" bus="sata"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     </disk>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <interface type="ethernet">
Jan 26 08:51:15 compute-1 nova_compute[183083]:       <mac address="fa:16:3e:31:21:03"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:       <mtu size="1342"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:       <target dev="tap893a6c54-db"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     </interface>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <serial type="pty">
Jan 26 08:51:15 compute-1 nova_compute[183083]:       <log file="/var/lib/nova/instances/eefbc638-e91a-432b-84c3-2448060d96db/console.log" append="off"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     </serial>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <video>
Jan 26 08:51:15 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     </video>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <input type="tablet" bus="usb"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <rng model="virtio">
Jan 26 08:51:15 compute-1 nova_compute[183083]:       <backend model="random">/dev/urandom</backend>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     </rng>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <controller type="usb" index="0"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     <memballoon model="virtio">
Jan 26 08:51:15 compute-1 nova_compute[183083]:       <stats period="10"/>
Jan 26 08:51:15 compute-1 nova_compute[183083]:     </memballoon>
Jan 26 08:51:15 compute-1 nova_compute[183083]:   </devices>
Jan 26 08:51:15 compute-1 nova_compute[183083]: </domain>
Jan 26 08:51:15 compute-1 nova_compute[183083]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.440 183087 DEBUG nova.compute.manager [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Preparing to wait for external event network-vif-plugged-893a6c54-db8b-4ec4-93c3-99b76af58008 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.441 183087 DEBUG oslo_concurrency.lockutils [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "eefbc638-e91a-432b-84c3-2448060d96db-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.441 183087 DEBUG oslo_concurrency.lockutils [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "eefbc638-e91a-432b-84c3-2448060d96db-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.442 183087 DEBUG oslo_concurrency.lockutils [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "eefbc638-e91a-432b-84c3-2448060d96db-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.443 183087 DEBUG nova.virt.libvirt.vif [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:51:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1408106849',display_name='tempest-server-test-1408106849',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1408106849',id=40,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDY4V6+HgiJjlBp+sismnH//xIhvaHIyZ883PC4cnliN7XgVilpp1kBxLjUHPWqW57B2esyV5Ub1B2t4CQ7Ool6UJEN5MlISleL6j0D4sAsLrzbr7gNivUWbODLUMAl3Sw==',key_name='tempest-keypair-test-1986375941',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5d0c78b7cd584e4a90592d8ea01ce4ad',ramdisk_id='',reservation_id='r-tno0birl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_
rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NetworkDefaultSecGroupTest-1876093813',owner_user_name='tempest-NetworkDefaultSecGroupTest-1876093813-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:51:11Z,user_data=None,user_id='988ebc31182f4c94813f94306e399a2d',uuid=eefbc638-e91a-432b-84c3-2448060d96db,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "893a6c54-db8b-4ec4-93c3-99b76af58008", "address": "fa:16:3e:31:21:03", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap893a6c54-db", "ovs_interfaceid": "893a6c54-db8b-4ec4-93c3-99b76af58008", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.444 183087 DEBUG nova.network.os_vif_util [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Converting VIF {"id": "893a6c54-db8b-4ec4-93c3-99b76af58008", "address": "fa:16:3e:31:21:03", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap893a6c54-db", "ovs_interfaceid": "893a6c54-db8b-4ec4-93c3-99b76af58008", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.445 183087 DEBUG nova.network.os_vif_util [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:21:03,bridge_name='br-int',has_traffic_filtering=True,id=893a6c54-db8b-4ec4-93c3-99b76af58008,network=Network(ce3cd186-bdaf-40d4-a276-e9139fe3dfec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap893a6c54-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.446 183087 DEBUG os_vif [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:21:03,bridge_name='br-int',has_traffic_filtering=True,id=893a6c54-db8b-4ec4-93c3-99b76af58008,network=Network(ce3cd186-bdaf-40d4-a276-e9139fe3dfec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap893a6c54-db') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.447 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.448 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.448 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.454 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.455 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap893a6c54-db, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.455 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap893a6c54-db, col_values=(('external_ids', {'iface-id': '893a6c54-db8b-4ec4-93c3-99b76af58008', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:31:21:03', 'vm-uuid': 'eefbc638-e91a-432b-84c3-2448060d96db'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.458 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:15 compute-1 NetworkManager[55451]: <info>  [1769417475.4596] manager: (tap893a6c54-db): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.461 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.469 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.472 183087 INFO os_vif [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:21:03,bridge_name='br-int',has_traffic_filtering=True,id=893a6c54-db8b-4ec4-93c3-99b76af58008,network=Network(ce3cd186-bdaf-40d4-a276-e9139fe3dfec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap893a6c54-db')
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.524 183087 DEBUG nova.virt.libvirt.driver [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.525 183087 DEBUG nova.virt.libvirt.driver [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.525 183087 DEBUG nova.virt.libvirt.driver [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] No VIF found with MAC fa:16:3e:31:21:03, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.526 183087 INFO nova.virt.libvirt.driver [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Using config drive
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.723 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.873 183087 INFO nova.virt.libvirt.driver [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Creating config drive at /var/lib/nova/instances/eefbc638-e91a-432b-84c3-2448060d96db/disk.config
Jan 26 08:51:15 compute-1 nova_compute[183083]: 2026-01-26 08:51:15.882 183087 DEBUG oslo_concurrency.processutils [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/eefbc638-e91a-432b-84c3-2448060d96db/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr0z6tqrp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.025 183087 DEBUG oslo_concurrency.processutils [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/eefbc638-e91a-432b-84c3-2448060d96db/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr0z6tqrp" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:51:16 compute-1 kernel: tap893a6c54-db: entered promiscuous mode
Jan 26 08:51:16 compute-1 NetworkManager[55451]: <info>  [1769417476.1089] manager: (tap893a6c54-db): new Tun device (/org/freedesktop/NetworkManager/Devices/72)
Jan 26 08:51:16 compute-1 ovn_controller[95352]: 2026-01-26T08:51:16Z|00191|binding|INFO|Claiming lport 893a6c54-db8b-4ec4-93c3-99b76af58008 for this chassis.
Jan 26 08:51:16 compute-1 ovn_controller[95352]: 2026-01-26T08:51:16Z|00192|binding|INFO|893a6c54-db8b-4ec4-93c3-99b76af58008: Claiming fa:16:3e:31:21:03 10.100.0.11
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.112 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:51:16.122 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:31:21:03 10.100.0.11'], port_security=['fa:16:3e:31:21:03 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'eefbc638-e91a-432b-84c3-2448060d96db', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ce3cd186-bdaf-40d4-a276-e9139fe3dfec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cf9da6cb-556c-4ee2-9313-239643d15cb1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=97b37269-bb63-498a-80e2-0f1154aea97c, chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=893a6c54-db8b-4ec4-93c3-99b76af58008) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:51:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:51:16.125 104632 INFO neutron.agent.ovn.metadata.agent [-] Port 893a6c54-db8b-4ec4-93c3-99b76af58008 in datapath ce3cd186-bdaf-40d4-a276-e9139fe3dfec bound to our chassis
Jan 26 08:51:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:51:16.128 104632 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ce3cd186-bdaf-40d4-a276-e9139fe3dfec
Jan 26 08:51:16 compute-1 ovn_controller[95352]: 2026-01-26T08:51:16Z|00193|binding|INFO|Setting lport 893a6c54-db8b-4ec4-93c3-99b76af58008 ovn-installed in OVS
Jan 26 08:51:16 compute-1 ovn_controller[95352]: 2026-01-26T08:51:16Z|00194|binding|INFO|Setting lport 893a6c54-db8b-4ec4-93c3-99b76af58008 up in Southbound
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.139 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.142 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:51:16.156 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[ffa25941-e4e0-46d7-9f8b-38740bc9b65b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:51:16 compute-1 systemd-udevd[216518]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 08:51:16 compute-1 systemd-machined[154360]: New machine qemu-11-instance-00000028.
Jan 26 08:51:16 compute-1 systemd[1]: Started Virtual Machine qemu-11-instance-00000028.
Jan 26 08:51:16 compute-1 NetworkManager[55451]: <info>  [1769417476.1903] device (tap893a6c54-db): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 08:51:16 compute-1 NetworkManager[55451]: <info>  [1769417476.1916] device (tap893a6c54-db): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 08:51:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:51:16.204 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[9f484fd6-f25a-465d-93fd-4a9e19c975c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:51:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:51:16.208 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[8e2b5eb5-ec60-4bcb-8937-fcc34fb44ade]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:51:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:51:16.252 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[080aaff1-99ba-49b0-ae99-39238fe965d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:51:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:51:16.280 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[bbfbd587-f062-49fe-bbe7-47bb74dc7eb4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapce3cd186-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:90:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 379418, 'reachable_time': 30482, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216529, 'error': None, 'target': 'ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:51:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:51:16.306 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[8a2facc7-01ab-4b11-bf44-57ae4011405d]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapce3cd186-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 379434, 'tstamp': 379434}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216531, 'error': None, 'target': 'ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapce3cd186-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 379439, 'tstamp': 379439}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216531, 'error': None, 'target': 'ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:51:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:51:16.309 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapce3cd186-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.312 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.313 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:51:16.314 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapce3cd186-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:51:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:51:16.315 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:51:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:51:16.316 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapce3cd186-b0, col_values=(('external_ids', {'iface-id': '597f7eff-a379-4a5d-bffc-0e294ae89e61'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:51:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:51:16.317 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.498 183087 DEBUG nova.compute.manager [req-c0ae19b2-4234-4164-b43a-172d7bf9705e req-a850ae8e-acfa-489a-9126-1fd3af5cf066 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Received event network-vif-plugged-893a6c54-db8b-4ec4-93c3-99b76af58008 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.499 183087 DEBUG oslo_concurrency.lockutils [req-c0ae19b2-4234-4164-b43a-172d7bf9705e req-a850ae8e-acfa-489a-9126-1fd3af5cf066 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "eefbc638-e91a-432b-84c3-2448060d96db-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.499 183087 DEBUG oslo_concurrency.lockutils [req-c0ae19b2-4234-4164-b43a-172d7bf9705e req-a850ae8e-acfa-489a-9126-1fd3af5cf066 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "eefbc638-e91a-432b-84c3-2448060d96db-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.500 183087 DEBUG oslo_concurrency.lockutils [req-c0ae19b2-4234-4164-b43a-172d7bf9705e req-a850ae8e-acfa-489a-9126-1fd3af5cf066 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "eefbc638-e91a-432b-84c3-2448060d96db-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.500 183087 DEBUG nova.compute.manager [req-c0ae19b2-4234-4164-b43a-172d7bf9705e req-a850ae8e-acfa-489a-9126-1fd3af5cf066 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Processing event network-vif-plugged-893a6c54-db8b-4ec4-93c3-99b76af58008 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.644 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417476.6440072, eefbc638-e91a-432b-84c3-2448060d96db => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.645 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: eefbc638-e91a-432b-84c3-2448060d96db] VM Started (Lifecycle Event)
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.648 183087 DEBUG nova.compute.manager [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.654 183087 DEBUG nova.virt.libvirt.driver [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.658 183087 INFO nova.virt.libvirt.driver [-] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Instance spawned successfully.
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.661 183087 DEBUG nova.virt.libvirt.driver [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.664 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.669 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.681 183087 DEBUG nova.virt.libvirt.driver [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.681 183087 DEBUG nova.virt.libvirt.driver [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.682 183087 DEBUG nova.virt.libvirt.driver [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.682 183087 DEBUG nova.virt.libvirt.driver [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.682 183087 DEBUG nova.virt.libvirt.driver [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.683 183087 DEBUG nova.virt.libvirt.driver [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.688 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: eefbc638-e91a-432b-84c3-2448060d96db] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.689 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417476.6475391, eefbc638-e91a-432b-84c3-2448060d96db => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.689 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: eefbc638-e91a-432b-84c3-2448060d96db] VM Paused (Lifecycle Event)
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.702 183087 DEBUG nova.network.neutron [req-591bd406-61c8-4596-ab15-2cb3f1ecc5eb req-93f77741-ae0b-42fb-9ede-b2c58d843c8c 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Updated VIF entry in instance network info cache for port 893a6c54-db8b-4ec4-93c3-99b76af58008. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.703 183087 DEBUG nova.network.neutron [req-591bd406-61c8-4596-ab15-2cb3f1ecc5eb req-93f77741-ae0b-42fb-9ede-b2c58d843c8c 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Updating instance_info_cache with network_info: [{"id": "893a6c54-db8b-4ec4-93c3-99b76af58008", "address": "fa:16:3e:31:21:03", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap893a6c54-db", "ovs_interfaceid": "893a6c54-db8b-4ec4-93c3-99b76af58008", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.726 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.730 183087 DEBUG oslo_concurrency.lockutils [req-591bd406-61c8-4596-ab15-2cb3f1ecc5eb req-93f77741-ae0b-42fb-9ede-b2c58d843c8c 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-eefbc638-e91a-432b-84c3-2448060d96db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.731 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417476.6528602, eefbc638-e91a-432b-84c3-2448060d96db => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.731 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: eefbc638-e91a-432b-84c3-2448060d96db] VM Resumed (Lifecycle Event)
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.753 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.757 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.775 183087 INFO nova.compute.manager [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Took 4.91 seconds to spawn the instance on the hypervisor.
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.776 183087 DEBUG nova.compute.manager [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.784 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: eefbc638-e91a-432b-84c3-2448060d96db] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.858 183087 INFO nova.compute.manager [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Took 5.40 seconds to build instance.
Jan 26 08:51:16 compute-1 nova_compute[183083]: 2026-01-26 08:51:16.873 183087 DEBUG oslo_concurrency.lockutils [None req-43294c5b-40a5-4542-a913-e9d59a7b8432 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "eefbc638-e91a-432b-84c3-2448060d96db" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.475s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:51:18 compute-1 nova_compute[183083]: 2026-01-26 08:51:18.421 183087 INFO nova.compute.manager [None req-1c3bbee6-7d54-4715-bb98-751412a10101 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Get console output
Jan 26 08:51:18 compute-1 nova_compute[183083]: 2026-01-26 08:51:18.608 183087 DEBUG nova.compute.manager [req-6818531c-7603-4e14-82bc-06bc0bb2615e req-18add148-6969-4904-98ee-e96de1dc7e14 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Received event network-vif-plugged-893a6c54-db8b-4ec4-93c3-99b76af58008 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:51:18 compute-1 nova_compute[183083]: 2026-01-26 08:51:18.608 183087 DEBUG oslo_concurrency.lockutils [req-6818531c-7603-4e14-82bc-06bc0bb2615e req-18add148-6969-4904-98ee-e96de1dc7e14 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "eefbc638-e91a-432b-84c3-2448060d96db-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:51:18 compute-1 nova_compute[183083]: 2026-01-26 08:51:18.609 183087 DEBUG oslo_concurrency.lockutils [req-6818531c-7603-4e14-82bc-06bc0bb2615e req-18add148-6969-4904-98ee-e96de1dc7e14 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "eefbc638-e91a-432b-84c3-2448060d96db-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:51:18 compute-1 nova_compute[183083]: 2026-01-26 08:51:18.609 183087 DEBUG oslo_concurrency.lockutils [req-6818531c-7603-4e14-82bc-06bc0bb2615e req-18add148-6969-4904-98ee-e96de1dc7e14 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "eefbc638-e91a-432b-84c3-2448060d96db-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:51:18 compute-1 nova_compute[183083]: 2026-01-26 08:51:18.610 183087 DEBUG nova.compute.manager [req-6818531c-7603-4e14-82bc-06bc0bb2615e req-18add148-6969-4904-98ee-e96de1dc7e14 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] No waiting events found dispatching network-vif-plugged-893a6c54-db8b-4ec4-93c3-99b76af58008 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:51:18 compute-1 nova_compute[183083]: 2026-01-26 08:51:18.610 183087 WARNING nova.compute.manager [req-6818531c-7603-4e14-82bc-06bc0bb2615e req-18add148-6969-4904-98ee-e96de1dc7e14 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Received unexpected event network-vif-plugged-893a6c54-db8b-4ec4-93c3-99b76af58008 for instance with vm_state active and task_state None.
Jan 26 08:51:20 compute-1 nova_compute[183083]: 2026-01-26 08:51:20.461 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:20 compute-1 nova_compute[183083]: 2026-01-26 08:51:20.724 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:23 compute-1 nova_compute[183083]: 2026-01-26 08:51:23.553 183087 INFO nova.compute.manager [None req-29bcf859-1d66-4327-8ffa-789807efe027 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Get console output
Jan 26 08:51:25 compute-1 nova_compute[183083]: 2026-01-26 08:51:25.465 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:25 compute-1 nova_compute[183083]: 2026-01-26 08:51:25.726 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:26 compute-1 podman[216539]: 2026-01-26 08:51:26.864190613 +0000 UTC m=+0.108959547 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack 
Kubernetes Operator team, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 08:51:26 compute-1 podman[216540]: 2026-01-26 08:51:26.882039007 +0000 UTC m=+0.104668946 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., 
container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, version=9.6, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41)
Jan 26 08:51:28 compute-1 nova_compute[183083]: 2026-01-26 08:51:28.682 183087 INFO nova.compute.manager [None req-d3bc3fb9-b2ee-4cf0-b50f-2addff36b329 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Get console output
Jan 26 08:51:28 compute-1 ovn_controller[95352]: 2026-01-26T08:51:28Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:31:21:03 10.100.0.11
Jan 26 08:51:28 compute-1 ovn_controller[95352]: 2026-01-26T08:51:28Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:31:21:03 10.100.0.11
Jan 26 08:51:30 compute-1 nova_compute[183083]: 2026-01-26 08:51:30.470 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:30 compute-1 nova_compute[183083]: 2026-01-26 08:51:30.585 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:30 compute-1 NetworkManager[55451]: <info>  [1769417490.5871] manager: (patch-provnet-149e76db-406a-40c9-b6a7-879b1da420de-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Jan 26 08:51:30 compute-1 ovn_controller[95352]: 2026-01-26T08:51:30Z|00195|binding|INFO|Releasing lport 597f7eff-a379-4a5d-bffc-0e294ae89e61 from this chassis (sb_readonly=0)
Jan 26 08:51:30 compute-1 NetworkManager[55451]: <info>  [1769417490.5892] manager: (patch-br-int-to-provnet-149e76db-406a-40c9-b6a7-879b1da420de): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Jan 26 08:51:30 compute-1 ovn_controller[95352]: 2026-01-26T08:51:30Z|00196|binding|INFO|Releasing lport 597f7eff-a379-4a5d-bffc-0e294ae89e61 from this chassis (sb_readonly=0)
Jan 26 08:51:30 compute-1 nova_compute[183083]: 2026-01-26 08:51:30.640 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:30 compute-1 nova_compute[183083]: 2026-01-26 08:51:30.647 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:30 compute-1 nova_compute[183083]: 2026-01-26 08:51:30.728 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:30 compute-1 nova_compute[183083]: 2026-01-26 08:51:30.870 183087 DEBUG nova.compute.manager [req-2a4146fe-a4aa-4524-b850-8c2bdba51748 req-f220d45d-196e-4dcf-abf1-1baf42035130 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Received event network-changed-fb4faeb1-e827-462b-8f21-36892b978052 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:51:30 compute-1 nova_compute[183083]: 2026-01-26 08:51:30.871 183087 DEBUG nova.compute.manager [req-2a4146fe-a4aa-4524-b850-8c2bdba51748 req-f220d45d-196e-4dcf-abf1-1baf42035130 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Refreshing instance network info cache due to event network-changed-fb4faeb1-e827-462b-8f21-36892b978052. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:51:30 compute-1 nova_compute[183083]: 2026-01-26 08:51:30.872 183087 DEBUG oslo_concurrency.lockutils [req-2a4146fe-a4aa-4524-b850-8c2bdba51748 req-f220d45d-196e-4dcf-abf1-1baf42035130 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-70d16f06-3d0a-454f-a1dd-87ce77ed8582" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:51:30 compute-1 nova_compute[183083]: 2026-01-26 08:51:30.872 183087 DEBUG oslo_concurrency.lockutils [req-2a4146fe-a4aa-4524-b850-8c2bdba51748 req-f220d45d-196e-4dcf-abf1-1baf42035130 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-70d16f06-3d0a-454f-a1dd-87ce77ed8582" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:51:30 compute-1 nova_compute[183083]: 2026-01-26 08:51:30.873 183087 DEBUG nova.network.neutron [req-2a4146fe-a4aa-4524-b850-8c2bdba51748 req-f220d45d-196e-4dcf-abf1-1baf42035130 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Refreshing network info cache for port fb4faeb1-e827-462b-8f21-36892b978052 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:51:32 compute-1 podman[216596]: 2026-01-26 08:51:32.848563047 +0000 UTC m=+0.104792040 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 26 08:51:32 compute-1 podman[216597]: 2026-01-26 08:51:32.852728144 +0000 UTC m=+0.100541769 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 08:51:32 compute-1 podman[216595]: 2026-01-26 08:51:32.873125 +0000 UTC m=+0.132204533 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
config_id=ovn_controller, container_name=ovn_controller)
Jan 26 08:51:33 compute-1 nova_compute[183083]: 2026-01-26 08:51:33.301 183087 DEBUG nova.network.neutron [req-2a4146fe-a4aa-4524-b850-8c2bdba51748 req-f220d45d-196e-4dcf-abf1-1baf42035130 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Updated VIF entry in instance network info cache for port fb4faeb1-e827-462b-8f21-36892b978052. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:51:33 compute-1 nova_compute[183083]: 2026-01-26 08:51:33.301 183087 DEBUG nova.network.neutron [req-2a4146fe-a4aa-4524-b850-8c2bdba51748 req-f220d45d-196e-4dcf-abf1-1baf42035130 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Updating instance_info_cache with network_info: [{"id": "fb4faeb1-e827-462b-8f21-36892b978052", "address": "fa:16:3e:c5:58:7b", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb4faeb1-e8", "ovs_interfaceid": "fb4faeb1-e827-462b-8f21-36892b978052", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:51:33 compute-1 nova_compute[183083]: 2026-01-26 08:51:33.324 183087 DEBUG oslo_concurrency.lockutils [req-2a4146fe-a4aa-4524-b850-8c2bdba51748 req-f220d45d-196e-4dcf-abf1-1baf42035130 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-70d16f06-3d0a-454f-a1dd-87ce77ed8582" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:51:33 compute-1 nova_compute[183083]: 2026-01-26 08:51:33.477 183087 DEBUG nova.compute.manager [req-7e8da581-1ba6-45fa-b01d-1ed8b9dac165 req-929dcd76-f93a-42d1-86dc-0b5494689902 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Received event network-changed-893a6c54-db8b-4ec4-93c3-99b76af58008 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:51:33 compute-1 nova_compute[183083]: 2026-01-26 08:51:33.478 183087 DEBUG nova.compute.manager [req-7e8da581-1ba6-45fa-b01d-1ed8b9dac165 req-929dcd76-f93a-42d1-86dc-0b5494689902 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Refreshing instance network info cache due to event network-changed-893a6c54-db8b-4ec4-93c3-99b76af58008. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:51:33 compute-1 nova_compute[183083]: 2026-01-26 08:51:33.479 183087 DEBUG oslo_concurrency.lockutils [req-7e8da581-1ba6-45fa-b01d-1ed8b9dac165 req-929dcd76-f93a-42d1-86dc-0b5494689902 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-eefbc638-e91a-432b-84c3-2448060d96db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:51:33 compute-1 nova_compute[183083]: 2026-01-26 08:51:33.479 183087 DEBUG oslo_concurrency.lockutils [req-7e8da581-1ba6-45fa-b01d-1ed8b9dac165 req-929dcd76-f93a-42d1-86dc-0b5494689902 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-eefbc638-e91a-432b-84c3-2448060d96db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:51:33 compute-1 nova_compute[183083]: 2026-01-26 08:51:33.480 183087 DEBUG nova.network.neutron [req-7e8da581-1ba6-45fa-b01d-1ed8b9dac165 req-929dcd76-f93a-42d1-86dc-0b5494689902 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Refreshing network info cache for port 893a6c54-db8b-4ec4-93c3-99b76af58008 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:51:34 compute-1 nova_compute[183083]: 2026-01-26 08:51:34.643 183087 DEBUG nova.network.neutron [req-7e8da581-1ba6-45fa-b01d-1ed8b9dac165 req-929dcd76-f93a-42d1-86dc-0b5494689902 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Updated VIF entry in instance network info cache for port 893a6c54-db8b-4ec4-93c3-99b76af58008. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:51:34 compute-1 nova_compute[183083]: 2026-01-26 08:51:34.643 183087 DEBUG nova.network.neutron [req-7e8da581-1ba6-45fa-b01d-1ed8b9dac165 req-929dcd76-f93a-42d1-86dc-0b5494689902 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Updating instance_info_cache with network_info: [{"id": "893a6c54-db8b-4ec4-93c3-99b76af58008", "address": "fa:16:3e:31:21:03", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap893a6c54-db", "ovs_interfaceid": "893a6c54-db8b-4ec4-93c3-99b76af58008", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:51:34 compute-1 nova_compute[183083]: 2026-01-26 08:51:34.664 183087 DEBUG oslo_concurrency.lockutils [req-7e8da581-1ba6-45fa-b01d-1ed8b9dac165 req-929dcd76-f93a-42d1-86dc-0b5494689902 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-eefbc638-e91a-432b-84c3-2448060d96db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:51:35 compute-1 nova_compute[183083]: 2026-01-26 08:51:35.474 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:35 compute-1 nova_compute[183083]: 2026-01-26 08:51:35.731 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:40 compute-1 nova_compute[183083]: 2026-01-26 08:51:40.478 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:40 compute-1 nova_compute[183083]: 2026-01-26 08:51:40.733 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:42 compute-1 podman[216659]: 2026-01-26 08:51:42.854256582 +0000 UTC m=+0.092047239 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 26 08:51:45 compute-1 nova_compute[183083]: 2026-01-26 08:51:45.482 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:45 compute-1 nova_compute[183083]: 2026-01-26 08:51:45.736 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:50 compute-1 sshd-session[216683]: Connection closed by authenticating user root 159.223.236.81 port 40336 [preauth]
Jan 26 08:51:50 compute-1 nova_compute[183083]: 2026-01-26 08:51:50.536 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:50 compute-1 nova_compute[183083]: 2026-01-26 08:51:50.738 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:55 compute-1 nova_compute[183083]: 2026-01-26 08:51:55.592 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:55 compute-1 nova_compute[183083]: 2026-01-26 08:51:55.741 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:51:56 compute-1 nova_compute[183083]: 2026-01-26 08:51:56.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:51:56 compute-1 nova_compute[183083]: 2026-01-26 08:51:56.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 26 08:51:57 compute-1 nova_compute[183083]: 2026-01-26 08:51:57.009 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 26 08:51:57 compute-1 podman[216686]: 2026-01-26 08:51:57.833684001 +0000 UTC m=+0.089158828 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, build-date=2025-08-20T13:12:41, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350)
Jan 26 08:51:57 compute-1 podman[216685]: 2026-01-26 08:51:57.838901248 +0000 UTC m=+0.097108253 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 26 08:51:58 compute-1 nova_compute[183083]: 2026-01-26 08:51:58.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:51:58 compute-1 nova_compute[183083]: 2026-01-26 08:51:58.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 26 08:52:00 compute-1 nova_compute[183083]: 2026-01-26 08:52:00.595 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:52:00 compute-1 nova_compute[183083]: 2026-01-26 08:52:00.742 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:52:02 compute-1 ovn_controller[95352]: 2026-01-26T08:52:02Z|00197|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 26 08:52:02 compute-1 nova_compute[183083]: 2026-01-26 08:52:02.962 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:52:02 compute-1 nova_compute[183083]: 2026-01-26 08:52:02.963 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:52:02 compute-1 nova_compute[183083]: 2026-01-26 08:52:02.963 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 08:52:02 compute-1 nova_compute[183083]: 2026-01-26 08:52:02.964 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 08:52:03 compute-1 nova_compute[183083]: 2026-01-26 08:52:03.222 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "refresh_cache-70d16f06-3d0a-454f-a1dd-87ce77ed8582" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:52:03 compute-1 nova_compute[183083]: 2026-01-26 08:52:03.222 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquired lock "refresh_cache-70d16f06-3d0a-454f-a1dd-87ce77ed8582" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:52:03 compute-1 nova_compute[183083]: 2026-01-26 08:52:03.223 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 26 08:52:03 compute-1 nova_compute[183083]: 2026-01-26 08:52:03.223 183087 DEBUG nova.objects.instance [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 70d16f06-3d0a-454f-a1dd-87ce77ed8582 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.743 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582', 'name': 'tempest-server-test-2078215968', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000027', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'user_id': '988ebc31182f4c94813f94306e399a2d', 'hostId': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.752 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'eefbc638-e91a-432b-84c3-2448060d96db', 'name': 'tempest-server-test-1408106849', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000028', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'user_id': '988ebc31182f4c94813f94306e399a2d', 'hostId': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.753 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.753 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.753 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-2078215968>, <NovaLikeServer: tempest-server-test-1408106849>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-2078215968>, <NovaLikeServer: tempest-server-test-1408106849>]
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.753 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.753 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.753 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-2078215968>, <NovaLikeServer: tempest-server-test-1408106849>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-2078215968>, <NovaLikeServer: tempest-server-test-1408106849>]
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.754 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.796 12 DEBUG ceilometer.compute.pollsters [-] 70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk.device.write.latency volume: 1445494570 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.797 12 DEBUG ceilometer.compute.pollsters [-] 70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 podman[216730]: 2026-01-26 08:52:03.831720828 +0000 UTC m=+0.087011608 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 08:52:03 compute-1 ovn_controller[95352]: 2026-01-26T08:52:03Z|00198|pinctrl|WARN|Dropped 191 log messages in last 51 seconds (most recently, 1 seconds ago) due to excessive rate
Jan 26 08:52:03 compute-1 ovn_controller[95352]: 2026-01-26T08:52:03Z|00199|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.851 12 DEBUG ceilometer.compute.pollsters [-] eefbc638-e91a-432b-84c3-2448060d96db/disk.device.write.latency volume: 2869948467 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.851 12 DEBUG ceilometer.compute.pollsters [-] eefbc638-e91a-432b-84c3-2448060d96db/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7e46b8c-9d88-47ad-93bd-2983c4e963a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1445494570, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582-vda', 'timestamp': '2026-01-26T08:52:03.754371', 'resource_metadata': {'display_name': 'tempest-server-test-2078215968', 'name': 'instance-00000027', 'instance_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '492862aa-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.416348805, 'message_signature': 'af2446a7288fd1dc8aaa684ea80de2d0f2e52d396396f76771972a714336d389'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 
'70d16f06-3d0a-454f-a1dd-87ce77ed8582-sda', 'timestamp': '2026-01-26T08:52:03.754371', 'resource_metadata': {'display_name': 'tempest-server-test-2078215968', 'name': 'instance-00000027', 'instance_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '49287cd6-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.416348805, 'message_signature': '15f4a6892a79bfc2afe6d66b2f4c16c012928c42fdb0c23a0db13d41a7c94d13'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2869948467, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'eefbc638-e91a-432b-84c3-2448060d96db-vda', 'timestamp': '2026-01-26T08:52:03.754371', 'resource_metadata': {'display_name': 'tempest-server-test-1408106849', 'name': 'instance-00000028', 'instance_id': 'eefbc638-e91a-432b-84c3-2448060d96db', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 
'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4930ad34-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.460205493, 'message_signature': '62f4fa6ac605174a964ef184056f28557359323445da10ba46e0b89312872890'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'eefbc638-e91a-432b-84c3-2448060d96db-sda', 'timestamp': '2026-01-26T08:52:03.754371', 'resource_metadata': {'display_name': 'tempest-server-test-1408106849', 'name': 'instance-00000028', 'instance_id': 'eefbc638-e91a-432b-84c3-2448060d96db', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4930b900-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.460205493, 'message_signature': 'ef97b4e37776c89d2da035a9eade57fec0601f793f37df03a970a23dbc12046a'}]}, 'timestamp': '2026-01-26 08:52:03.852032', '_unique_id': 'a9a79a2654d44b23b4db1ce6c9d3104c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.853 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 26 08:52:03 compute-1 podman[216728]: 2026-01-26 08:52:03.867548479 +0000 UTC m=+0.123099606 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 26 08:52:03 compute-1 podman[216729]: 2026-01-26 08:52:03.86971003 +0000 UTC m=+0.120235865 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.872 12 DEBUG ceilometer.compute.pollsters [-] 70d16f06-3d0a-454f-a1dd-87ce77ed8582/memory.usage volume: 46.765625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.885 12 DEBUG ceilometer.compute.pollsters [-] eefbc638-e91a-432b-84c3-2448060d96db/memory.usage volume: 43.07421875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fb76a127-6bd9-49f3-934b-9604c869f136', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 46.765625, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582', 'timestamp': '2026-01-26T08:52:03.854037', 'resource_metadata': {'display_name': 'tempest-server-test-2078215968', 'name': 'instance-00000027', 'instance_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '4933d7b6-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.533874183, 'message_signature': '790802240bd63e8ce44c88179f05cb53ca1aa9895462ab89b8056835fc8aa83d'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.07421875, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'eefbc638-e91a-432b-84c3-2448060d96db', 'timestamp': 
'2026-01-26T08:52:03.854037', 'resource_metadata': {'display_name': 'tempest-server-test-1408106849', 'name': 'instance-00000028', 'instance_id': 'eefbc638-e91a-432b-84c3-2448060d96db', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '4935e3d0-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.547452787, 'message_signature': 'd25ecf5f180a861e83966266c2753d438818431dd01c7ceff011f5ca0faebb5c'}]}, 'timestamp': '2026-01-26 08:52:03.885909', '_unique_id': '5a95f0ac29914035994edfa97d9bab7c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.886 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.887 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.887 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.887 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-2078215968>, <NovaLikeServer: tempest-server-test-1408106849>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-2078215968>, <NovaLikeServer: tempest-server-test-1408106849>]
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.887 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.887 12 DEBUG ceilometer.compute.pollsters [-] 70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk.device.read.requests volume: 1075 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.888 12 DEBUG ceilometer.compute.pollsters [-] 70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.888 12 DEBUG ceilometer.compute.pollsters [-] eefbc638-e91a-432b-84c3-2448060d96db/disk.device.read.requests volume: 1139 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.888 12 DEBUG ceilometer.compute.pollsters [-] eefbc638-e91a-432b-84c3-2448060d96db/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b1682a58-1f71-41a0-aea8-9224e22bf716', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1075, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582-vda', 'timestamp': '2026-01-26T08:52:03.887847', 'resource_metadata': {'display_name': 'tempest-server-test-2078215968', 'name': 'instance-00000027', 'instance_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '493639ca-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.416348805, 'message_signature': '90d45490adeba27d1dc626764aa6efbc02a654fc2e404fdd3b4fa1452fbdb28a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 
'resource_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582-sda', 'timestamp': '2026-01-26T08:52:03.887847', 'resource_metadata': {'display_name': 'tempest-server-test-2078215968', 'name': 'instance-00000027', 'instance_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '493642c6-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.416348805, 'message_signature': 'f2d3c990fd3e84a756687d7bf8492d5b99edc7ecebad70589519d9995acc34c1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1139, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'eefbc638-e91a-432b-84c3-2448060d96db-vda', 'timestamp': '2026-01-26T08:52:03.887847', 'resource_metadata': {'display_name': 'tempest-server-test-1408106849', 'name': 'instance-00000028', 'instance_id': 'eefbc638-e91a-432b-84c3-2448060d96db', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '49364a64-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.460205493, 'message_signature': 'b5eece9a0df20cc07cb282fa887eee5a7e17dd743cecffc7e36161f17d5e4f3c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'eefbc638-e91a-432b-84c3-2448060d96db-sda', 'timestamp': '2026-01-26T08:52:03.887847', 'resource_metadata': {'display_name': 'tempest-server-test-1408106849', 'name': 'instance-00000028', 'instance_id': 'eefbc638-e91a-432b-84c3-2448060d96db', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '493651b2-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.460205493, 'message_signature': 'f2938d0a3e9392c4e3fd0876d4e19d22bd129017a2cf25c42efa8ee5885e6e5c'}]}, 'timestamp': '2026-01-26 08:52:03.888673', '_unique_id': '8049f8474c104b079de0c5f1902e5ab1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.889 12 DEBUG ceilometer.compute.pollsters [-] 70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk.device.read.bytes volume: 29997568 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.890 12 DEBUG ceilometer.compute.pollsters [-] 70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.890 12 DEBUG ceilometer.compute.pollsters [-] eefbc638-e91a-432b-84c3-2448060d96db/disk.device.read.bytes volume: 31037952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.890 12 DEBUG ceilometer.compute.pollsters [-] eefbc638-e91a-432b-84c3-2448060d96db/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1be7ce50-7c44-4697-9047-cacc94980fee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29997568, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582-vda', 'timestamp': '2026-01-26T08:52:03.889922', 'resource_metadata': {'display_name': 'tempest-server-test-2078215968', 'name': 'instance-00000027', 'instance_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '49368b50-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.416348805, 'message_signature': 'b8bf2c23ce90f68082fc9b7ea8f502b172f938d2207dba0276360e7013988b87'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582-sda', 'timestamp': '2026-01-26T08:52:03.889922', 'resource_metadata': {'display_name': 'tempest-server-test-2078215968', 'name': 'instance-00000027', 'instance_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '493694ce-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.416348805, 'message_signature': 'e081e199bb05b800eb13fef9d8fa757e3a15489d5862ca3ebe0f02b4bf64644c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31037952, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'eefbc638-e91a-432b-84c3-2448060d96db-vda', 'timestamp': '2026-01-26T08:52:03.889922', 'resource_metadata': {'display_name': 'tempest-server-test-1408106849', 'name': 'instance-00000028', 'instance_id': 'eefbc638-e91a-432b-84c3-2448060d96db', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '49369c80-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.460205493, 'message_signature': '2cf6d683fb65e6bf879f1d9893e8f63eee727f6e02fcb45314ac573b0c193f10'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'eefbc638-e91a-432b-84c3-2448060d96db-sda', 'timestamp': '2026-01-26T08:52:03.889922', 'resource_metadata': {'display_name': 'tempest-server-test-1408106849', 'name': 'instance-00000028', 'instance_id': 'eefbc638-e91a-432b-84c3-2448060d96db', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4936a4aa-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.460205493, 'message_signature': '73fb0be7ed064dcc3760e5628dc6b07a8dc234a4281fe5e725d5c4cc695a7e55'}]}, 'timestamp': '2026-01-26 08:52:03.890811', '_unique_id': '914c8569cfb348668cf2b0620563d8f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.891 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.901 12 DEBUG ceilometer.compute.pollsters [-] 70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk.device.allocation volume: 31006720 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.901 12 DEBUG ceilometer.compute.pollsters [-] 70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.909 12 DEBUG ceilometer.compute.pollsters [-] eefbc638-e91a-432b-84c3-2448060d96db/disk.device.allocation volume: 31072256 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.909 12 DEBUG ceilometer.compute.pollsters [-] eefbc638-e91a-432b-84c3-2448060d96db/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '134497c9-c480-4782-8801-311ffeccca3b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31006720, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582-vda', 'timestamp': '2026-01-26T08:52:03.892009', 'resource_metadata': {'display_name': 'tempest-server-test-2078215968', 'name': 'instance-00000027', 'instance_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '493847d8-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.553983611, 'message_signature': '4fd4a2bc293b70a4d5372a3eac0077fd596cf65a3ada6d0fcd1fea8ba27ad039'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582-sda', 'timestamp': '2026-01-26T08:52:03.892009', 'resource_metadata': {'display_name': 'tempest-server-test-2078215968', 'name': 'instance-00000027', 'instance_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '49385110-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.553983611, 'message_signature': 'eea0adf68849d9fe952acdf045c6de6551f24fe86e11de610fb0342e991cebc2'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31072256, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'eefbc638-e91a-432b-84c3-2448060d96db-vda', 'timestamp': '2026-01-26T08:52:03.892009', 'resource_metadata': {'display_name': 'tempest-server-test-1408106849', 'name': 'instance-00000028', 'instance_id': 'eefbc638-e91a-432b-84c3-2448060d96db', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '49398468-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.563721226, 'message_signature': '996cc014231c4f502d36320ab3301e3ec9ab77777d12cf99edf665276b698584'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'eefbc638-e91a-432b-84c3-2448060d96db-sda', 'timestamp': '2026-01-26T08:52:03.892009', 'resource_metadata': {'display_name': 'tempest-server-test-1408106849', 'name': 'instance-00000028', 'instance_id': 'eefbc638-e91a-432b-84c3-2448060d96db', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '49398d78-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.563721226, 'message_signature': 'a580a059f8823b52aaa5fc6c7e32756d6177df5fefdf8472281351cfd558e367'}]}, 'timestamp': '2026-01-26 08:52:03.909876', '_unique_id': '7458dbc4f1bb4b29a06609650eb98f96'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.910 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.912 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.912 12 DEBUG ceilometer.compute.pollsters [-] 70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk.device.write.bytes volume: 73019392 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.912 12 DEBUG ceilometer.compute.pollsters [-] 70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.913 12 DEBUG ceilometer.compute.pollsters [-] eefbc638-e91a-432b-84c3-2448060d96db/disk.device.write.bytes volume: 73076736 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.913 12 DEBUG ceilometer.compute.pollsters [-] eefbc638-e91a-432b-84c3-2448060d96db/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4b062fd0-328e-43c8-9450-5d2902e2fd82', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73019392, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582-vda', 'timestamp': '2026-01-26T08:52:03.912601', 'resource_metadata': {'display_name': 'tempest-server-test-2078215968', 'name': 'instance-00000027', 'instance_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '493a01ae-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.416348805, 'message_signature': '3855911a668264b6103b95df4a42a8aef0a3a38b067f76541e0e049e4a421005'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 
'70d16f06-3d0a-454f-a1dd-87ce77ed8582-sda', 'timestamp': '2026-01-26T08:52:03.912601', 'resource_metadata': {'display_name': 'tempest-server-test-2078215968', 'name': 'instance-00000027', 'instance_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '493a0b2c-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.416348805, 'message_signature': '8c3434d3cc29d19691550f1575dad0b1af81f0088fabdca555902f2adc488f01'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73076736, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'eefbc638-e91a-432b-84c3-2448060d96db-vda', 'timestamp': '2026-01-26T08:52:03.912601', 'resource_metadata': {'display_name': 'tempest-server-test-1408106849', 'name': 'instance-00000028', 'instance_id': 'eefbc638-e91a-432b-84c3-2448060d96db', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': 
'13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '493a155e-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.460205493, 'message_signature': 'a06c1a059bb1e58ef7cc789c9243acbda5fa5e89a1742caeaa2c276000905b8f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'eefbc638-e91a-432b-84c3-2448060d96db-sda', 'timestamp': '2026-01-26T08:52:03.912601', 'resource_metadata': {'display_name': 'tempest-server-test-1408106849', 'name': 'instance-00000028', 'instance_id': 'eefbc638-e91a-432b-84c3-2448060d96db', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '493a1de2-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.460205493, 'message_signature': 'e92037f701097b732b4e310f8749d4710db9843c2d98f12cb944e89de260ca93'}]}, 'timestamp': '2026-01-26 08:52:03.913562', '_unique_id': '6687712c50e54204a46a87c9d6441be1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.914 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.916 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 70d16f06-3d0a-454f-a1dd-87ce77ed8582 / tapfb4faeb1-e8 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.916 12 DEBUG ceilometer.compute.pollsters [-] 70d16f06-3d0a-454f-a1dd-87ce77ed8582/network.outgoing.bytes volume: 6178 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.918 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for eefbc638-e91a-432b-84c3-2448060d96db / tap893a6c54-db inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.918 12 DEBUG ceilometer.compute.pollsters [-] eefbc638-e91a-432b-84c3-2448060d96db/network.outgoing.bytes volume: 21416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '185a7925-a293-4af5-b53b-572f14aa5611', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6178, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-00000027-70d16f06-3d0a-454f-a1dd-87ce77ed8582-tapfb4faeb1-e8', 'timestamp': '2026-01-26T08:52:03.914901', 'resource_metadata': {'display_name': 'tempest-server-test-2078215968', 'name': 'tapfb4faeb1-e8', 'instance_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c5:58:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb4faeb1-e8'}, 'message_id': '493aa0c8-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.576885207, 'message_signature': '482f3370451c7245b880220555236b9af2fab23b28a7766d11e5d962ac0cd62b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 21416, 'user_id': 
'988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-00000028-eefbc638-e91a-432b-84c3-2448060d96db-tap893a6c54-db', 'timestamp': '2026-01-26T08:52:03.914901', 'resource_metadata': {'display_name': 'tempest-server-test-1408106849', 'name': 'tap893a6c54-db', 'instance_id': 'eefbc638-e91a-432b-84c3-2448060d96db', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:21:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap893a6c54-db'}, 'message_id': '493ae402-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.578873814, 'message_signature': '3bf40d142b3a1acba3ab1fd5c0371c8504fcec486df332d9bcce53a725a2cf7d'}]}, 'timestamp': '2026-01-26 08:52:03.918647', '_unique_id': '97c05fb37ddf42b2819ddbb2a4f2080d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.919 12 DEBUG ceilometer.compute.pollsters [-] 70d16f06-3d0a-454f-a1dd-87ce77ed8582/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 DEBUG ceilometer.compute.pollsters [-] eefbc638-e91a-432b-84c3-2448060d96db/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cd8af103-2466-45d2-965f-bebeb93a30d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-00000027-70d16f06-3d0a-454f-a1dd-87ce77ed8582-tapfb4faeb1-e8', 'timestamp': '2026-01-26T08:52:03.919853', 'resource_metadata': {'display_name': 'tempest-server-test-2078215968', 'name': 'tapfb4faeb1-e8', 'instance_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c5:58:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb4faeb1-e8'}, 'message_id': '493b1bca-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.576885207, 'message_signature': 'e2c8b7899abeeb41953178d98bb3923f8ed0f35c14949c4f089dbf3323368184'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-00000028-eefbc638-e91a-432b-84c3-2448060d96db-tap893a6c54-db', 'timestamp': '2026-01-26T08:52:03.919853', 'resource_metadata': {'display_name': 'tempest-server-test-1408106849', 'name': 'tap893a6c54-db', 'instance_id': 'eefbc638-e91a-432b-84c3-2448060d96db', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:21:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap893a6c54-db'}, 'message_id': '493b24b2-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.578873814, 'message_signature': '844c89c107355ba84f14321304b4c7feac045ba65eb8c4d3df1ef61b03becf48'}]}, 'timestamp': '2026-01-26 08:52:03.920298', '_unique_id': '0fd63b57d1b44d4f956dbeb41f5b7155'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.920 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.921 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.921 12 DEBUG ceilometer.compute.pollsters [-] 70d16f06-3d0a-454f-a1dd-87ce77ed8582/network.incoming.bytes volume: 7316 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.921 12 DEBUG ceilometer.compute.pollsters [-] eefbc638-e91a-432b-84c3-2448060d96db/network.incoming.bytes volume: 22264 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '876e2bc3-6b92-404b-9b30-cf86b512a838', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7316, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-00000027-70d16f06-3d0a-454f-a1dd-87ce77ed8582-tapfb4faeb1-e8', 'timestamp': '2026-01-26T08:52:03.921433', 'resource_metadata': {'display_name': 'tempest-server-test-2078215968', 'name': 'tapfb4faeb1-e8', 'instance_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c5:58:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb4faeb1-e8'}, 'message_id': '493b59c8-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.576885207, 'message_signature': '00201b18e25f14025615344cb3f95cc08952bb30b781191bbc992d3d96b4dceb'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 22264, 'user_id': 
'988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-00000028-eefbc638-e91a-432b-84c3-2448060d96db-tap893a6c54-db', 'timestamp': '2026-01-26T08:52:03.921433', 'resource_metadata': {'display_name': 'tempest-server-test-1408106849', 'name': 'tap893a6c54-db', 'instance_id': 'eefbc638-e91a-432b-84c3-2448060d96db', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:21:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap893a6c54-db'}, 'message_id': '493b6242-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.578873814, 'message_signature': 'c50afe79fb3585678f5b123586d5d1ce6ff1a96e93bfa9af748b0912d2238409'}]}, 'timestamp': '2026-01-26 08:52:03.921874', '_unique_id': 'e64dee2cea884daab76c377492fcf790'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.922 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 DEBUG ceilometer.compute.pollsters [-] 70d16f06-3d0a-454f-a1dd-87ce77ed8582/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 DEBUG ceilometer.compute.pollsters [-] eefbc638-e91a-432b-84c3-2448060d96db/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7a2c29c0-464f-4068-bf08-6b001f0ed996', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-00000027-70d16f06-3d0a-454f-a1dd-87ce77ed8582-tapfb4faeb1-e8', 'timestamp': '2026-01-26T08:52:03.922998', 'resource_metadata': {'display_name': 'tempest-server-test-2078215968', 'name': 'tapfb4faeb1-e8', 'instance_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c5:58:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb4faeb1-e8'}, 'message_id': '493b9762-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.576885207, 'message_signature': '4799298916a7408a4234c01dac9c7851cb537d3acc970d7a6f696eaac44c50d2'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-00000028-eefbc638-e91a-432b-84c3-2448060d96db-tap893a6c54-db', 'timestamp': '2026-01-26T08:52:03.922998', 'resource_metadata': {'display_name': 'tempest-server-test-1408106849', 'name': 'tap893a6c54-db', 'instance_id': 'eefbc638-e91a-432b-84c3-2448060d96db', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:21:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap893a6c54-db'}, 'message_id': '493ba0d6-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.578873814, 'message_signature': 'ce089b613d7a9cab7c997d2b2e20635e65d151458f70a284a5b8925fdccba73a'}]}, 'timestamp': '2026-01-26 08:52:03.923478', '_unique_id': '14cb6f1f8ab64bcab196947eb56989e2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.923 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.924 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.924 12 DEBUG ceilometer.compute.pollsters [-] 70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk.device.write.requests volume: 322 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.924 12 DEBUG ceilometer.compute.pollsters [-] 70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.925 12 DEBUG ceilometer.compute.pollsters [-] eefbc638-e91a-432b-84c3-2448060d96db/disk.device.write.requests volume: 319 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.925 12 DEBUG ceilometer.compute.pollsters [-] eefbc638-e91a-432b-84c3-2448060d96db/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f5ba9ee-c830-4c95-97f9-6755590c393f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 322, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582-vda', 'timestamp': '2026-01-26T08:52:03.924676', 'resource_metadata': {'display_name': 'tempest-server-test-2078215968', 'name': 'instance-00000027', 'instance_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '493bd876-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.416348805, 'message_signature': '7eed03eee585c54d2dd3adef0408ed7ea522070be4e8664eaf90a016f416664f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 
'resource_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582-sda', 'timestamp': '2026-01-26T08:52:03.924676', 'resource_metadata': {'display_name': 'tempest-server-test-2078215968', 'name': 'instance-00000027', 'instance_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '493be00a-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.416348805, 'message_signature': '525c22e50016e0be6b1170af2dc9f64b490afc2763bfa15faa1ae57d73ea899b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 319, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'eefbc638-e91a-432b-84c3-2448060d96db-vda', 'timestamp': '2026-01-26T08:52:03.924676', 'resource_metadata': {'display_name': 'tempest-server-test-1408106849', 'name': 'instance-00000028', 'instance_id': 'eefbc638-e91a-432b-84c3-2448060d96db', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '493be8ca-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.460205493, 'message_signature': '3bdb6a9564e1a9a8bf8cecf14e334d078cca429da05ab90d94d0a762681c9092'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'eefbc638-e91a-432b-84c3-2448060d96db-sda', 'timestamp': '2026-01-26T08:52:03.924676', 'resource_metadata': {'display_name': 'tempest-server-test-1408106849', 'name': 'instance-00000028', 'instance_id': 'eefbc638-e91a-432b-84c3-2448060d96db', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '493bf130-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.460205493, 'message_signature': 'b3c8fe23e2d6a0f308b5fca1736c2df692f0cb53c35e2ee70b516144f4dc2184'}]}, 'timestamp': '2026-01-26 08:52:03.925524', '_unique_id': '2fd458e2925a4b61879a5b15c6c70932'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 DEBUG ceilometer.compute.pollsters [-] 70d16f06-3d0a-454f-a1dd-87ce77ed8582/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.926 12 DEBUG ceilometer.compute.pollsters [-] eefbc638-e91a-432b-84c3-2448060d96db/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9ec0c1c0-44ee-4a10-b85f-be4bb86918af', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-00000027-70d16f06-3d0a-454f-a1dd-87ce77ed8582-tapfb4faeb1-e8', 'timestamp': '2026-01-26T08:52:03.926687', 'resource_metadata': {'display_name': 'tempest-server-test-2078215968', 'name': 'tapfb4faeb1-e8', 'instance_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c5:58:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb4faeb1-e8'}, 'message_id': '493c2772-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.576885207, 'message_signature': 'd5b142e2ce4ceceb54cc607c81271246234d514135b71e1a6b2ba22aabad61f3'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-00000028-eefbc638-e91a-432b-84c3-2448060d96db-tap893a6c54-db', 'timestamp': '2026-01-26T08:52:03.926687', 'resource_metadata': {'display_name': 'tempest-server-test-1408106849', 'name': 'tap893a6c54-db', 'instance_id': 'eefbc638-e91a-432b-84c3-2448060d96db', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:21:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap893a6c54-db'}, 'message_id': '493c3352-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.578873814, 'message_signature': 'cca99480212a5c905acdc19bd7b634a863903e4d69375f35ff4cfea71b403bba'}]}, 'timestamp': '2026-01-26 08:52:03.927256', '_unique_id': '894d94bea85b41c4ac281a0b4b48ea31'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.927 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.928 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.928 12 DEBUG ceilometer.compute.pollsters [-] 70d16f06-3d0a-454f-a1dd-87ce77ed8582/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.928 12 DEBUG ceilometer.compute.pollsters [-] eefbc638-e91a-432b-84c3-2448060d96db/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7368b6fa-660d-488f-a4a8-4eeac8fea370', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-00000027-70d16f06-3d0a-454f-a1dd-87ce77ed8582-tapfb4faeb1-e8', 'timestamp': '2026-01-26T08:52:03.928414', 'resource_metadata': {'display_name': 'tempest-server-test-2078215968', 'name': 'tapfb4faeb1-e8', 'instance_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c5:58:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb4faeb1-e8'}, 'message_id': '493c6a20-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.576885207, 'message_signature': '4e6a39ce5dfb6924d19586e54748a41645e21eeb018c413ba6c6e87e573450a5'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-00000028-eefbc638-e91a-432b-84c3-2448060d96db-tap893a6c54-db', 'timestamp': '2026-01-26T08:52:03.928414', 'resource_metadata': {'display_name': 'tempest-server-test-1408106849', 'name': 'tap893a6c54-db', 'instance_id': 'eefbc638-e91a-432b-84c3-2448060d96db', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:21:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap893a6c54-db'}, 'message_id': '493c722c-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.578873814, 'message_signature': '38265cda08a9a8195731aef4c0578de79abca4de0333e01d584653e06fe466b2'}]}, 'timestamp': '2026-01-26 08:52:03.928835', '_unique_id': '7e85ec18645b4529951fdb5b2d5cde54'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.929 12 DEBUG ceilometer.compute.pollsters [-] 70d16f06-3d0a-454f-a1dd-87ce77ed8582/cpu volume: 10830000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 DEBUG ceilometer.compute.pollsters [-] eefbc638-e91a-432b-84c3-2448060d96db/cpu volume: 11000000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4e41e364-3b1e-4a6b-a82c-3be5e07f8404', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10830000000, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582', 'timestamp': '2026-01-26T08:52:03.929924', 'resource_metadata': {'display_name': 'tempest-server-test-2078215968', 'name': 'instance-00000027', 'instance_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '493ca508-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.533874183, 'message_signature': '4eb30227c5bd9a1a4b7499c604a3ae7fd754f97b0ae9e19b64e3f13bf94e5309'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11000000000, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'eefbc638-e91a-432b-84c3-2448060d96db', 
'timestamp': '2026-01-26T08:52:03.929924', 'resource_metadata': {'display_name': 'tempest-server-test-1408106849', 'name': 'instance-00000028', 'instance_id': 'eefbc638-e91a-432b-84c3-2448060d96db', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '493cae5e-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.547452787, 'message_signature': 'd1ced60b4d482159be191bf8f8b171e43503339899e7d3959f768352d8bf2db8'}]}, 'timestamp': '2026-01-26 08:52:03.930378', '_unique_id': '63740d832f2c42b79ad2d54bd38b10e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.930 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.931 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.931 12 DEBUG ceilometer.compute.pollsters [-] 70d16f06-3d0a-454f-a1dd-87ce77ed8582/network.outgoing.packets volume: 50 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.931 12 DEBUG ceilometer.compute.pollsters [-] eefbc638-e91a-432b-84c3-2448060d96db/network.outgoing.packets volume: 152 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3cde027d-0ee6-47b6-931c-2cf7db72169f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 50, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-00000027-70d16f06-3d0a-454f-a1dd-87ce77ed8582-tapfb4faeb1-e8', 'timestamp': '2026-01-26T08:52:03.931509', 'resource_metadata': {'display_name': 'tempest-server-test-2078215968', 'name': 'tapfb4faeb1-e8', 'instance_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c5:58:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb4faeb1-e8'}, 'message_id': '493ce482-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.576885207, 'message_signature': 'dc80964dec05840d2dd1de56d2dcfbba56730251a2ed2d0cc61cb286a6ab0193'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 152, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-00000028-eefbc638-e91a-432b-84c3-2448060d96db-tap893a6c54-db', 'timestamp': '2026-01-26T08:52:03.931509', 'resource_metadata': {'display_name': 'tempest-server-test-1408106849', 'name': 'tap893a6c54-db', 'instance_id': 'eefbc638-e91a-432b-84c3-2448060d96db', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:21:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap893a6c54-db'}, 'message_id': '493cedd8-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.578873814, 'message_signature': 'f42d6752e77fc1c2505fe29620d7d7861c991d2c48532ef64affb041d4fca7ea'}]}, 'timestamp': '2026-01-26 08:52:03.932034', '_unique_id': 'e908c3c1c6e048029195834e1e1417ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.932 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.933 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.933 12 DEBUG ceilometer.compute.pollsters [-] 70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk.device.read.latency volume: 223618381 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.933 12 DEBUG ceilometer.compute.pollsters [-] 70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk.device.read.latency volume: 22961937 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.933 12 DEBUG ceilometer.compute.pollsters [-] eefbc638-e91a-432b-84c3-2448060d96db/disk.device.read.latency volume: 268607102 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.933 12 DEBUG ceilometer.compute.pollsters [-] eefbc638-e91a-432b-84c3-2448060d96db/disk.device.read.latency volume: 26593897 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8724baf8-86d8-4e09-aebc-c415f2ba7bea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 223618381, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582-vda', 'timestamp': '2026-01-26T08:52:03.933237', 'resource_metadata': {'display_name': 'tempest-server-test-2078215968', 'name': 'instance-00000027', 'instance_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '493d2780-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.416348805, 'message_signature': '82501830d7cafd01e858a25a0251a4ef6e0cd7de38d376764632c5768657f1b4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22961937, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582-sda', 'timestamp': '2026-01-26T08:52:03.933237', 'resource_metadata': {'display_name': 'tempest-server-test-2078215968', 'name': 'instance-00000027', 'instance_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '493d2f32-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.416348805, 'message_signature': '14b4ddfca0601529147807f46fbe948165ab594e340d400eb1e7bbac88230aef'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 268607102, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'eefbc638-e91a-432b-84c3-2448060d96db-vda', 'timestamp': '2026-01-26T08:52:03.933237', 'resource_metadata': {'display_name': 'tempest-server-test-1408106849', 'name': 'instance-00000028', 'instance_id': 'eefbc638-e91a-432b-84c3-2448060d96db', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '493d36bc-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.460205493, 'message_signature': '743c856c007b649d9ac8dfc44bd267d7b6e9f6e8c0fdae1a3eca07dec383c222'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 26593897, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'eefbc638-e91a-432b-84c3-2448060d96db-sda', 'timestamp': '2026-01-26T08:52:03.933237', 'resource_metadata': {'display_name': 'tempest-server-test-1408106849', 'name': 'instance-00000028', 'instance_id': 'eefbc638-e91a-432b-84c3-2448060d96db', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '493d3e14-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.460205493, 'message_signature': 'b1b2a9548bdbc31cf29cb59742eb257ac556b4a02c79e5a4a299e4127557f92f'}]}, 'timestamp': '2026-01-26 08:52:03.934045', '_unique_id': '1cf7dfa995234fbc930c14b495228b1b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.934 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.935 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.935 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.935 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-2078215968>, <NovaLikeServer: tempest-server-test-1408106849>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-2078215968>, <NovaLikeServer: tempest-server-test-1408106849>]
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.935 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.935 12 DEBUG ceilometer.compute.pollsters [-] 70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.935 12 DEBUG ceilometer.compute.pollsters [-] 70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.935 12 DEBUG ceilometer.compute.pollsters [-] eefbc638-e91a-432b-84c3-2448060d96db/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 DEBUG ceilometer.compute.pollsters [-] eefbc638-e91a-432b-84c3-2448060d96db/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b628deca-e87b-41d7-b09e-cb085ea2a212', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582-vda', 'timestamp': '2026-01-26T08:52:03.935490', 'resource_metadata': {'display_name': 'tempest-server-test-2078215968', 'name': 'instance-00000027', 'instance_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '493d7e60-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.553983611, 'message_signature': '4aaeb3603b824e66f106c9be1f8a5339094e09f3ca7afcdf88ebf26f02008c3a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 
'70d16f06-3d0a-454f-a1dd-87ce77ed8582-sda', 'timestamp': '2026-01-26T08:52:03.935490', 'resource_metadata': {'display_name': 'tempest-server-test-2078215968', 'name': 'instance-00000027', 'instance_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '493d8662-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.553983611, 'message_signature': 'd697cf6217a6cb35617707608c8642367bf09c9727ec81f33d0e3c9586bfc5d3'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'eefbc638-e91a-432b-84c3-2448060d96db-vda', 'timestamp': '2026-01-26T08:52:03.935490', 'resource_metadata': {'display_name': 'tempest-server-test-1408106849', 'name': 'instance-00000028', 'instance_id': 'eefbc638-e91a-432b-84c3-2448060d96db', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': 
'13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '493d8dd8-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.563721226, 'message_signature': '12f9ecd11912f7052e3d317e992ed647e08e3709d29867d587cca3ade06dd7ab'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'eefbc638-e91a-432b-84c3-2448060d96db-sda', 'timestamp': '2026-01-26T08:52:03.935490', 'resource_metadata': {'display_name': 'tempest-server-test-1408106849', 'name': 'instance-00000028', 'instance_id': 'eefbc638-e91a-432b-84c3-2448060d96db', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '493d963e-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.563721226, 'message_signature': '47a49101f566a799e64157720937329909b3d1f1ba3e657429a9761bc758a3db'}]}, 'timestamp': '2026-01-26 08:52:03.936339', '_unique_id': '97695e6e9098460a8071fd01b31970b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.936 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.937 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.937 12 DEBUG ceilometer.compute.pollsters [-] 70d16f06-3d0a-454f-a1dd-87ce77ed8582/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.937 12 DEBUG ceilometer.compute.pollsters [-] eefbc638-e91a-432b-84c3-2448060d96db/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2ae3f673-d173-4688-ba4b-56dc84c5bbe3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-00000027-70d16f06-3d0a-454f-a1dd-87ce77ed8582-tapfb4faeb1-e8', 'timestamp': '2026-01-26T08:52:03.937581', 'resource_metadata': {'display_name': 'tempest-server-test-2078215968', 'name': 'tapfb4faeb1-e8', 'instance_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c5:58:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb4faeb1-e8'}, 'message_id': '493dd04a-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.576885207, 'message_signature': '30f34cf304358a1b61fd5555813be41218ebfb7f1b3c87f2be4ab60cb26960e4'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-00000028-eefbc638-e91a-432b-84c3-2448060d96db-tap893a6c54-db', 'timestamp': '2026-01-26T08:52:03.937581', 'resource_metadata': {'display_name': 'tempest-server-test-1408106849', 'name': 'tap893a6c54-db', 'instance_id': 'eefbc638-e91a-432b-84c3-2448060d96db', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:21:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap893a6c54-db'}, 'message_id': '493dd842-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.578873814, 'message_signature': '05d02b8088c7617052dcf0f0097a288b87baa88cbfc8629c35b2ec2b34776fa5'}]}, 'timestamp': '2026-01-26 08:52:03.937999', '_unique_id': '30c2c2328a1b4717afd3b5c4470b2463'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.938 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.939 12 DEBUG ceilometer.compute.pollsters [-] 70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.939 12 DEBUG ceilometer.compute.pollsters [-] 70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.939 12 DEBUG ceilometer.compute.pollsters [-] eefbc638-e91a-432b-84c3-2448060d96db/disk.device.usage volume: 30081024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.939 12 DEBUG ceilometer.compute.pollsters [-] eefbc638-e91a-432b-84c3-2448060d96db/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4ce331af-c470-4cde-8a06-5b3438f93d12', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582-vda', 'timestamp': '2026-01-26T08:52:03.939063', 'resource_metadata': {'display_name': 'tempest-server-test-2078215968', 'name': 'instance-00000027', 'instance_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '493e0b78-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.553983611, 'message_signature': '69b172486f1c5a8f26b2aaec19a7053b36ee193cd4a33ad762ade4f6229f9b62'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 
'70d16f06-3d0a-454f-a1dd-87ce77ed8582-sda', 'timestamp': '2026-01-26T08:52:03.939063', 'resource_metadata': {'display_name': 'tempest-server-test-2078215968', 'name': 'instance-00000027', 'instance_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '493e1334-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.553983611, 'message_signature': '85de908a62b0a890b6c31614e170b25faf374b0f7925cf6b37d450a226773b04'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30081024, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'eefbc638-e91a-432b-84c3-2448060d96db-vda', 'timestamp': '2026-01-26T08:52:03.939063', 'resource_metadata': {'display_name': 'tempest-server-test-1408106849', 'name': 'instance-00000028', 'instance_id': 'eefbc638-e91a-432b-84c3-2448060d96db', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': 
'13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '493e1abe-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.563721226, 'message_signature': 'b709271331ddea92406606509a825bea1400747780c43215651e46ced720bed1'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'eefbc638-e91a-432b-84c3-2448060d96db-sda', 'timestamp': '2026-01-26T08:52:03.939063', 'resource_metadata': {'display_name': 'tempest-server-test-1408106849', 'name': 'instance-00000028', 'instance_id': 'eefbc638-e91a-432b-84c3-2448060d96db', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '493e21f8-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.563721226, 'message_signature': '0727a1a1cd8fb6fa994203565d8a46b5935eb98c68fec93dcf8558d0e6bddf96'}]}, 'timestamp': '2026-01-26 08:52:03.939878', '_unique_id': 'a81b016fdbeb46de96c7464d4be919c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.940 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.941 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.941 12 DEBUG ceilometer.compute.pollsters [-] 70d16f06-3d0a-454f-a1dd-87ce77ed8582/network.incoming.packets volume: 42 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.941 12 DEBUG ceilometer.compute.pollsters [-] eefbc638-e91a-432b-84c3-2448060d96db/network.incoming.packets volume: 122 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f722cd86-3996-4646-af4c-cb18bce07e61', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 42, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-00000027-70d16f06-3d0a-454f-a1dd-87ce77ed8582-tapfb4faeb1-e8', 'timestamp': '2026-01-26T08:52:03.941110', 'resource_metadata': {'display_name': 'tempest-server-test-2078215968', 'name': 'tapfb4faeb1-e8', 'instance_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c5:58:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb4faeb1-e8'}, 'message_id': '493e5ae2-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.576885207, 'message_signature': '9b67de062d7b03142f91b58089b77208b9861cf922137fdc33f4e715c0c85611'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 122, 'user_id': 
'988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-00000028-eefbc638-e91a-432b-84c3-2448060d96db-tap893a6c54-db', 'timestamp': '2026-01-26T08:52:03.941110', 'resource_metadata': {'display_name': 'tempest-server-test-1408106849', 'name': 'tap893a6c54-db', 'instance_id': 'eefbc638-e91a-432b-84c3-2448060d96db', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:21:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap893a6c54-db'}, 'message_id': '493e644c-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.578873814, 'message_signature': '239f00ff9e78f214913f798dc29f8801b3a3ba7c072ff7705d9ac2cde092894a'}]}, 'timestamp': '2026-01-26 08:52:03.941610', '_unique_id': '5cfc21d423ff4109b8302bbd6ed74ce4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.942 12 DEBUG ceilometer.compute.pollsters [-] 70d16f06-3d0a-454f-a1dd-87ce77ed8582/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 DEBUG ceilometer.compute.pollsters [-] eefbc638-e91a-432b-84c3-2448060d96db/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '234adf88-6f63-4176-b2e4-6e653889297e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-00000027-70d16f06-3d0a-454f-a1dd-87ce77ed8582-tapfb4faeb1-e8', 'timestamp': '2026-01-26T08:52:03.942741', 'resource_metadata': {'display_name': 'tempest-server-test-2078215968', 'name': 'tapfb4faeb1-e8', 'instance_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c5:58:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb4faeb1-e8'}, 'message_id': '493e9a98-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.576885207, 'message_signature': '266fc3c09c66b68d9d371a9b3ac007053627528f9eed7da477cb5ba0e8d4a62c'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-00000028-eefbc638-e91a-432b-84c3-2448060d96db-tap893a6c54-db', 'timestamp': '2026-01-26T08:52:03.942741', 'resource_metadata': {'display_name': 'tempest-server-test-1408106849', 'name': 'tap893a6c54-db', 'instance_id': 'eefbc638-e91a-432b-84c3-2448060d96db', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:21:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap893a6c54-db'}, 'message_id': '493ea5ba-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3864.578873814, 'message_signature': '20740b10ae4a48aa2c43d645f5be4c3f10ec25da7004fa862f54d65a6fe1dfe5'}]}, 'timestamp': '2026-01-26 08:52:03.943291', '_unique_id': 'c749714159fa441fa0f601384eae5d48'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:52:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:52:03.943 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:52:04 compute-1 nova_compute[183083]: 2026-01-26 08:52:04.374 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Updating instance_info_cache with network_info: [{"id": "fb4faeb1-e827-462b-8f21-36892b978052", "address": "fa:16:3e:c5:58:7b", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb4faeb1-e8", "ovs_interfaceid": "fb4faeb1-e827-462b-8f21-36892b978052", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:52:04 compute-1 nova_compute[183083]: 2026-01-26 08:52:04.388 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Releasing lock "refresh_cache-70d16f06-3d0a-454f-a1dd-87ce77ed8582" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:52:04 compute-1 nova_compute[183083]: 2026-01-26 08:52:04.389 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 26 08:52:04 compute-1 nova_compute[183083]: 2026-01-26 08:52:04.389 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:52:04 compute-1 nova_compute[183083]: 2026-01-26 08:52:04.390 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:52:04 compute-1 nova_compute[183083]: 2026-01-26 08:52:04.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:52:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:52:05.304 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:52:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:52:05.304 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:52:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:52:05.305 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:52:05 compute-1 nova_compute[183083]: 2026-01-26 08:52:05.641 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:52:05 compute-1 nova_compute[183083]: 2026-01-26 08:52:05.745 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:52:06 compute-1 nova_compute[183083]: 2026-01-26 08:52:06.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:52:06 compute-1 nova_compute[183083]: 2026-01-26 08:52:06.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:52:06 compute-1 nova_compute[183083]: 2026-01-26 08:52:06.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 08:52:06 compute-1 nova_compute[183083]: 2026-01-26 08:52:06.953 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:52:06 compute-1 nova_compute[183083]: 2026-01-26 08:52:06.983 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:52:06 compute-1 nova_compute[183083]: 2026-01-26 08:52:06.983 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:52:06 compute-1 nova_compute[183083]: 2026-01-26 08:52:06.984 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:52:06 compute-1 nova_compute[183083]: 2026-01-26 08:52:06.984 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 08:52:07 compute-1 nova_compute[183083]: 2026-01-26 08:52:07.070 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:52:07 compute-1 nova_compute[183083]: 2026-01-26 08:52:07.127 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:52:07 compute-1 nova_compute[183083]: 2026-01-26 08:52:07.129 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:52:07 compute-1 nova_compute[183083]: 2026-01-26 08:52:07.215 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70d16f06-3d0a-454f-a1dd-87ce77ed8582/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:52:07 compute-1 nova_compute[183083]: 2026-01-26 08:52:07.224 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eefbc638-e91a-432b-84c3-2448060d96db/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:52:07 compute-1 nova_compute[183083]: 2026-01-26 08:52:07.276 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eefbc638-e91a-432b-84c3-2448060d96db/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:52:07 compute-1 nova_compute[183083]: 2026-01-26 08:52:07.278 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eefbc638-e91a-432b-84c3-2448060d96db/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:52:07 compute-1 nova_compute[183083]: 2026-01-26 08:52:07.354 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eefbc638-e91a-432b-84c3-2448060d96db/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:52:07 compute-1 nova_compute[183083]: 2026-01-26 08:52:07.564 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:52:07 compute-1 nova_compute[183083]: 2026-01-26 08:52:07.566 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13452MB free_disk=113.03469467163086GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 08:52:07 compute-1 nova_compute[183083]: 2026-01-26 08:52:07.567 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:52:07 compute-1 nova_compute[183083]: 2026-01-26 08:52:07.567 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:52:07 compute-1 nova_compute[183083]: 2026-01-26 08:52:07.720 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Instance 70d16f06-3d0a-454f-a1dd-87ce77ed8582 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 08:52:07 compute-1 nova_compute[183083]: 2026-01-26 08:52:07.721 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Instance eefbc638-e91a-432b-84c3-2448060d96db actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 08:52:07 compute-1 nova_compute[183083]: 2026-01-26 08:52:07.722 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 08:52:07 compute-1 nova_compute[183083]: 2026-01-26 08:52:07.722 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=768MB phys_disk=119GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 08:52:07 compute-1 nova_compute[183083]: 2026-01-26 08:52:07.855 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:52:07 compute-1 nova_compute[183083]: 2026-01-26 08:52:07.873 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:52:07 compute-1 nova_compute[183083]: 2026-01-26 08:52:07.894 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 08:52:07 compute-1 nova_compute[183083]: 2026-01-26 08:52:07.895 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.327s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:52:08 compute-1 nova_compute[183083]: 2026-01-26 08:52:08.895 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:52:08 compute-1 nova_compute[183083]: 2026-01-26 08:52:08.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:52:10 compute-1 nova_compute[183083]: 2026-01-26 08:52:10.698 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:52:10 compute-1 nova_compute[183083]: 2026-01-26 08:52:10.746 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:52:13 compute-1 podman[216814]: 2026-01-26 08:52:13.828432969 +0000 UTC m=+0.081706797 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 26 08:52:15 compute-1 nova_compute[183083]: 2026-01-26 08:52:15.751 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 08:52:15 compute-1 nova_compute[183083]: 2026-01-26 08:52:15.753 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:52:15 compute-1 nova_compute[183083]: 2026-01-26 08:52:15.753 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5006 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 08:52:15 compute-1 nova_compute[183083]: 2026-01-26 08:52:15.754 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 08:52:15 compute-1 nova_compute[183083]: 2026-01-26 08:52:15.754 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 08:52:15 compute-1 nova_compute[183083]: 2026-01-26 08:52:15.757 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:52:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:52:16.324 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:52:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:52:16.326 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 08:52:16 compute-1 nova_compute[183083]: 2026-01-26 08:52:16.325 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:52:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:52:17.328 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:52:20 compute-1 nova_compute[183083]: 2026-01-26 08:52:20.802 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:52:25 compute-1 nova_compute[183083]: 2026-01-26 08:52:25.805 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:52:28 compute-1 podman[216839]: 2026-01-26 08:52:28.833717465 +0000 UTC m=+0.096211647 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-type=git, config_id=openstack_network_exporter, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, name=ubi9-minimal, build-date=2025-08-20T13:12:41)
Jan 26 08:52:28 compute-1 podman[216838]: 2026-01-26 08:52:28.867269102 +0000 UTC m=+0.129378713 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Jan 26 08:52:30 compute-1 nova_compute[183083]: 2026-01-26 08:52:30.807 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 08:52:30 compute-1 nova_compute[183083]: 2026-01-26 08:52:30.808 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:52:30 compute-1 nova_compute[183083]: 2026-01-26 08:52:30.809 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 08:52:30 compute-1 nova_compute[183083]: 2026-01-26 08:52:30.809 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 08:52:30 compute-1 nova_compute[183083]: 2026-01-26 08:52:30.810 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 08:52:30 compute-1 nova_compute[183083]: 2026-01-26 08:52:30.812 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:52:34 compute-1 podman[216883]: 2026-01-26 08:52:34.831172776 +0000 UTC m=+0.076285264 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Jan 26 08:52:34 compute-1 podman[216884]: 2026-01-26 08:52:34.83840113 +0000 UTC m=+0.080228265 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 08:52:34 compute-1 podman[216882]: 2026-01-26 08:52:34.879095299 +0000 UTC m=+0.130516245 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 26 08:52:35 compute-1 nova_compute[183083]: 2026-01-26 08:52:35.810 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:52:35 compute-1 nova_compute[183083]: 2026-01-26 08:52:35.812 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:52:40 compute-1 nova_compute[183083]: 2026-01-26 08:52:40.812 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:52:44 compute-1 podman[216949]: 2026-01-26 08:52:44.808050408 +0000 UTC m=+0.063642008 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 26 08:52:45 compute-1 nova_compute[183083]: 2026-01-26 08:52:45.818 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 08:52:45 compute-1 nova_compute[183083]: 2026-01-26 08:52:45.819 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 08:52:45 compute-1 nova_compute[183083]: 2026-01-26 08:52:45.820 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 08:52:45 compute-1 nova_compute[183083]: 2026-01-26 08:52:45.820 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 08:52:45 compute-1 nova_compute[183083]: 2026-01-26 08:52:45.860 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:52:45 compute-1 nova_compute[183083]: 2026-01-26 08:52:45.861 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 08:52:49 compute-1 sshd-session[216973]: Invalid user sol from 2.57.122.238 port 43926
Jan 26 08:52:49 compute-1 sshd-session[216973]: Connection closed by invalid user sol 2.57.122.238 port 43926 [preauth]
Jan 26 08:52:50 compute-1 nova_compute[183083]: 2026-01-26 08:52:50.862 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 08:52:50 compute-1 nova_compute[183083]: 2026-01-26 08:52:50.900 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 08:52:50 compute-1 nova_compute[183083]: 2026-01-26 08:52:50.901 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5040 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 08:52:50 compute-1 nova_compute[183083]: 2026-01-26 08:52:50.901 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 08:52:50 compute-1 nova_compute[183083]: 2026-01-26 08:52:50.902 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 08:52:50 compute-1 nova_compute[183083]: 2026-01-26 08:52:50.903 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:52:50 compute-1 sshd-session[216975]: Connection closed by authenticating user root 159.223.236.81 port 55522 [preauth]
Jan 26 08:52:55 compute-1 nova_compute[183083]: 2026-01-26 08:52:55.903 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.012 183087 DEBUG oslo_concurrency.lockutils [None req-00d3dc45-d933-4e03-8c83-01e183a5bfd4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "eefbc638-e91a-432b-84c3-2448060d96db" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.013 183087 DEBUG oslo_concurrency.lockutils [None req-00d3dc45-d933-4e03-8c83-01e183a5bfd4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "eefbc638-e91a-432b-84c3-2448060d96db" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.013 183087 DEBUG oslo_concurrency.lockutils [None req-00d3dc45-d933-4e03-8c83-01e183a5bfd4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "eefbc638-e91a-432b-84c3-2448060d96db-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.013 183087 DEBUG oslo_concurrency.lockutils [None req-00d3dc45-d933-4e03-8c83-01e183a5bfd4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "eefbc638-e91a-432b-84c3-2448060d96db-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.013 183087 DEBUG oslo_concurrency.lockutils [None req-00d3dc45-d933-4e03-8c83-01e183a5bfd4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "eefbc638-e91a-432b-84c3-2448060d96db-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.015 183087 INFO nova.compute.manager [None req-00d3dc45-d933-4e03-8c83-01e183a5bfd4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Terminating instance
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.016 183087 DEBUG nova.compute.manager [None req-00d3dc45-d933-4e03-8c83-01e183a5bfd4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:52:58 compute-1 kernel: tap893a6c54-db (unregistering): left promiscuous mode
Jan 26 08:52:58 compute-1 NetworkManager[55451]: <info>  [1769417578.0503] device (tap893a6c54-db): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 08:52:58 compute-1 ovn_controller[95352]: 2026-01-26T08:52:58Z|00200|binding|INFO|Releasing lport 893a6c54-db8b-4ec4-93c3-99b76af58008 from this chassis (sb_readonly=0)
Jan 26 08:52:58 compute-1 ovn_controller[95352]: 2026-01-26T08:52:58Z|00201|binding|INFO|Setting lport 893a6c54-db8b-4ec4-93c3-99b76af58008 down in Southbound
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.109 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:52:58 compute-1 ovn_controller[95352]: 2026-01-26T08:52:58Z|00202|binding|INFO|Removing iface tap893a6c54-db ovn-installed in OVS
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.112 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.122 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:52:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:52:58.132 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:31:21:03 10.100.0.11'], port_security=['fa:16:3e:31:21:03 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'eefbc638-e91a-432b-84c3-2448060d96db', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ce3cd186-bdaf-40d4-a276-e9139fe3dfec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cf9da6cb-556c-4ee2-9313-239643d15cb1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.235'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=97b37269-bb63-498a-80e2-0f1154aea97c, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=893a6c54-db8b-4ec4-93c3-99b76af58008) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:52:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:52:58.134 104632 INFO neutron.agent.ovn.metadata.agent [-] Port 893a6c54-db8b-4ec4-93c3-99b76af58008 in datapath ce3cd186-bdaf-40d4-a276-e9139fe3dfec unbound from our chassis
Jan 26 08:52:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:52:58.135 104632 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ce3cd186-bdaf-40d4-a276-e9139fe3dfec
Jan 26 08:52:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:52:58.165 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[655ac095-e05b-4961-9fde-465cc0b25ca8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:52:58 compute-1 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000028.scope: Deactivated successfully.
Jan 26 08:52:58 compute-1 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000028.scope: Consumed 16.954s CPU time.
Jan 26 08:52:58 compute-1 systemd-machined[154360]: Machine qemu-11-instance-00000028 terminated.
Jan 26 08:52:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:52:58.203 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[f32d7b07-2f5f-461d-b8a3-dac5bed85cbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:52:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:52:58.206 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[97923d2e-137c-4a9f-ac4e-27ee63f8ffb9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.239 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:52:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:52:58.238 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[3b884766-e32f-45f0-acb4-0aad0aff73bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.244 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:52:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:52:58.258 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[4bdc165c-e886-4c90-bb93-a5408ace7074]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapce3cd186-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:90:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 8, 'rx_bytes': 700, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 8, 'rx_bytes': 700, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 379418, 'reachable_time': 30482, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217006, 'error': None, 'target': 'ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:52:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:52:58.279 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[b863994e-942b-4b1c-b2cf-5de28aced26d]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapce3cd186-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 379434, 'tstamp': 379434}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217014, 'error': None, 'target': 'ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapce3cd186-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 379439, 'tstamp': 379439}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217014, 'error': None, 'target': 'ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:52:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:52:58.280 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapce3cd186-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.281 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.287 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:52:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:52:58.287 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapce3cd186-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:52:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:52:58.287 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:52:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:52:58.288 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapce3cd186-b0, col_values=(('external_ids', {'iface-id': '597f7eff-a379-4a5d-bffc-0e294ae89e61'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:52:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:52:58.288 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.291 183087 INFO nova.virt.libvirt.driver [-] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Instance destroyed successfully.
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.292 183087 DEBUG nova.objects.instance [None req-00d3dc45-d933-4e03-8c83-01e183a5bfd4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lazy-loading 'resources' on Instance uuid eefbc638-e91a-432b-84c3-2448060d96db obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.313 183087 DEBUG nova.virt.libvirt.vif [None req-00d3dc45-d933-4e03-8c83-01e183a5bfd4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T08:51:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1408106849',display_name='tempest-server-test-1408106849',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1408106849',id=40,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDY4V6+HgiJjlBp+sismnH//xIhvaHIyZ883PC4cnliN7XgVilpp1kBxLjUHPWqW57B2esyV5Ub1B2t4CQ7Ool6UJEN5MlISleL6j0D4sAsLrzbr7gNivUWbODLUMAl3Sw==',key_name='tempest-keypair-test-1986375941',keypairs=<?>,launch_index=0,launched_at=2026-01-26T08:51:16Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5d0c78b7cd584e4a90592d8ea01ce4ad',ramdisk_id='',reservation_id='r-tno0birl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-NetworkDefaultSecGroupTest-1876093813',owner_user_name='tempest-NetworkDefaultSecGroupTest-1876093813-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T08:51:16Z,user_data=None,user_id='988ebc31182f4c94813f94306e399a2d',uuid=eefbc638-e91a-432b-84c3-2448060d96db,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "893a6c54-db8b-4ec4-93c3-99b76af58008", "address": "fa:16:3e:31:21:03", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap893a6c54-db", "ovs_interfaceid": "893a6c54-db8b-4ec4-93c3-99b76af58008", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.313 183087 DEBUG nova.network.os_vif_util [None req-00d3dc45-d933-4e03-8c83-01e183a5bfd4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Converting VIF {"id": "893a6c54-db8b-4ec4-93c3-99b76af58008", "address": "fa:16:3e:31:21:03", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap893a6c54-db", "ovs_interfaceid": "893a6c54-db8b-4ec4-93c3-99b76af58008", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.314 183087 DEBUG nova.network.os_vif_util [None req-00d3dc45-d933-4e03-8c83-01e183a5bfd4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:31:21:03,bridge_name='br-int',has_traffic_filtering=True,id=893a6c54-db8b-4ec4-93c3-99b76af58008,network=Network(ce3cd186-bdaf-40d4-a276-e9139fe3dfec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap893a6c54-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.314 183087 DEBUG os_vif [None req-00d3dc45-d933-4e03-8c83-01e183a5bfd4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:31:21:03,bridge_name='br-int',has_traffic_filtering=True,id=893a6c54-db8b-4ec4-93c3-99b76af58008,network=Network(ce3cd186-bdaf-40d4-a276-e9139fe3dfec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap893a6c54-db') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.315 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.316 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap893a6c54-db, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.317 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.318 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.320 183087 INFO os_vif [None req-00d3dc45-d933-4e03-8c83-01e183a5bfd4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:31:21:03,bridge_name='br-int',has_traffic_filtering=True,id=893a6c54-db8b-4ec4-93c3-99b76af58008,network=Network(ce3cd186-bdaf-40d4-a276-e9139fe3dfec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap893a6c54-db')
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.321 183087 INFO nova.virt.libvirt.driver [None req-00d3dc45-d933-4e03-8c83-01e183a5bfd4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Deleting instance files /var/lib/nova/instances/eefbc638-e91a-432b-84c3-2448060d96db_del
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.321 183087 INFO nova.virt.libvirt.driver [None req-00d3dc45-d933-4e03-8c83-01e183a5bfd4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Deletion of /var/lib/nova/instances/eefbc638-e91a-432b-84c3-2448060d96db_del complete
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.597 183087 INFO nova.compute.manager [None req-00d3dc45-d933-4e03-8c83-01e183a5bfd4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Took 0.58 seconds to destroy the instance on the hypervisor.
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.598 183087 DEBUG oslo.service.loopingcall [None req-00d3dc45-d933-4e03-8c83-01e183a5bfd4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.598 183087 DEBUG nova.compute.manager [-] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.599 183087 DEBUG nova.network.neutron [-] [instance: eefbc638-e91a-432b-84c3-2448060d96db] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.697 183087 DEBUG nova.compute.manager [req-113cc3f9-47ee-4c5c-a9ac-a06fadedc814 req-df704c2a-d8e5-4d91-a280-9652dc92aa2d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Received event network-vif-unplugged-893a6c54-db8b-4ec4-93c3-99b76af58008 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.697 183087 DEBUG oslo_concurrency.lockutils [req-113cc3f9-47ee-4c5c-a9ac-a06fadedc814 req-df704c2a-d8e5-4d91-a280-9652dc92aa2d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "eefbc638-e91a-432b-84c3-2448060d96db-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.698 183087 DEBUG oslo_concurrency.lockutils [req-113cc3f9-47ee-4c5c-a9ac-a06fadedc814 req-df704c2a-d8e5-4d91-a280-9652dc92aa2d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "eefbc638-e91a-432b-84c3-2448060d96db-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.698 183087 DEBUG oslo_concurrency.lockutils [req-113cc3f9-47ee-4c5c-a9ac-a06fadedc814 req-df704c2a-d8e5-4d91-a280-9652dc92aa2d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "eefbc638-e91a-432b-84c3-2448060d96db-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.698 183087 DEBUG nova.compute.manager [req-113cc3f9-47ee-4c5c-a9ac-a06fadedc814 req-df704c2a-d8e5-4d91-a280-9652dc92aa2d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] No waiting events found dispatching network-vif-unplugged-893a6c54-db8b-4ec4-93c3-99b76af58008 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:52:58 compute-1 nova_compute[183083]: 2026-01-26 08:52:58.698 183087 DEBUG nova.compute.manager [req-113cc3f9-47ee-4c5c-a9ac-a06fadedc814 req-df704c2a-d8e5-4d91-a280-9652dc92aa2d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Received event network-vif-unplugged-893a6c54-db8b-4ec4-93c3-99b76af58008 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 26 08:52:59 compute-1 podman[217021]: 2026-01-26 08:52:59.840769438 +0000 UTC m=+0.089921809 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, managed_by=edpm_ansible, config_id=openstack_network_exporter, distribution-scope=public, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9)
Jan 26 08:52:59 compute-1 podman[217020]: 2026-01-26 08:52:59.864038465 +0000 UTC m=+0.114067691 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 08:53:00 compute-1 nova_compute[183083]: 2026-01-26 08:53:00.270 183087 DEBUG nova.network.neutron [-] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:53:00 compute-1 nova_compute[183083]: 2026-01-26 08:53:00.290 183087 INFO nova.compute.manager [-] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Took 1.69 seconds to deallocate network for instance.
Jan 26 08:53:00 compute-1 nova_compute[183083]: 2026-01-26 08:53:00.331 183087 DEBUG oslo_concurrency.lockutils [None req-00d3dc45-d933-4e03-8c83-01e183a5bfd4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:53:00 compute-1 nova_compute[183083]: 2026-01-26 08:53:00.331 183087 DEBUG oslo_concurrency.lockutils [None req-00d3dc45-d933-4e03-8c83-01e183a5bfd4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:53:00 compute-1 nova_compute[183083]: 2026-01-26 08:53:00.423 183087 DEBUG nova.compute.provider_tree [None req-00d3dc45-d933-4e03-8c83-01e183a5bfd4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:53:00 compute-1 nova_compute[183083]: 2026-01-26 08:53:00.438 183087 DEBUG nova.scheduler.client.report [None req-00d3dc45-d933-4e03-8c83-01e183a5bfd4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:53:00 compute-1 nova_compute[183083]: 2026-01-26 08:53:00.464 183087 DEBUG oslo_concurrency.lockutils [None req-00d3dc45-d933-4e03-8c83-01e183a5bfd4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:53:00 compute-1 nova_compute[183083]: 2026-01-26 08:53:00.495 183087 INFO nova.scheduler.client.report [None req-00d3dc45-d933-4e03-8c83-01e183a5bfd4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Deleted allocations for instance eefbc638-e91a-432b-84c3-2448060d96db
Jan 26 08:53:00 compute-1 nova_compute[183083]: 2026-01-26 08:53:00.568 183087 DEBUG oslo_concurrency.lockutils [None req-00d3dc45-d933-4e03-8c83-01e183a5bfd4 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "eefbc638-e91a-432b-84c3-2448060d96db" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.555s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:53:00 compute-1 nova_compute[183083]: 2026-01-26 08:53:00.785 183087 DEBUG nova.compute.manager [req-6d442b1f-908c-40f7-9791-ecc6bdaa1f91 req-e8e2397a-451d-471e-9142-fb071422c77f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Received event network-vif-plugged-893a6c54-db8b-4ec4-93c3-99b76af58008 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:53:00 compute-1 nova_compute[183083]: 2026-01-26 08:53:00.785 183087 DEBUG oslo_concurrency.lockutils [req-6d442b1f-908c-40f7-9791-ecc6bdaa1f91 req-e8e2397a-451d-471e-9142-fb071422c77f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "eefbc638-e91a-432b-84c3-2448060d96db-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:53:00 compute-1 nova_compute[183083]: 2026-01-26 08:53:00.786 183087 DEBUG oslo_concurrency.lockutils [req-6d442b1f-908c-40f7-9791-ecc6bdaa1f91 req-e8e2397a-451d-471e-9142-fb071422c77f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "eefbc638-e91a-432b-84c3-2448060d96db-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:53:00 compute-1 nova_compute[183083]: 2026-01-26 08:53:00.786 183087 DEBUG oslo_concurrency.lockutils [req-6d442b1f-908c-40f7-9791-ecc6bdaa1f91 req-e8e2397a-451d-471e-9142-fb071422c77f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "eefbc638-e91a-432b-84c3-2448060d96db-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:53:00 compute-1 nova_compute[183083]: 2026-01-26 08:53:00.786 183087 DEBUG nova.compute.manager [req-6d442b1f-908c-40f7-9791-ecc6bdaa1f91 req-e8e2397a-451d-471e-9142-fb071422c77f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] No waiting events found dispatching network-vif-plugged-893a6c54-db8b-4ec4-93c3-99b76af58008 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:53:00 compute-1 nova_compute[183083]: 2026-01-26 08:53:00.786 183087 WARNING nova.compute.manager [req-6d442b1f-908c-40f7-9791-ecc6bdaa1f91 req-e8e2397a-451d-471e-9142-fb071422c77f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Received unexpected event network-vif-plugged-893a6c54-db8b-4ec4-93c3-99b76af58008 for instance with vm_state deleted and task_state None.
Jan 26 08:53:00 compute-1 nova_compute[183083]: 2026-01-26 08:53:00.786 183087 DEBUG nova.compute.manager [req-6d442b1f-908c-40f7-9791-ecc6bdaa1f91 req-e8e2397a-451d-471e-9142-fb071422c77f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Received event network-vif-deleted-893a6c54-db8b-4ec4-93c3-99b76af58008 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:53:00 compute-1 nova_compute[183083]: 2026-01-26 08:53:00.906 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:01 compute-1 nova_compute[183083]: 2026-01-26 08:53:01.516 183087 DEBUG oslo_concurrency.lockutils [None req-ebd6606d-6248-4e85-82e0-0fdbc6dff6ea 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "70d16f06-3d0a-454f-a1dd-87ce77ed8582" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:53:01 compute-1 nova_compute[183083]: 2026-01-26 08:53:01.517 183087 DEBUG oslo_concurrency.lockutils [None req-ebd6606d-6248-4e85-82e0-0fdbc6dff6ea 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "70d16f06-3d0a-454f-a1dd-87ce77ed8582" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:53:01 compute-1 nova_compute[183083]: 2026-01-26 08:53:01.517 183087 DEBUG oslo_concurrency.lockutils [None req-ebd6606d-6248-4e85-82e0-0fdbc6dff6ea 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "70d16f06-3d0a-454f-a1dd-87ce77ed8582-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:53:01 compute-1 nova_compute[183083]: 2026-01-26 08:53:01.517 183087 DEBUG oslo_concurrency.lockutils [None req-ebd6606d-6248-4e85-82e0-0fdbc6dff6ea 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "70d16f06-3d0a-454f-a1dd-87ce77ed8582-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:53:01 compute-1 nova_compute[183083]: 2026-01-26 08:53:01.517 183087 DEBUG oslo_concurrency.lockutils [None req-ebd6606d-6248-4e85-82e0-0fdbc6dff6ea 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "70d16f06-3d0a-454f-a1dd-87ce77ed8582-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:53:01 compute-1 nova_compute[183083]: 2026-01-26 08:53:01.519 183087 INFO nova.compute.manager [None req-ebd6606d-6248-4e85-82e0-0fdbc6dff6ea 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Terminating instance
Jan 26 08:53:01 compute-1 nova_compute[183083]: 2026-01-26 08:53:01.520 183087 DEBUG nova.compute.manager [None req-ebd6606d-6248-4e85-82e0-0fdbc6dff6ea 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:53:01 compute-1 kernel: tapfb4faeb1-e8 (unregistering): left promiscuous mode
Jan 26 08:53:01 compute-1 NetworkManager[55451]: <info>  [1769417581.5503] device (tapfb4faeb1-e8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 08:53:01 compute-1 nova_compute[183083]: 2026-01-26 08:53:01.557 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:01 compute-1 ovn_controller[95352]: 2026-01-26T08:53:01Z|00203|binding|INFO|Releasing lport fb4faeb1-e827-462b-8f21-36892b978052 from this chassis (sb_readonly=0)
Jan 26 08:53:01 compute-1 ovn_controller[95352]: 2026-01-26T08:53:01Z|00204|binding|INFO|Setting lport fb4faeb1-e827-462b-8f21-36892b978052 down in Southbound
Jan 26 08:53:01 compute-1 ovn_controller[95352]: 2026-01-26T08:53:01Z|00205|binding|INFO|Removing iface tapfb4faeb1-e8 ovn-installed in OVS
Jan 26 08:53:01 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:01.567 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c5:58:7b 10.100.0.6'], port_security=['fa:16:3e:c5:58:7b 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '70d16f06-3d0a-454f-a1dd-87ce77ed8582', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ce3cd186-bdaf-40d4-a276-e9139fe3dfec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cf9da6cb-556c-4ee2-9313-239643d15cb1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.247'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=97b37269-bb63-498a-80e2-0f1154aea97c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=fb4faeb1-e827-462b-8f21-36892b978052) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:53:01 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:01.568 104632 INFO neutron.agent.ovn.metadata.agent [-] Port fb4faeb1-e827-462b-8f21-36892b978052 in datapath ce3cd186-bdaf-40d4-a276-e9139fe3dfec unbound from our chassis
Jan 26 08:53:01 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:01.569 104632 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ce3cd186-bdaf-40d4-a276-e9139fe3dfec, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 26 08:53:01 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:01.570 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[7d98ca02-4216-4bc4-9721-b653af7a4f30]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:01 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:01.571 104632 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec namespace which is not needed anymore
Jan 26 08:53:01 compute-1 nova_compute[183083]: 2026-01-26 08:53:01.586 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:01 compute-1 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000027.scope: Deactivated successfully.
Jan 26 08:53:01 compute-1 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000027.scope: Consumed 17.711s CPU time.
Jan 26 08:53:01 compute-1 systemd-machined[154360]: Machine qemu-10-instance-00000027 terminated.
Jan 26 08:53:01 compute-1 neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec[216320]: [NOTICE]   (216324) : haproxy version is 2.8.14-c23fe91
Jan 26 08:53:01 compute-1 neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec[216320]: [NOTICE]   (216324) : path to executable is /usr/sbin/haproxy
Jan 26 08:53:01 compute-1 neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec[216320]: [WARNING]  (216324) : Exiting Master process...
Jan 26 08:53:01 compute-1 neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec[216320]: [ALERT]    (216324) : Current worker (216326) exited with code 143 (Terminated)
Jan 26 08:53:01 compute-1 neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec[216320]: [WARNING]  (216324) : All workers exited. Exiting... (0)
Jan 26 08:53:01 compute-1 systemd[1]: libpod-2adedd02399f28e01ae990c221c197c77873a6fca76921bc936e6a053ea62ffe.scope: Deactivated successfully.
Jan 26 08:53:01 compute-1 podman[217085]: 2026-01-26 08:53:01.698222866 +0000 UTC m=+0.045310940 container died 2adedd02399f28e01ae990c221c197c77873a6fca76921bc936e6a053ea62ffe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 26 08:53:01 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2adedd02399f28e01ae990c221c197c77873a6fca76921bc936e6a053ea62ffe-userdata-shm.mount: Deactivated successfully.
Jan 26 08:53:01 compute-1 systemd[1]: var-lib-containers-storage-overlay-b18b92d931024c0e41f28df4d171744df1d0481ca524aacd5261335c74f4e9d7-merged.mount: Deactivated successfully.
Jan 26 08:53:01 compute-1 podman[217085]: 2026-01-26 08:53:01.730033444 +0000 UTC m=+0.077121498 container cleanup 2adedd02399f28e01ae990c221c197c77873a6fca76921bc936e6a053ea62ffe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 26 08:53:01 compute-1 systemd[1]: libpod-conmon-2adedd02399f28e01ae990c221c197c77873a6fca76921bc936e6a053ea62ffe.scope: Deactivated successfully.
Jan 26 08:53:01 compute-1 nova_compute[183083]: 2026-01-26 08:53:01.776 183087 INFO nova.virt.libvirt.driver [-] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Instance destroyed successfully.
Jan 26 08:53:01 compute-1 nova_compute[183083]: 2026-01-26 08:53:01.778 183087 DEBUG nova.objects.instance [None req-ebd6606d-6248-4e85-82e0-0fdbc6dff6ea 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lazy-loading 'resources' on Instance uuid 70d16f06-3d0a-454f-a1dd-87ce77ed8582 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:53:01 compute-1 podman[217116]: 2026-01-26 08:53:01.795100181 +0000 UTC m=+0.045550117 container remove 2adedd02399f28e01ae990c221c197c77873a6fca76921bc936e6a053ea62ffe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Jan 26 08:53:01 compute-1 nova_compute[183083]: 2026-01-26 08:53:01.796 183087 DEBUG nova.virt.libvirt.vif [None req-ebd6606d-6248-4e85-82e0-0fdbc6dff6ea 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T08:50:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-2078215968',display_name='tempest-server-test-2078215968',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-2078215968',id=39,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDY4V6+HgiJjlBp+sismnH//xIhvaHIyZ883PC4cnliN7XgVilpp1kBxLjUHPWqW57B2esyV5Ub1B2t4CQ7Ool6UJEN5MlISleL6j0D4sAsLrzbr7gNivUWbODLUMAl3Sw==',key_name='tempest-keypair-test-1986375941',keypairs=<?>,launch_index=0,launched_at=2026-01-26T08:50:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5d0c78b7cd584e4a90592d8ea01ce4ad',ramdisk_id='',reservation_id='r-o7dkh8qr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',ima
ge_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-NetworkDefaultSecGroupTest-1876093813',owner_user_name='tempest-NetworkDefaultSecGroupTest-1876093813-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T08:50:53Z,user_data=None,user_id='988ebc31182f4c94813f94306e399a2d',uuid=70d16f06-3d0a-454f-a1dd-87ce77ed8582,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fb4faeb1-e827-462b-8f21-36892b978052", "address": "fa:16:3e:c5:58:7b", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb4faeb1-e8", "ovs_interfaceid": "fb4faeb1-e827-462b-8f21-36892b978052", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:53:01 compute-1 nova_compute[183083]: 2026-01-26 08:53:01.796 183087 DEBUG nova.network.os_vif_util [None req-ebd6606d-6248-4e85-82e0-0fdbc6dff6ea 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Converting VIF {"id": "fb4faeb1-e827-462b-8f21-36892b978052", "address": "fa:16:3e:c5:58:7b", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb4faeb1-e8", "ovs_interfaceid": "fb4faeb1-e827-462b-8f21-36892b978052", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:53:01 compute-1 nova_compute[183083]: 2026-01-26 08:53:01.797 183087 DEBUG nova.network.os_vif_util [None req-ebd6606d-6248-4e85-82e0-0fdbc6dff6ea 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c5:58:7b,bridge_name='br-int',has_traffic_filtering=True,id=fb4faeb1-e827-462b-8f21-36892b978052,network=Network(ce3cd186-bdaf-40d4-a276-e9139fe3dfec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb4faeb1-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:53:01 compute-1 nova_compute[183083]: 2026-01-26 08:53:01.797 183087 DEBUG os_vif [None req-ebd6606d-6248-4e85-82e0-0fdbc6dff6ea 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c5:58:7b,bridge_name='br-int',has_traffic_filtering=True,id=fb4faeb1-e827-462b-8f21-36892b978052,network=Network(ce3cd186-bdaf-40d4-a276-e9139fe3dfec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb4faeb1-e8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:53:01 compute-1 nova_compute[183083]: 2026-01-26 08:53:01.799 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:01 compute-1 nova_compute[183083]: 2026-01-26 08:53:01.799 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfb4faeb1-e8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:53:01 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:01.799 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[3a67933c-78a9-441e-8298-5bbc11004550]: (4, ('Mon Jan 26 08:53:01 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec (2adedd02399f28e01ae990c221c197c77873a6fca76921bc936e6a053ea62ffe)\n2adedd02399f28e01ae990c221c197c77873a6fca76921bc936e6a053ea62ffe\nMon Jan 26 08:53:01 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec (2adedd02399f28e01ae990c221c197c77873a6fca76921bc936e6a053ea62ffe)\n2adedd02399f28e01ae990c221c197c77873a6fca76921bc936e6a053ea62ffe\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:01 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:01.801 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[71558ef2-d078-45db-8040-2c5d13ea5d1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:01 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:01.802 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapce3cd186-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:53:01 compute-1 nova_compute[183083]: 2026-01-26 08:53:01.866 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:01 compute-1 kernel: tapce3cd186-b0: left promiscuous mode
Jan 26 08:53:01 compute-1 nova_compute[183083]: 2026-01-26 08:53:01.869 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 08:53:01 compute-1 nova_compute[183083]: 2026-01-26 08:53:01.881 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:01 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:01.884 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[46d1ba07-656f-428e-a6a6-0b1591cc578e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:01 compute-1 nova_compute[183083]: 2026-01-26 08:53:01.885 183087 INFO os_vif [None req-ebd6606d-6248-4e85-82e0-0fdbc6dff6ea 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c5:58:7b,bridge_name='br-int',has_traffic_filtering=True,id=fb4faeb1-e827-462b-8f21-36892b978052,network=Network(ce3cd186-bdaf-40d4-a276-e9139fe3dfec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb4faeb1-e8')
Jan 26 08:53:01 compute-1 nova_compute[183083]: 2026-01-26 08:53:01.886 183087 INFO nova.virt.libvirt.driver [None req-ebd6606d-6248-4e85-82e0-0fdbc6dff6ea 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Deleting instance files /var/lib/nova/instances/70d16f06-3d0a-454f-a1dd-87ce77ed8582_del
Jan 26 08:53:01 compute-1 nova_compute[183083]: 2026-01-26 08:53:01.887 183087 INFO nova.virt.libvirt.driver [None req-ebd6606d-6248-4e85-82e0-0fdbc6dff6ea 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Deletion of /var/lib/nova/instances/70d16f06-3d0a-454f-a1dd-87ce77ed8582_del complete
Jan 26 08:53:01 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:01.901 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[57e831c4-ac62-409c-84a5-105e32bdb7b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:01 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:01.902 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[f68c66a1-abcb-4be7-b452-d9158ff74c67]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:01 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:01.915 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[394edb78-5829-4e2f-855b-cd20e9820c53]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 379410, 'reachable_time': 18479, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217147, 'error': None, 'target': 'ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:01 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:01.917 105024 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 26 08:53:01 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:01.917 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[acb9fd0d-31db-4473-be3f-f6a62e498599]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:01 compute-1 systemd[1]: run-netns-ovnmeta\x2dce3cd186\x2dbdaf\x2d40d4\x2da276\x2de9139fe3dfec.mount: Deactivated successfully.
Jan 26 08:53:01 compute-1 nova_compute[183083]: 2026-01-26 08:53:01.948 183087 INFO nova.compute.manager [None req-ebd6606d-6248-4e85-82e0-0fdbc6dff6ea 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Took 0.43 seconds to destroy the instance on the hypervisor.
Jan 26 08:53:01 compute-1 nova_compute[183083]: 2026-01-26 08:53:01.948 183087 DEBUG oslo.service.loopingcall [None req-ebd6606d-6248-4e85-82e0-0fdbc6dff6ea 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 26 08:53:01 compute-1 nova_compute[183083]: 2026-01-26 08:53:01.949 183087 DEBUG nova.compute.manager [-] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:53:01 compute-1 nova_compute[183083]: 2026-01-26 08:53:01.949 183087 DEBUG nova.network.neutron [-] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:53:02 compute-1 nova_compute[183083]: 2026-01-26 08:53:02.965 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:53:02 compute-1 nova_compute[183083]: 2026-01-26 08:53:02.966 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 08:53:03 compute-1 nova_compute[183083]: 2026-01-26 08:53:03.024 183087 DEBUG nova.compute.manager [req-560edc6d-73d6-4f57-8dae-14b291e510bb req-c57bc104-70c5-4203-9483-62700f00c856 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Received event network-vif-unplugged-fb4faeb1-e827-462b-8f21-36892b978052 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:53:03 compute-1 nova_compute[183083]: 2026-01-26 08:53:03.025 183087 DEBUG oslo_concurrency.lockutils [req-560edc6d-73d6-4f57-8dae-14b291e510bb req-c57bc104-70c5-4203-9483-62700f00c856 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "70d16f06-3d0a-454f-a1dd-87ce77ed8582-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:53:03 compute-1 nova_compute[183083]: 2026-01-26 08:53:03.025 183087 DEBUG oslo_concurrency.lockutils [req-560edc6d-73d6-4f57-8dae-14b291e510bb req-c57bc104-70c5-4203-9483-62700f00c856 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "70d16f06-3d0a-454f-a1dd-87ce77ed8582-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:53:03 compute-1 nova_compute[183083]: 2026-01-26 08:53:03.026 183087 DEBUG oslo_concurrency.lockutils [req-560edc6d-73d6-4f57-8dae-14b291e510bb req-c57bc104-70c5-4203-9483-62700f00c856 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "70d16f06-3d0a-454f-a1dd-87ce77ed8582-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:53:03 compute-1 nova_compute[183083]: 2026-01-26 08:53:03.026 183087 DEBUG nova.compute.manager [req-560edc6d-73d6-4f57-8dae-14b291e510bb req-c57bc104-70c5-4203-9483-62700f00c856 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] No waiting events found dispatching network-vif-unplugged-fb4faeb1-e827-462b-8f21-36892b978052 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:53:03 compute-1 nova_compute[183083]: 2026-01-26 08:53:03.027 183087 DEBUG nova.compute.manager [req-560edc6d-73d6-4f57-8dae-14b291e510bb req-c57bc104-70c5-4203-9483-62700f00c856 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Received event network-vif-unplugged-fb4faeb1-e827-462b-8f21-36892b978052 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 26 08:53:03 compute-1 nova_compute[183083]: 2026-01-26 08:53:03.153 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 08:53:03 compute-1 nova_compute[183083]: 2026-01-26 08:53:03.403 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:03 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:03.405 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:53:03 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:03.407 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 08:53:03 compute-1 nova_compute[183083]: 2026-01-26 08:53:03.415 183087 DEBUG nova.network.neutron [-] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:53:03 compute-1 nova_compute[183083]: 2026-01-26 08:53:03.428 183087 INFO nova.compute.manager [-] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Took 1.48 seconds to deallocate network for instance.
Jan 26 08:53:03 compute-1 nova_compute[183083]: 2026-01-26 08:53:03.556 183087 DEBUG oslo_concurrency.lockutils [None req-ebd6606d-6248-4e85-82e0-0fdbc6dff6ea 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:53:03 compute-1 nova_compute[183083]: 2026-01-26 08:53:03.557 183087 DEBUG oslo_concurrency.lockutils [None req-ebd6606d-6248-4e85-82e0-0fdbc6dff6ea 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:53:03 compute-1 nova_compute[183083]: 2026-01-26 08:53:03.594 183087 DEBUG nova.compute.provider_tree [None req-ebd6606d-6248-4e85-82e0-0fdbc6dff6ea 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:53:03 compute-1 nova_compute[183083]: 2026-01-26 08:53:03.607 183087 DEBUG nova.scheduler.client.report [None req-ebd6606d-6248-4e85-82e0-0fdbc6dff6ea 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:53:03 compute-1 nova_compute[183083]: 2026-01-26 08:53:03.625 183087 DEBUG oslo_concurrency.lockutils [None req-ebd6606d-6248-4e85-82e0-0fdbc6dff6ea 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.069s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:53:03 compute-1 nova_compute[183083]: 2026-01-26 08:53:03.645 183087 INFO nova.scheduler.client.report [None req-ebd6606d-6248-4e85-82e0-0fdbc6dff6ea 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Deleted allocations for instance 70d16f06-3d0a-454f-a1dd-87ce77ed8582
Jan 26 08:53:03 compute-1 nova_compute[183083]: 2026-01-26 08:53:03.740 183087 DEBUG oslo_concurrency.lockutils [None req-ebd6606d-6248-4e85-82e0-0fdbc6dff6ea 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "70d16f06-3d0a-454f-a1dd-87ce77ed8582" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.223s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:53:04 compute-1 nova_compute[183083]: 2026-01-26 08:53:04.134 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:53:04 compute-1 nova_compute[183083]: 2026-01-26 08:53:04.134 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:53:04 compute-1 nova_compute[183083]: 2026-01-26 08:53:04.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:53:04 compute-1 nova_compute[183083]: 2026-01-26 08:53:04.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:53:05 compute-1 nova_compute[183083]: 2026-01-26 08:53:05.094 183087 DEBUG nova.compute.manager [req-29b53ee5-ccc6-4d1a-b281-0f4357779746 req-3b21e354-02fe-4cd7-b121-2b88274caa13 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Received event network-vif-plugged-fb4faeb1-e827-462b-8f21-36892b978052 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:53:05 compute-1 nova_compute[183083]: 2026-01-26 08:53:05.094 183087 DEBUG oslo_concurrency.lockutils [req-29b53ee5-ccc6-4d1a-b281-0f4357779746 req-3b21e354-02fe-4cd7-b121-2b88274caa13 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "70d16f06-3d0a-454f-a1dd-87ce77ed8582-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:53:05 compute-1 nova_compute[183083]: 2026-01-26 08:53:05.094 183087 DEBUG oslo_concurrency.lockutils [req-29b53ee5-ccc6-4d1a-b281-0f4357779746 req-3b21e354-02fe-4cd7-b121-2b88274caa13 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "70d16f06-3d0a-454f-a1dd-87ce77ed8582-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:53:05 compute-1 nova_compute[183083]: 2026-01-26 08:53:05.094 183087 DEBUG oslo_concurrency.lockutils [req-29b53ee5-ccc6-4d1a-b281-0f4357779746 req-3b21e354-02fe-4cd7-b121-2b88274caa13 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "70d16f06-3d0a-454f-a1dd-87ce77ed8582-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:53:05 compute-1 nova_compute[183083]: 2026-01-26 08:53:05.094 183087 DEBUG nova.compute.manager [req-29b53ee5-ccc6-4d1a-b281-0f4357779746 req-3b21e354-02fe-4cd7-b121-2b88274caa13 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] No waiting events found dispatching network-vif-plugged-fb4faeb1-e827-462b-8f21-36892b978052 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:53:05 compute-1 nova_compute[183083]: 2026-01-26 08:53:05.095 183087 WARNING nova.compute.manager [req-29b53ee5-ccc6-4d1a-b281-0f4357779746 req-3b21e354-02fe-4cd7-b121-2b88274caa13 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Received unexpected event network-vif-plugged-fb4faeb1-e827-462b-8f21-36892b978052 for instance with vm_state deleted and task_state None.
Jan 26 08:53:05 compute-1 nova_compute[183083]: 2026-01-26 08:53:05.095 183087 DEBUG nova.compute.manager [req-29b53ee5-ccc6-4d1a-b281-0f4357779746 req-3b21e354-02fe-4cd7-b121-2b88274caa13 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Received event network-vif-deleted-fb4faeb1-e827-462b-8f21-36892b978052 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:53:05 compute-1 ovn_controller[95352]: 2026-01-26T08:53:05Z|00206|pinctrl|WARN|Dropped 263 log messages in last 61 seconds (most recently, 1 seconds ago) due to excessive rate
Jan 26 08:53:05 compute-1 ovn_controller[95352]: 2026-01-26T08:53:05Z|00207|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 08:53:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:05.304 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:53:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:05.305 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:53:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:05.305 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:53:05 compute-1 podman[217149]: 2026-01-26 08:53:05.8081337 +0000 UTC m=+0.066442857 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 08:53:05 compute-1 podman[217150]: 2026-01-26 08:53:05.825763648 +0000 UTC m=+0.075925275 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 08:53:05 compute-1 podman[217148]: 2026-01-26 08:53:05.890571527 +0000 UTC m=+0.151826787 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 08:53:05 compute-1 nova_compute[183083]: 2026-01-26 08:53:05.907 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:05 compute-1 nova_compute[183083]: 2026-01-26 08:53:05.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:53:06 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:06.409 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:53:06 compute-1 nova_compute[183083]: 2026-01-26 08:53:06.868 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:06 compute-1 nova_compute[183083]: 2026-01-26 08:53:06.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:53:06 compute-1 nova_compute[183083]: 2026-01-26 08:53:06.951 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 08:53:07 compute-1 nova_compute[183083]: 2026-01-26 08:53:07.512 183087 DEBUG oslo_concurrency.lockutils [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "ecac5cdf-f0c1-4352-aed6-69098da46fcd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:53:07 compute-1 nova_compute[183083]: 2026-01-26 08:53:07.513 183087 DEBUG oslo_concurrency.lockutils [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "ecac5cdf-f0c1-4352-aed6-69098da46fcd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:53:07 compute-1 nova_compute[183083]: 2026-01-26 08:53:07.529 183087 DEBUG nova.compute.manager [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:53:07 compute-1 nova_compute[183083]: 2026-01-26 08:53:07.600 183087 DEBUG oslo_concurrency.lockutils [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:53:07 compute-1 nova_compute[183083]: 2026-01-26 08:53:07.601 183087 DEBUG oslo_concurrency.lockutils [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:53:07 compute-1 nova_compute[183083]: 2026-01-26 08:53:07.608 183087 DEBUG nova.virt.hardware [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:53:07 compute-1 nova_compute[183083]: 2026-01-26 08:53:07.609 183087 INFO nova.compute.claims [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:53:07 compute-1 nova_compute[183083]: 2026-01-26 08:53:07.725 183087 DEBUG nova.compute.provider_tree [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:53:07 compute-1 nova_compute[183083]: 2026-01-26 08:53:07.738 183087 DEBUG nova.scheduler.client.report [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:53:07 compute-1 nova_compute[183083]: 2026-01-26 08:53:07.762 183087 DEBUG oslo_concurrency.lockutils [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:53:07 compute-1 nova_compute[183083]: 2026-01-26 08:53:07.762 183087 DEBUG nova.compute.manager [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:53:07 compute-1 nova_compute[183083]: 2026-01-26 08:53:07.814 183087 DEBUG nova.compute.manager [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:53:07 compute-1 nova_compute[183083]: 2026-01-26 08:53:07.815 183087 DEBUG nova.network.neutron [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:53:07 compute-1 nova_compute[183083]: 2026-01-26 08:53:07.834 183087 INFO nova.virt.libvirt.driver [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:53:07 compute-1 nova_compute[183083]: 2026-01-26 08:53:07.854 183087 DEBUG nova.compute.manager [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:53:07 compute-1 nova_compute[183083]: 2026-01-26 08:53:07.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:53:07 compute-1 nova_compute[183083]: 2026-01-26 08:53:07.953 183087 DEBUG nova.compute.manager [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:53:07 compute-1 nova_compute[183083]: 2026-01-26 08:53:07.954 183087 DEBUG nova.virt.libvirt.driver [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:53:07 compute-1 nova_compute[183083]: 2026-01-26 08:53:07.954 183087 INFO nova.virt.libvirt.driver [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Creating image(s)
Jan 26 08:53:07 compute-1 nova_compute[183083]: 2026-01-26 08:53:07.955 183087 DEBUG oslo_concurrency.lockutils [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "/var/lib/nova/instances/ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:53:07 compute-1 nova_compute[183083]: 2026-01-26 08:53:07.955 183087 DEBUG oslo_concurrency.lockutils [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "/var/lib/nova/instances/ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:53:07 compute-1 nova_compute[183083]: 2026-01-26 08:53:07.956 183087 DEBUG oslo_concurrency.lockutils [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "/var/lib/nova/instances/ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:53:07 compute-1 nova_compute[183083]: 2026-01-26 08:53:07.975 183087 DEBUG nova.policy [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:53:07 compute-1 nova_compute[183083]: 2026-01-26 08:53:07.978 183087 DEBUG oslo_concurrency.processutils [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:53:08 compute-1 nova_compute[183083]: 2026-01-26 08:53:08.043 183087 DEBUG oslo_concurrency.processutils [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:53:08 compute-1 nova_compute[183083]: 2026-01-26 08:53:08.044 183087 DEBUG oslo_concurrency.lockutils [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:53:08 compute-1 nova_compute[183083]: 2026-01-26 08:53:08.045 183087 DEBUG oslo_concurrency.lockutils [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:53:08 compute-1 nova_compute[183083]: 2026-01-26 08:53:08.056 183087 DEBUG oslo_concurrency.processutils [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:53:08 compute-1 nova_compute[183083]: 2026-01-26 08:53:08.117 183087 DEBUG oslo_concurrency.processutils [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:53:08 compute-1 nova_compute[183083]: 2026-01-26 08:53:08.118 183087 DEBUG oslo_concurrency.processutils [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:53:08 compute-1 nova_compute[183083]: 2026-01-26 08:53:08.160 183087 DEBUG oslo_concurrency.processutils [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:53:08 compute-1 nova_compute[183083]: 2026-01-26 08:53:08.162 183087 DEBUG oslo_concurrency.lockutils [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:53:08 compute-1 nova_compute[183083]: 2026-01-26 08:53:08.162 183087 DEBUG oslo_concurrency.processutils [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:53:08 compute-1 nova_compute[183083]: 2026-01-26 08:53:08.222 183087 DEBUG oslo_concurrency.processutils [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:53:08 compute-1 nova_compute[183083]: 2026-01-26 08:53:08.224 183087 DEBUG nova.virt.disk.api [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Checking if we can resize image /var/lib/nova/instances/ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 26 08:53:08 compute-1 nova_compute[183083]: 2026-01-26 08:53:08.224 183087 DEBUG oslo_concurrency.processutils [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:53:08 compute-1 nova_compute[183083]: 2026-01-26 08:53:08.284 183087 DEBUG oslo_concurrency.processutils [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:53:08 compute-1 nova_compute[183083]: 2026-01-26 08:53:08.285 183087 DEBUG nova.virt.disk.api [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Cannot resize image /var/lib/nova/instances/ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 26 08:53:08 compute-1 nova_compute[183083]: 2026-01-26 08:53:08.286 183087 DEBUG nova.objects.instance [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lazy-loading 'migration_context' on Instance uuid ecac5cdf-f0c1-4352-aed6-69098da46fcd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:53:08 compute-1 nova_compute[183083]: 2026-01-26 08:53:08.302 183087 DEBUG nova.virt.libvirt.driver [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 26 08:53:08 compute-1 nova_compute[183083]: 2026-01-26 08:53:08.303 183087 DEBUG nova.virt.libvirt.driver [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Ensure instance console log exists: /var/lib/nova/instances/ecac5cdf-f0c1-4352-aed6-69098da46fcd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 26 08:53:08 compute-1 nova_compute[183083]: 2026-01-26 08:53:08.304 183087 DEBUG oslo_concurrency.lockutils [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:53:08 compute-1 nova_compute[183083]: 2026-01-26 08:53:08.305 183087 DEBUG oslo_concurrency.lockutils [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:53:08 compute-1 nova_compute[183083]: 2026-01-26 08:53:08.305 183087 DEBUG oslo_concurrency.lockutils [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:53:08 compute-1 nova_compute[183083]: 2026-01-26 08:53:08.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:53:08 compute-1 nova_compute[183083]: 2026-01-26 08:53:08.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:53:08 compute-1 nova_compute[183083]: 2026-01-26 08:53:08.971 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:53:08 compute-1 nova_compute[183083]: 2026-01-26 08:53:08.972 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:53:08 compute-1 nova_compute[183083]: 2026-01-26 08:53:08.972 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:53:08 compute-1 nova_compute[183083]: 2026-01-26 08:53:08.973 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 08:53:08 compute-1 nova_compute[183083]: 2026-01-26 08:53:08.988 183087 DEBUG nova.network.neutron [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Successfully created port: 3fca56be-1386-4c85-9cee-d17f63594484 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 08:53:09 compute-1 nova_compute[183083]: 2026-01-26 08:53:09.159 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:53:09 compute-1 nova_compute[183083]: 2026-01-26 08:53:09.160 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13765MB free_disk=113.09356307983398GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 08:53:09 compute-1 nova_compute[183083]: 2026-01-26 08:53:09.160 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:53:09 compute-1 nova_compute[183083]: 2026-01-26 08:53:09.160 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:53:09 compute-1 nova_compute[183083]: 2026-01-26 08:53:09.230 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Instance ecac5cdf-f0c1-4352-aed6-69098da46fcd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 08:53:09 compute-1 nova_compute[183083]: 2026-01-26 08:53:09.231 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 08:53:09 compute-1 nova_compute[183083]: 2026-01-26 08:53:09.231 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=640MB phys_disk=119GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 08:53:09 compute-1 nova_compute[183083]: 2026-01-26 08:53:09.274 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:53:09 compute-1 nova_compute[183083]: 2026-01-26 08:53:09.294 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:53:09 compute-1 nova_compute[183083]: 2026-01-26 08:53:09.318 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 08:53:09 compute-1 nova_compute[183083]: 2026-01-26 08:53:09.318 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:53:09 compute-1 nova_compute[183083]: 2026-01-26 08:53:09.756 183087 DEBUG nova.network.neutron [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Successfully updated port: 3fca56be-1386-4c85-9cee-d17f63594484 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:53:09 compute-1 nova_compute[183083]: 2026-01-26 08:53:09.770 183087 DEBUG oslo_concurrency.lockutils [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "refresh_cache-ecac5cdf-f0c1-4352-aed6-69098da46fcd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:53:09 compute-1 nova_compute[183083]: 2026-01-26 08:53:09.770 183087 DEBUG oslo_concurrency.lockutils [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquired lock "refresh_cache-ecac5cdf-f0c1-4352-aed6-69098da46fcd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:53:09 compute-1 nova_compute[183083]: 2026-01-26 08:53:09.770 183087 DEBUG nova.network.neutron [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:53:09 compute-1 nova_compute[183083]: 2026-01-26 08:53:09.841 183087 DEBUG nova.compute.manager [req-a22f4899-8b08-47b5-a183-a1322392a93b req-8d89f7d1-c887-4f94-bc42-384720a2e22f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Received event network-changed-3fca56be-1386-4c85-9cee-d17f63594484 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:53:09 compute-1 nova_compute[183083]: 2026-01-26 08:53:09.842 183087 DEBUG nova.compute.manager [req-a22f4899-8b08-47b5-a183-a1322392a93b req-8d89f7d1-c887-4f94-bc42-384720a2e22f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Refreshing instance network info cache due to event network-changed-3fca56be-1386-4c85-9cee-d17f63594484. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:53:09 compute-1 nova_compute[183083]: 2026-01-26 08:53:09.842 183087 DEBUG oslo_concurrency.lockutils [req-a22f4899-8b08-47b5-a183-a1322392a93b req-8d89f7d1-c887-4f94-bc42-384720a2e22f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-ecac5cdf-f0c1-4352-aed6-69098da46fcd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:53:09 compute-1 nova_compute[183083]: 2026-01-26 08:53:09.914 183087 DEBUG nova.network.neutron [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.680 183087 DEBUG nova.network.neutron [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Updating instance_info_cache with network_info: [{"id": "3fca56be-1386-4c85-9cee-d17f63594484", "address": "fa:16:3e:c2:c2:1f", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fca56be-13", "ovs_interfaceid": "3fca56be-1386-4c85-9cee-d17f63594484", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.701 183087 DEBUG oslo_concurrency.lockutils [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Releasing lock "refresh_cache-ecac5cdf-f0c1-4352-aed6-69098da46fcd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.701 183087 DEBUG nova.compute.manager [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Instance network_info: |[{"id": "3fca56be-1386-4c85-9cee-d17f63594484", "address": "fa:16:3e:c2:c2:1f", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fca56be-13", "ovs_interfaceid": "3fca56be-1386-4c85-9cee-d17f63594484", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.702 183087 DEBUG oslo_concurrency.lockutils [req-a22f4899-8b08-47b5-a183-a1322392a93b req-8d89f7d1-c887-4f94-bc42-384720a2e22f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-ecac5cdf-f0c1-4352-aed6-69098da46fcd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.703 183087 DEBUG nova.network.neutron [req-a22f4899-8b08-47b5-a183-a1322392a93b req-8d89f7d1-c887-4f94-bc42-384720a2e22f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Refreshing network info cache for port 3fca56be-1386-4c85-9cee-d17f63594484 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.708 183087 DEBUG nova.virt.libvirt.driver [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Start _get_guest_xml network_info=[{"id": "3fca56be-1386-4c85-9cee-d17f63594484", "address": "fa:16:3e:c2:c2:1f", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fca56be-13", "ovs_interfaceid": "3fca56be-1386-4c85-9cee-d17f63594484", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T08:43:05Z,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aa88264c487a43b2855a58eb0dd042c9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T08:43:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encryption_options': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.716 183087 WARNING nova.virt.libvirt.driver [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.722 183087 DEBUG nova.virt.libvirt.host [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.723 183087 DEBUG nova.virt.libvirt.host [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.737 183087 DEBUG nova.virt.libvirt.host [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.739 183087 DEBUG nova.virt.libvirt.host [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.739 183087 DEBUG nova.virt.libvirt.driver [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.740 183087 DEBUG nova.virt.hardware [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T08:43:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a7017323-cd35-470d-a718-faa6c6e97277',id=2,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T08:43:05Z,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aa88264c487a43b2855a58eb0dd042c9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T08:43:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.741 183087 DEBUG nova.virt.hardware [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.741 183087 DEBUG nova.virt.hardware [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.742 183087 DEBUG nova.virt.hardware [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.742 183087 DEBUG nova.virt.hardware [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.742 183087 DEBUG nova.virt.hardware [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.743 183087 DEBUG nova.virt.hardware [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.743 183087 DEBUG nova.virt.hardware [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.743 183087 DEBUG nova.virt.hardware [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.744 183087 DEBUG nova.virt.hardware [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.744 183087 DEBUG nova.virt.hardware [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.752 183087 DEBUG nova.virt.libvirt.vif [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:53:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-2131100295',display_name='tempest-server-test-2131100295',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-2131100295',id=41,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDY4V6+HgiJjlBp+sismnH//xIhvaHIyZ883PC4cnliN7XgVilpp1kBxLjUHPWqW57B2esyV5Ub1B2t4CQ7Ool6UJEN5MlISleL6j0D4sAsLrzbr7gNivUWbODLUMAl3Sw==',key_name='tempest-keypair-test-1986375941',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5d0c78b7cd584e4a90592d8ea01ce4ad',ramdisk_id='',reservation_id='r-v0teuk06',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NetworkDefaultSecGroupTest-1876093813',owner_user_name='tempest-NetworkDefaultSecGroupTest-1876093813-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:53:07Z,user_data=None,user_id='988ebc31182f4c94813f94306e399a2d',uuid=ecac5cdf-f0c1-4352-aed6-69098da46fcd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3fca56be-1386-4c85-9cee-d17f63594484", "address": "fa:16:3e:c2:c2:1f", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fca56be-13", "ovs_interfaceid": "3fca56be-1386-4c85-9cee-d17f63594484", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.753 183087 DEBUG nova.network.os_vif_util [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Converting VIF {"id": "3fca56be-1386-4c85-9cee-d17f63594484", "address": "fa:16:3e:c2:c2:1f", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fca56be-13", "ovs_interfaceid": "3fca56be-1386-4c85-9cee-d17f63594484", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.754 183087 DEBUG nova.network.os_vif_util [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c2:c2:1f,bridge_name='br-int',has_traffic_filtering=True,id=3fca56be-1386-4c85-9cee-d17f63594484,network=Network(ce3cd186-bdaf-40d4-a276-e9139fe3dfec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3fca56be-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.755 183087 DEBUG nova.objects.instance [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lazy-loading 'pci_devices' on Instance uuid ecac5cdf-f0c1-4352-aed6-69098da46fcd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.772 183087 DEBUG nova.virt.libvirt.driver [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] End _get_guest_xml xml=<domain type="kvm">
Jan 26 08:53:10 compute-1 nova_compute[183083]:   <uuid>ecac5cdf-f0c1-4352-aed6-69098da46fcd</uuid>
Jan 26 08:53:10 compute-1 nova_compute[183083]:   <name>instance-00000029</name>
Jan 26 08:53:10 compute-1 nova_compute[183083]:   <memory>131072</memory>
Jan 26 08:53:10 compute-1 nova_compute[183083]:   <vcpu>1</vcpu>
Jan 26 08:53:10 compute-1 nova_compute[183083]:   <metadata>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 08:53:10 compute-1 nova_compute[183083]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:       <nova:name>tempest-server-test-2131100295</nova:name>
Jan 26 08:53:10 compute-1 nova_compute[183083]:       <nova:creationTime>2026-01-26 08:53:10</nova:creationTime>
Jan 26 08:53:10 compute-1 nova_compute[183083]:       <nova:flavor name="m1.nano">
Jan 26 08:53:10 compute-1 nova_compute[183083]:         <nova:memory>128</nova:memory>
Jan 26 08:53:10 compute-1 nova_compute[183083]:         <nova:disk>1</nova:disk>
Jan 26 08:53:10 compute-1 nova_compute[183083]:         <nova:swap>0</nova:swap>
Jan 26 08:53:10 compute-1 nova_compute[183083]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 08:53:10 compute-1 nova_compute[183083]:         <nova:vcpus>1</nova:vcpus>
Jan 26 08:53:10 compute-1 nova_compute[183083]:       </nova:flavor>
Jan 26 08:53:10 compute-1 nova_compute[183083]:       <nova:owner>
Jan 26 08:53:10 compute-1 nova_compute[183083]:         <nova:user uuid="988ebc31182f4c94813f94306e399a2d">tempest-NetworkDefaultSecGroupTest-1876093813-project-member</nova:user>
Jan 26 08:53:10 compute-1 nova_compute[183083]:         <nova:project uuid="5d0c78b7cd584e4a90592d8ea01ce4ad">tempest-NetworkDefaultSecGroupTest-1876093813</nova:project>
Jan 26 08:53:10 compute-1 nova_compute[183083]:       </nova:owner>
Jan 26 08:53:10 compute-1 nova_compute[183083]:       <nova:root type="image" uuid="13d1a20a-8003-4f19-aba7-ccbd9eff9b82"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:       <nova:ports>
Jan 26 08:53:10 compute-1 nova_compute[183083]:         <nova:port uuid="3fca56be-1386-4c85-9cee-d17f63594484">
Jan 26 08:53:10 compute-1 nova_compute[183083]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:         </nova:port>
Jan 26 08:53:10 compute-1 nova_compute[183083]:       </nova:ports>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     </nova:instance>
Jan 26 08:53:10 compute-1 nova_compute[183083]:   </metadata>
Jan 26 08:53:10 compute-1 nova_compute[183083]:   <sysinfo type="smbios">
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <system>
Jan 26 08:53:10 compute-1 nova_compute[183083]:       <entry name="manufacturer">RDO</entry>
Jan 26 08:53:10 compute-1 nova_compute[183083]:       <entry name="product">OpenStack Compute</entry>
Jan 26 08:53:10 compute-1 nova_compute[183083]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 08:53:10 compute-1 nova_compute[183083]:       <entry name="serial">ecac5cdf-f0c1-4352-aed6-69098da46fcd</entry>
Jan 26 08:53:10 compute-1 nova_compute[183083]:       <entry name="uuid">ecac5cdf-f0c1-4352-aed6-69098da46fcd</entry>
Jan 26 08:53:10 compute-1 nova_compute[183083]:       <entry name="family">Virtual Machine</entry>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     </system>
Jan 26 08:53:10 compute-1 nova_compute[183083]:   </sysinfo>
Jan 26 08:53:10 compute-1 nova_compute[183083]:   <os>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <boot dev="hd"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <smbios mode="sysinfo"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:   </os>
Jan 26 08:53:10 compute-1 nova_compute[183083]:   <features>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <acpi/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <apic/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <vmcoreinfo/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:   </features>
Jan 26 08:53:10 compute-1 nova_compute[183083]:   <clock offset="utc">
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <timer name="hpet" present="no"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:   </clock>
Jan 26 08:53:10 compute-1 nova_compute[183083]:   <cpu mode="host-model" match="exact">
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:   </cpu>
Jan 26 08:53:10 compute-1 nova_compute[183083]:   <devices>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <disk type="file" device="disk">
Jan 26 08:53:10 compute-1 nova_compute[183083]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:       <target dev="vda" bus="virtio"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     </disk>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <disk type="file" device="cdrom">
Jan 26 08:53:10 compute-1 nova_compute[183083]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk.config"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:       <target dev="sda" bus="sata"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     </disk>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <interface type="ethernet">
Jan 26 08:53:10 compute-1 nova_compute[183083]:       <mac address="fa:16:3e:c2:c2:1f"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:       <mtu size="1342"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:       <target dev="tap3fca56be-13"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     </interface>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <serial type="pty">
Jan 26 08:53:10 compute-1 nova_compute[183083]:       <log file="/var/lib/nova/instances/ecac5cdf-f0c1-4352-aed6-69098da46fcd/console.log" append="off"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     </serial>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <video>
Jan 26 08:53:10 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     </video>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <input type="tablet" bus="usb"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <rng model="virtio">
Jan 26 08:53:10 compute-1 nova_compute[183083]:       <backend model="random">/dev/urandom</backend>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     </rng>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <controller type="usb" index="0"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     <memballoon model="virtio">
Jan 26 08:53:10 compute-1 nova_compute[183083]:       <stats period="10"/>
Jan 26 08:53:10 compute-1 nova_compute[183083]:     </memballoon>
Jan 26 08:53:10 compute-1 nova_compute[183083]:   </devices>
Jan 26 08:53:10 compute-1 nova_compute[183083]: </domain>
Jan 26 08:53:10 compute-1 nova_compute[183083]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.773 183087 DEBUG nova.compute.manager [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Preparing to wait for external event network-vif-plugged-3fca56be-1386-4c85-9cee-d17f63594484 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.773 183087 DEBUG oslo_concurrency.lockutils [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "ecac5cdf-f0c1-4352-aed6-69098da46fcd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.774 183087 DEBUG oslo_concurrency.lockutils [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "ecac5cdf-f0c1-4352-aed6-69098da46fcd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.774 183087 DEBUG oslo_concurrency.lockutils [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "ecac5cdf-f0c1-4352-aed6-69098da46fcd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.775 183087 DEBUG nova.virt.libvirt.vif [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:53:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-2131100295',display_name='tempest-server-test-2131100295',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-2131100295',id=41,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDY4V6+HgiJjlBp+sismnH//xIhvaHIyZ883PC4cnliN7XgVilpp1kBxLjUHPWqW57B2esyV5Ub1B2t4CQ7Ool6UJEN5MlISleL6j0D4sAsLrzbr7gNivUWbODLUMAl3Sw==',key_name='tempest-keypair-test-1986375941',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5d0c78b7cd584e4a90592d8ea01ce4ad',ramdisk_id='',reservation_id='r-v0teuk06',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_
rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NetworkDefaultSecGroupTest-1876093813',owner_user_name='tempest-NetworkDefaultSecGroupTest-1876093813-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:53:07Z,user_data=None,user_id='988ebc31182f4c94813f94306e399a2d',uuid=ecac5cdf-f0c1-4352-aed6-69098da46fcd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3fca56be-1386-4c85-9cee-d17f63594484", "address": "fa:16:3e:c2:c2:1f", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fca56be-13", "ovs_interfaceid": "3fca56be-1386-4c85-9cee-d17f63594484", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.776 183087 DEBUG nova.network.os_vif_util [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Converting VIF {"id": "3fca56be-1386-4c85-9cee-d17f63594484", "address": "fa:16:3e:c2:c2:1f", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fca56be-13", "ovs_interfaceid": "3fca56be-1386-4c85-9cee-d17f63594484", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.777 183087 DEBUG nova.network.os_vif_util [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c2:c2:1f,bridge_name='br-int',has_traffic_filtering=True,id=3fca56be-1386-4c85-9cee-d17f63594484,network=Network(ce3cd186-bdaf-40d4-a276-e9139fe3dfec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3fca56be-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.777 183087 DEBUG os_vif [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c2:c2:1f,bridge_name='br-int',has_traffic_filtering=True,id=3fca56be-1386-4c85-9cee-d17f63594484,network=Network(ce3cd186-bdaf-40d4-a276-e9139fe3dfec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3fca56be-13') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.778 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.779 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.780 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.785 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.786 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3fca56be-13, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.787 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3fca56be-13, col_values=(('external_ids', {'iface-id': '3fca56be-1386-4c85-9cee-d17f63594484', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c2:c2:1f', 'vm-uuid': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.830 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:10 compute-1 NetworkManager[55451]: <info>  [1769417590.8310] manager: (tap3fca56be-13): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.834 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.836 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.837 183087 INFO os_vif [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c2:c2:1f,bridge_name='br-int',has_traffic_filtering=True,id=3fca56be-1386-4c85-9cee-d17f63594484,network=Network(ce3cd186-bdaf-40d4-a276-e9139fe3dfec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3fca56be-13')
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.880 183087 DEBUG nova.virt.libvirt.driver [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.881 183087 DEBUG nova.virt.libvirt.driver [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.881 183087 DEBUG nova.virt.libvirt.driver [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] No VIF found with MAC fa:16:3e:c2:c2:1f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.882 183087 INFO nova.virt.libvirt.driver [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Using config drive
Jan 26 08:53:10 compute-1 nova_compute[183083]: 2026-01-26 08:53:10.911 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:11 compute-1 nova_compute[183083]: 2026-01-26 08:53:11.278 183087 INFO nova.virt.libvirt.driver [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Creating config drive at /var/lib/nova/instances/ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk.config
Jan 26 08:53:11 compute-1 nova_compute[183083]: 2026-01-26 08:53:11.283 183087 DEBUG oslo_concurrency.processutils [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx8vpuzik execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:53:11 compute-1 nova_compute[183083]: 2026-01-26 08:53:11.428 183087 DEBUG oslo_concurrency.processutils [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx8vpuzik" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:53:11 compute-1 kernel: tap3fca56be-13: entered promiscuous mode
Jan 26 08:53:11 compute-1 NetworkManager[55451]: <info>  [1769417591.4944] manager: (tap3fca56be-13): new Tun device (/org/freedesktop/NetworkManager/Devices/76)
Jan 26 08:53:11 compute-1 ovn_controller[95352]: 2026-01-26T08:53:11Z|00208|binding|INFO|Claiming lport 3fca56be-1386-4c85-9cee-d17f63594484 for this chassis.
Jan 26 08:53:11 compute-1 ovn_controller[95352]: 2026-01-26T08:53:11Z|00209|binding|INFO|3fca56be-1386-4c85-9cee-d17f63594484: Claiming fa:16:3e:c2:c2:1f 10.100.0.5
Jan 26 08:53:11 compute-1 nova_compute[183083]: 2026-01-26 08:53:11.494 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:11 compute-1 nova_compute[183083]: 2026-01-26 08:53:11.506 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:11 compute-1 ovn_controller[95352]: 2026-01-26T08:53:11Z|00210|binding|INFO|Setting lport 3fca56be-1386-4c85-9cee-d17f63594484 ovn-installed in OVS
Jan 26 08:53:11 compute-1 nova_compute[183083]: 2026-01-26 08:53:11.507 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:11 compute-1 nova_compute[183083]: 2026-01-26 08:53:11.509 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:11 compute-1 systemd-udevd[217248]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 08:53:11 compute-1 systemd-machined[154360]: New machine qemu-12-instance-00000029.
Jan 26 08:53:11 compute-1 systemd[1]: Started Virtual Machine qemu-12-instance-00000029.
Jan 26 08:53:11 compute-1 NetworkManager[55451]: <info>  [1769417591.5548] device (tap3fca56be-13): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 08:53:11 compute-1 NetworkManager[55451]: <info>  [1769417591.5553] device (tap3fca56be-13): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 08:53:11 compute-1 ovn_controller[95352]: 2026-01-26T08:53:11Z|00211|binding|INFO|Setting lport 3fca56be-1386-4c85-9cee-d17f63594484 up in Southbound
Jan 26 08:53:11 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:11.624 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c2:c2:1f 10.100.0.5'], port_security=['fa:16:3e:c2:c2:1f 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ce3cd186-bdaf-40d4-a276-e9139fe3dfec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a3ba4c82-55c4-4fcf-ba72-7b36eb733cd3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=97b37269-bb63-498a-80e2-0f1154aea97c, chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=3fca56be-1386-4c85-9cee-d17f63594484) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:53:11 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:11.654 104632 INFO neutron.agent.ovn.metadata.agent [-] Port 3fca56be-1386-4c85-9cee-d17f63594484 in datapath ce3cd186-bdaf-40d4-a276-e9139fe3dfec bound to our chassis
Jan 26 08:53:11 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:11.656 104632 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ce3cd186-bdaf-40d4-a276-e9139fe3dfec
Jan 26 08:53:11 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:11.670 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[f18b935a-fca2-4357-bb90-1157051e6af1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:11 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:11.672 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapce3cd186-b1 in ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 26 08:53:11 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:11.674 212483 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapce3cd186-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 26 08:53:11 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:11.675 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[bfa605ff-d82b-433b-8678-44bf13ffe2ff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:11 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:11.681 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[de2fa513-dbfb-437c-83ba-b6bb5eb94503]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:11 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:11.701 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[d34cf271-1f5a-4672-8118-bca54701895b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:11 compute-1 nova_compute[183083]: 2026-01-26 08:53:11.727 183087 DEBUG nova.network.neutron [req-a22f4899-8b08-47b5-a183-a1322392a93b req-8d89f7d1-c887-4f94-bc42-384720a2e22f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Updated VIF entry in instance network info cache for port 3fca56be-1386-4c85-9cee-d17f63594484. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:53:11 compute-1 nova_compute[183083]: 2026-01-26 08:53:11.728 183087 DEBUG nova.network.neutron [req-a22f4899-8b08-47b5-a183-a1322392a93b req-8d89f7d1-c887-4f94-bc42-384720a2e22f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Updating instance_info_cache with network_info: [{"id": "3fca56be-1386-4c85-9cee-d17f63594484", "address": "fa:16:3e:c2:c2:1f", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fca56be-13", "ovs_interfaceid": "3fca56be-1386-4c85-9cee-d17f63594484", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:53:11 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:11.731 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[1e41ff27-7bf7-407f-952f-6306263d475d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:11 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:11.770 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[fbc430cd-cbda-4b25-9b50-bba495ec1684]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:11 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:11.777 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[d3fad7bd-f906-4a72-a064-e7d4b7432bcd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:11 compute-1 NetworkManager[55451]: <info>  [1769417591.7804] manager: (tapce3cd186-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/77)
Jan 26 08:53:11 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:11.835 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[d2312f04-27a0-4498-82c4-81b1bb6b9c6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:11 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:11.839 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[3bc363f3-e942-4a77-880d-7fa824322e16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:11 compute-1 NetworkManager[55451]: <info>  [1769417591.8624] device (tapce3cd186-b0): carrier: link connected
Jan 26 08:53:11 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:11.870 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[58dcbcc4-b4a2-4945-8fdf-4630299bcaa5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:11 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:11.949 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[0c00c8da-a25e-4e30-8295-c7190bdf88cb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapce3cd186-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:90:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 393246, 'reachable_time': 37781, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217288, 'error': None, 'target': 'ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:11 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:11.963 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[72cd17bd-8ca2-48b8-aa99-8185e469aed1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb5:9011'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 393246, 'tstamp': 393246}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217289, 'error': None, 'target': 'ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:11 compute-1 nova_compute[183083]: 2026-01-26 08:53:11.963 183087 DEBUG oslo_concurrency.lockutils [req-a22f4899-8b08-47b5-a183-a1322392a93b req-8d89f7d1-c887-4f94-bc42-384720a2e22f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-ecac5cdf-f0c1-4352-aed6-69098da46fcd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:53:11 compute-1 nova_compute[183083]: 2026-01-26 08:53:11.972 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417591.972128, ecac5cdf-f0c1-4352-aed6-69098da46fcd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:53:11 compute-1 nova_compute[183083]: 2026-01-26 08:53:11.972 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] VM Started (Lifecycle Event)
Jan 26 08:53:11 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:11.981 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[54bd610a-6569-4b74-999f-ea1bfe9db837]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapce3cd186-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:90:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 393246, 'reachable_time': 37781, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217290, 'error': None, 'target': 'ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:12.011 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[f5f0392b-6278-447f-863d-b3049ce8ed37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:12 compute-1 nova_compute[183083]: 2026-01-26 08:53:12.039 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:53:12 compute-1 nova_compute[183083]: 2026-01-26 08:53:12.043 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417591.9740012, ecac5cdf-f0c1-4352-aed6-69098da46fcd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:53:12 compute-1 nova_compute[183083]: 2026-01-26 08:53:12.043 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] VM Paused (Lifecycle Event)
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:12.059 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[ddaf3fdf-532b-4454-93fa-1ac5a0848fe2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:12.061 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapce3cd186-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:12.061 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:12.062 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapce3cd186-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:53:12 compute-1 nova_compute[183083]: 2026-01-26 08:53:12.063 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:12 compute-1 NetworkManager[55451]: <info>  [1769417592.0643] manager: (tapce3cd186-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Jan 26 08:53:12 compute-1 kernel: tapce3cd186-b0: entered promiscuous mode
Jan 26 08:53:12 compute-1 nova_compute[183083]: 2026-01-26 08:53:12.066 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:12.067 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapce3cd186-b0, col_values=(('external_ids', {'iface-id': '597f7eff-a379-4a5d-bffc-0e294ae89e61'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:53:12 compute-1 nova_compute[183083]: 2026-01-26 08:53:12.068 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:12 compute-1 ovn_controller[95352]: 2026-01-26T08:53:12Z|00212|binding|INFO|Releasing lport 597f7eff-a379-4a5d-bffc-0e294ae89e61 from this chassis (sb_readonly=0)
Jan 26 08:53:12 compute-1 nova_compute[183083]: 2026-01-26 08:53:12.083 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:12.084 104632 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ce3cd186-bdaf-40d4-a276-e9139fe3dfec.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ce3cd186-bdaf-40d4-a276-e9139fe3dfec.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:12.085 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[80c199ca-b337-4068-86f0-b950d9e815a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:12.086 104632 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]: global
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]:     log         /dev/log local0 debug
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]:     log-tag     haproxy-metadata-proxy-ce3cd186-bdaf-40d4-a276-e9139fe3dfec
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]:     user        root
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]:     group       root
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]:     maxconn     1024
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]:     pidfile     /var/lib/neutron/external/pids/ce3cd186-bdaf-40d4-a276-e9139fe3dfec.pid.haproxy
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]:     daemon
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]: defaults
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]:     log global
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]:     mode http
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]:     option httplog
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]:     option dontlognull
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]:     option http-server-close
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]:     option forwardfor
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]:     retries                 3
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]:     timeout http-request    30s
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]:     timeout connect         30s
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]:     timeout client          32s
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]:     timeout server          32s
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]:     timeout http-keep-alive 30s
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]: listen listener
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]:     bind 169.254.169.254:80
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]:     http-request add-header X-OVN-Network-ID ce3cd186-bdaf-40d4-a276-e9139fe3dfec
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 26 08:53:12 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:12.086 104632 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec', 'env', 'PROCESS_TAG=haproxy-ce3cd186-bdaf-40d4-a276-e9139fe3dfec', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ce3cd186-bdaf-40d4-a276-e9139fe3dfec.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 26 08:53:12 compute-1 nova_compute[183083]: 2026-01-26 08:53:12.192 183087 DEBUG nova.compute.manager [req-c41f43bb-1dae-4e0a-98e2-cbc007cac971 req-edcfec77-0566-4939-b44a-85e2189b5439 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Received event network-vif-plugged-3fca56be-1386-4c85-9cee-d17f63594484 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:53:12 compute-1 nova_compute[183083]: 2026-01-26 08:53:12.192 183087 DEBUG oslo_concurrency.lockutils [req-c41f43bb-1dae-4e0a-98e2-cbc007cac971 req-edcfec77-0566-4939-b44a-85e2189b5439 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "ecac5cdf-f0c1-4352-aed6-69098da46fcd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:53:12 compute-1 nova_compute[183083]: 2026-01-26 08:53:12.193 183087 DEBUG oslo_concurrency.lockutils [req-c41f43bb-1dae-4e0a-98e2-cbc007cac971 req-edcfec77-0566-4939-b44a-85e2189b5439 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "ecac5cdf-f0c1-4352-aed6-69098da46fcd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:53:12 compute-1 nova_compute[183083]: 2026-01-26 08:53:12.193 183087 DEBUG oslo_concurrency.lockutils [req-c41f43bb-1dae-4e0a-98e2-cbc007cac971 req-edcfec77-0566-4939-b44a-85e2189b5439 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "ecac5cdf-f0c1-4352-aed6-69098da46fcd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:53:12 compute-1 nova_compute[183083]: 2026-01-26 08:53:12.193 183087 DEBUG nova.compute.manager [req-c41f43bb-1dae-4e0a-98e2-cbc007cac971 req-edcfec77-0566-4939-b44a-85e2189b5439 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Processing event network-vif-plugged-3fca56be-1386-4c85-9cee-d17f63594484 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 26 08:53:12 compute-1 nova_compute[183083]: 2026-01-26 08:53:12.194 183087 DEBUG nova.compute.manager [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 26 08:53:12 compute-1 nova_compute[183083]: 2026-01-26 08:53:12.203 183087 DEBUG nova.virt.libvirt.driver [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 26 08:53:12 compute-1 nova_compute[183083]: 2026-01-26 08:53:12.207 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:53:12 compute-1 nova_compute[183083]: 2026-01-26 08:53:12.222 183087 INFO nova.virt.libvirt.driver [-] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Instance spawned successfully.
Jan 26 08:53:12 compute-1 nova_compute[183083]: 2026-01-26 08:53:12.222 183087 DEBUG nova.virt.libvirt.driver [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 26 08:53:12 compute-1 nova_compute[183083]: 2026-01-26 08:53:12.225 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417592.2025936, ecac5cdf-f0c1-4352-aed6-69098da46fcd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:53:12 compute-1 nova_compute[183083]: 2026-01-26 08:53:12.225 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] VM Resumed (Lifecycle Event)
Jan 26 08:53:12 compute-1 nova_compute[183083]: 2026-01-26 08:53:12.345 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:53:12 compute-1 nova_compute[183083]: 2026-01-26 08:53:12.360 183087 DEBUG nova.virt.libvirt.driver [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:53:12 compute-1 nova_compute[183083]: 2026-01-26 08:53:12.360 183087 DEBUG nova.virt.libvirt.driver [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:53:12 compute-1 nova_compute[183083]: 2026-01-26 08:53:12.361 183087 DEBUG nova.virt.libvirt.driver [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:53:12 compute-1 nova_compute[183083]: 2026-01-26 08:53:12.361 183087 DEBUG nova.virt.libvirt.driver [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:53:12 compute-1 nova_compute[183083]: 2026-01-26 08:53:12.361 183087 DEBUG nova.virt.libvirt.driver [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:53:12 compute-1 nova_compute[183083]: 2026-01-26 08:53:12.362 183087 DEBUG nova.virt.libvirt.driver [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:53:12 compute-1 nova_compute[183083]: 2026-01-26 08:53:12.365 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 08:53:12 compute-1 nova_compute[183083]: 2026-01-26 08:53:12.517 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 08:53:12 compute-1 podman[217321]: 2026-01-26 08:53:12.470721078 +0000 UTC m=+0.021960011 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 08:53:12 compute-1 nova_compute[183083]: 2026-01-26 08:53:12.642 183087 INFO nova.compute.manager [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Took 4.69 seconds to spawn the instance on the hypervisor.
Jan 26 08:53:12 compute-1 nova_compute[183083]: 2026-01-26 08:53:12.643 183087 DEBUG nova.compute.manager [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:53:12 compute-1 podman[217321]: 2026-01-26 08:53:12.674601204 +0000 UTC m=+0.225840117 container create b72e0b7eadad308ec7240c1f5ee90e45645d7b7d245e25873689ed168f46e6b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 26 08:53:12 compute-1 nova_compute[183083]: 2026-01-26 08:53:12.737 183087 INFO nova.compute.manager [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Took 5.15 seconds to build instance.
Jan 26 08:53:12 compute-1 systemd[1]: Started libpod-conmon-b72e0b7eadad308ec7240c1f5ee90e45645d7b7d245e25873689ed168f46e6b4.scope.
Jan 26 08:53:12 compute-1 nova_compute[183083]: 2026-01-26 08:53:12.803 183087 DEBUG oslo_concurrency.lockutils [None req-4f60a1fb-39fd-4d58-9ca1-365f17234c1d 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "ecac5cdf-f0c1-4352-aed6-69098da46fcd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.289s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:53:12 compute-1 systemd[1]: Started libcrun container.
Jan 26 08:53:12 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2be71656cc2bd3a1fa2aae6a101e1095114b35e78ac0c31978658c67b61d80a2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 08:53:12 compute-1 podman[217321]: 2026-01-26 08:53:12.930833078 +0000 UTC m=+0.482072011 container init b72e0b7eadad308ec7240c1f5ee90e45645d7b7d245e25873689ed168f46e6b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 26 08:53:12 compute-1 podman[217321]: 2026-01-26 08:53:12.93621216 +0000 UTC m=+0.487451073 container start b72e0b7eadad308ec7240c1f5ee90e45645d7b7d245e25873689ed168f46e6b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 26 08:53:12 compute-1 neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec[217336]: [NOTICE]   (217340) : New worker (217342) forked
Jan 26 08:53:12 compute-1 neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec[217336]: [NOTICE]   (217340) : Loading success.
Jan 26 08:53:13 compute-1 nova_compute[183083]: 2026-01-26 08:53:13.290 183087 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769417578.2888067, eefbc638-e91a-432b-84c3-2448060d96db => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:53:13 compute-1 nova_compute[183083]: 2026-01-26 08:53:13.290 183087 INFO nova.compute.manager [-] [instance: eefbc638-e91a-432b-84c3-2448060d96db] VM Stopped (Lifecycle Event)
Jan 26 08:53:13 compute-1 nova_compute[183083]: 2026-01-26 08:53:13.304 183087 DEBUG nova.compute.manager [None req-c10f9666-df9e-49ce-bcbd-dd87a9be2ccc - - - - - -] [instance: eefbc638-e91a-432b-84c3-2448060d96db] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:53:14 compute-1 nova_compute[183083]: 2026-01-26 08:53:14.287 183087 DEBUG nova.compute.manager [req-bf1d789f-4df7-458d-9f1d-03d97ea337ac req-d632a264-014b-4168-853d-979dc22cc424 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Received event network-vif-plugged-3fca56be-1386-4c85-9cee-d17f63594484 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:53:14 compute-1 nova_compute[183083]: 2026-01-26 08:53:14.287 183087 DEBUG oslo_concurrency.lockutils [req-bf1d789f-4df7-458d-9f1d-03d97ea337ac req-d632a264-014b-4168-853d-979dc22cc424 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "ecac5cdf-f0c1-4352-aed6-69098da46fcd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:53:14 compute-1 nova_compute[183083]: 2026-01-26 08:53:14.287 183087 DEBUG oslo_concurrency.lockutils [req-bf1d789f-4df7-458d-9f1d-03d97ea337ac req-d632a264-014b-4168-853d-979dc22cc424 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "ecac5cdf-f0c1-4352-aed6-69098da46fcd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:53:14 compute-1 nova_compute[183083]: 2026-01-26 08:53:14.288 183087 DEBUG oslo_concurrency.lockutils [req-bf1d789f-4df7-458d-9f1d-03d97ea337ac req-d632a264-014b-4168-853d-979dc22cc424 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "ecac5cdf-f0c1-4352-aed6-69098da46fcd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:53:14 compute-1 nova_compute[183083]: 2026-01-26 08:53:14.288 183087 DEBUG nova.compute.manager [req-bf1d789f-4df7-458d-9f1d-03d97ea337ac req-d632a264-014b-4168-853d-979dc22cc424 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] No waiting events found dispatching network-vif-plugged-3fca56be-1386-4c85-9cee-d17f63594484 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:53:14 compute-1 nova_compute[183083]: 2026-01-26 08:53:14.288 183087 WARNING nova.compute.manager [req-bf1d789f-4df7-458d-9f1d-03d97ea337ac req-d632a264-014b-4168-853d-979dc22cc424 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Received unexpected event network-vif-plugged-3fca56be-1386-4c85-9cee-d17f63594484 for instance with vm_state active and task_state None.
Jan 26 08:53:14 compute-1 nova_compute[183083]: 2026-01-26 08:53:14.463 183087 INFO nova.compute.manager [None req-a3f6b595-c3f6-4c2a-afdc-bff8a17f7fca 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Get console output
Jan 26 08:53:14 compute-1 nova_compute[183083]: 2026-01-26 08:53:14.469 212375 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 26 08:53:15 compute-1 podman[217351]: 2026-01-26 08:53:15.784368012 +0000 UTC m=+0.057034959 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 26 08:53:15 compute-1 nova_compute[183083]: 2026-01-26 08:53:15.875 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:15 compute-1 nova_compute[183083]: 2026-01-26 08:53:15.909 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:16 compute-1 nova_compute[183083]: 2026-01-26 08:53:16.775 183087 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769417581.7736547, 70d16f06-3d0a-454f-a1dd-87ce77ed8582 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:53:16 compute-1 nova_compute[183083]: 2026-01-26 08:53:16.775 183087 INFO nova.compute.manager [-] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] VM Stopped (Lifecycle Event)
Jan 26 08:53:16 compute-1 nova_compute[183083]: 2026-01-26 08:53:16.803 183087 DEBUG nova.compute.manager [None req-319e2ec1-b0ea-48e2-8cde-552cf33240c9 - - - - - -] [instance: 70d16f06-3d0a-454f-a1dd-87ce77ed8582] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:53:19 compute-1 nova_compute[183083]: 2026-01-26 08:53:19.572 183087 INFO nova.compute.manager [None req-5f4f48b8-1a98-4e78-ae01-9e553f2b5bdf 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Get console output
Jan 26 08:53:19 compute-1 nova_compute[183083]: 2026-01-26 08:53:19.579 212375 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 26 08:53:20 compute-1 nova_compute[183083]: 2026-01-26 08:53:20.896 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:20 compute-1 nova_compute[183083]: 2026-01-26 08:53:20.912 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:23 compute-1 ovn_controller[95352]: 2026-01-26T08:53:23Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c2:c2:1f 10.100.0.5
Jan 26 08:53:23 compute-1 ovn_controller[95352]: 2026-01-26T08:53:23Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c2:c2:1f 10.100.0.5
Jan 26 08:53:24 compute-1 nova_compute[183083]: 2026-01-26 08:53:24.708 183087 INFO nova.compute.manager [None req-c6742729-4a27-4e2e-9480-4a8c61fc86ec 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Get console output
Jan 26 08:53:24 compute-1 nova_compute[183083]: 2026-01-26 08:53:24.712 212375 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 26 08:53:25 compute-1 nova_compute[183083]: 2026-01-26 08:53:25.900 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:25 compute-1 nova_compute[183083]: 2026-01-26 08:53:25.915 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:28 compute-1 nova_compute[183083]: 2026-01-26 08:53:28.286 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:30 compute-1 podman[217389]: 2026-01-26 08:53:30.809428235 +0000 UTC m=+0.060235554 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf 
as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 26 08:53:30 compute-1 podman[217388]: 2026-01-26 08:53:30.822258195 +0000 UTC m=+0.073489557 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Jan 26 08:53:30 compute-1 nova_compute[183083]: 2026-01-26 08:53:30.905 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:30 compute-1 nova_compute[183083]: 2026-01-26 08:53:30.917 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.128 183087 DEBUG oslo_concurrency.lockutils [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "bdc88e07-80b0-4781-8f76-fa751b8b7000" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.129 183087 DEBUG oslo_concurrency.lockutils [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "bdc88e07-80b0-4781-8f76-fa751b8b7000" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.142 183087 DEBUG nova.compute.manager [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.222 183087 DEBUG oslo_concurrency.lockutils [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.222 183087 DEBUG oslo_concurrency.lockutils [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.230 183087 DEBUG nova.virt.hardware [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.231 183087 INFO nova.compute.claims [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.354 183087 DEBUG nova.compute.provider_tree [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.367 183087 DEBUG nova.scheduler.client.report [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.392 183087 DEBUG oslo_concurrency.lockutils [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.393 183087 DEBUG nova.compute.manager [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.440 183087 DEBUG nova.compute.manager [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.441 183087 DEBUG nova.network.neutron [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.463 183087 INFO nova.virt.libvirt.driver [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.481 183087 DEBUG nova.compute.manager [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.584 183087 DEBUG nova.policy [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.601 183087 DEBUG nova.compute.manager [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.602 183087 DEBUG nova.virt.libvirt.driver [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.602 183087 INFO nova.virt.libvirt.driver [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Creating image(s)
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.603 183087 DEBUG oslo_concurrency.lockutils [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "/var/lib/nova/instances/bdc88e07-80b0-4781-8f76-fa751b8b7000/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.603 183087 DEBUG oslo_concurrency.lockutils [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "/var/lib/nova/instances/bdc88e07-80b0-4781-8f76-fa751b8b7000/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.604 183087 DEBUG oslo_concurrency.lockutils [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "/var/lib/nova/instances/bdc88e07-80b0-4781-8f76-fa751b8b7000/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.622 183087 DEBUG oslo_concurrency.processutils [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.684 183087 DEBUG oslo_concurrency.processutils [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.685 183087 DEBUG oslo_concurrency.lockutils [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.686 183087 DEBUG oslo_concurrency.lockutils [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.697 183087 DEBUG oslo_concurrency.processutils [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.749 183087 DEBUG oslo_concurrency.processutils [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.750 183087 DEBUG oslo_concurrency.processutils [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/bdc88e07-80b0-4781-8f76-fa751b8b7000/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.784 183087 DEBUG oslo_concurrency.processutils [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/bdc88e07-80b0-4781-8f76-fa751b8b7000/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.785 183087 DEBUG oslo_concurrency.lockutils [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.785 183087 DEBUG oslo_concurrency.processutils [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.849 183087 DEBUG oslo_concurrency.processutils [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.851 183087 DEBUG nova.virt.disk.api [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Checking if we can resize image /var/lib/nova/instances/bdc88e07-80b0-4781-8f76-fa751b8b7000/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.851 183087 DEBUG oslo_concurrency.processutils [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bdc88e07-80b0-4781-8f76-fa751b8b7000/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.907 183087 DEBUG oslo_concurrency.processutils [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bdc88e07-80b0-4781-8f76-fa751b8b7000/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.909 183087 DEBUG nova.virt.disk.api [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Cannot resize image /var/lib/nova/instances/bdc88e07-80b0-4781-8f76-fa751b8b7000/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.909 183087 DEBUG nova.objects.instance [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lazy-loading 'migration_context' on Instance uuid bdc88e07-80b0-4781-8f76-fa751b8b7000 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.929 183087 DEBUG nova.virt.libvirt.driver [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.930 183087 DEBUG nova.virt.libvirt.driver [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Ensure instance console log exists: /var/lib/nova/instances/bdc88e07-80b0-4781-8f76-fa751b8b7000/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.930 183087 DEBUG oslo_concurrency.lockutils [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.931 183087 DEBUG oslo_concurrency.lockutils [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:53:31 compute-1 nova_compute[183083]: 2026-01-26 08:53:31.931 183087 DEBUG oslo_concurrency.lockutils [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:53:32 compute-1 nova_compute[183083]: 2026-01-26 08:53:32.407 183087 DEBUG nova.network.neutron [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Successfully created port: 183afcdd-e728-4c4e-b370-a9d4517b3c30 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 08:53:33 compute-1 nova_compute[183083]: 2026-01-26 08:53:33.380 183087 DEBUG nova.network.neutron [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Successfully updated port: 183afcdd-e728-4c4e-b370-a9d4517b3c30 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:53:33 compute-1 nova_compute[183083]: 2026-01-26 08:53:33.393 183087 DEBUG oslo_concurrency.lockutils [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "refresh_cache-bdc88e07-80b0-4781-8f76-fa751b8b7000" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:53:33 compute-1 nova_compute[183083]: 2026-01-26 08:53:33.394 183087 DEBUG oslo_concurrency.lockutils [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquired lock "refresh_cache-bdc88e07-80b0-4781-8f76-fa751b8b7000" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:53:33 compute-1 nova_compute[183083]: 2026-01-26 08:53:33.394 183087 DEBUG nova.network.neutron [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:53:33 compute-1 nova_compute[183083]: 2026-01-26 08:53:33.587 183087 DEBUG nova.compute.manager [req-4e5d5dd3-3146-4bda-b415-4113cb46b208 req-6ce653b6-782d-484d-a86f-0de6cb55a042 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Received event network-changed-183afcdd-e728-4c4e-b370-a9d4517b3c30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:53:33 compute-1 nova_compute[183083]: 2026-01-26 08:53:33.588 183087 DEBUG nova.compute.manager [req-4e5d5dd3-3146-4bda-b415-4113cb46b208 req-6ce653b6-782d-484d-a86f-0de6cb55a042 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Refreshing instance network info cache due to event network-changed-183afcdd-e728-4c4e-b370-a9d4517b3c30. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:53:33 compute-1 nova_compute[183083]: 2026-01-26 08:53:33.588 183087 DEBUG oslo_concurrency.lockutils [req-4e5d5dd3-3146-4bda-b415-4113cb46b208 req-6ce653b6-782d-484d-a86f-0de6cb55a042 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-bdc88e07-80b0-4781-8f76-fa751b8b7000" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:53:33 compute-1 nova_compute[183083]: 2026-01-26 08:53:33.612 183087 DEBUG nova.network.neutron [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.430 183087 DEBUG nova.network.neutron [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Updating instance_info_cache with network_info: [{"id": "183afcdd-e728-4c4e-b370-a9d4517b3c30", "address": "fa:16:3e:3b:5f:95", "network": {"id": "cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6", "bridge": "br-int", "label": "tempest-test-network--874192606", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap183afcdd-e7", "ovs_interfaceid": "183afcdd-e728-4c4e-b370-a9d4517b3c30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.451 183087 DEBUG oslo_concurrency.lockutils [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Releasing lock "refresh_cache-bdc88e07-80b0-4781-8f76-fa751b8b7000" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.451 183087 DEBUG nova.compute.manager [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Instance network_info: |[{"id": "183afcdd-e728-4c4e-b370-a9d4517b3c30", "address": "fa:16:3e:3b:5f:95", "network": {"id": "cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6", "bridge": "br-int", "label": "tempest-test-network--874192606", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap183afcdd-e7", "ovs_interfaceid": "183afcdd-e728-4c4e-b370-a9d4517b3c30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.452 183087 DEBUG oslo_concurrency.lockutils [req-4e5d5dd3-3146-4bda-b415-4113cb46b208 req-6ce653b6-782d-484d-a86f-0de6cb55a042 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-bdc88e07-80b0-4781-8f76-fa751b8b7000" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.452 183087 DEBUG nova.network.neutron [req-4e5d5dd3-3146-4bda-b415-4113cb46b208 req-6ce653b6-782d-484d-a86f-0de6cb55a042 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Refreshing network info cache for port 183afcdd-e728-4c4e-b370-a9d4517b3c30 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.457 183087 DEBUG nova.virt.libvirt.driver [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Start _get_guest_xml network_info=[{"id": "183afcdd-e728-4c4e-b370-a9d4517b3c30", "address": "fa:16:3e:3b:5f:95", "network": {"id": "cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6", "bridge": "br-int", "label": "tempest-test-network--874192606", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap183afcdd-e7", "ovs_interfaceid": "183afcdd-e728-4c4e-b370-a9d4517b3c30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T08:43:05Z,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aa88264c487a43b2855a58eb0dd042c9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T08:43:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encryption_options': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.464 183087 WARNING nova.virt.libvirt.driver [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.471 183087 DEBUG nova.virt.libvirt.host [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.472 183087 DEBUG nova.virt.libvirt.host [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.484 183087 DEBUG nova.virt.libvirt.host [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.485 183087 DEBUG nova.virt.libvirt.host [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.485 183087 DEBUG nova.virt.libvirt.driver [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.486 183087 DEBUG nova.virt.hardware [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T08:43:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a7017323-cd35-470d-a718-faa6c6e97277',id=2,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T08:43:05Z,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aa88264c487a43b2855a58eb0dd042c9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T08:43:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.487 183087 DEBUG nova.virt.hardware [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.487 183087 DEBUG nova.virt.hardware [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.487 183087 DEBUG nova.virt.hardware [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.488 183087 DEBUG nova.virt.hardware [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.488 183087 DEBUG nova.virt.hardware [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.488 183087 DEBUG nova.virt.hardware [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.489 183087 DEBUG nova.virt.hardware [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.489 183087 DEBUG nova.virt.hardware [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.490 183087 DEBUG nova.virt.hardware [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.490 183087 DEBUG nova.virt.hardware [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.496 183087 DEBUG nova.virt.libvirt.vif [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:53:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-630575445',display_name='tempest-server-test-630575445',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-630575445',id=42,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDY4V6+HgiJjlBp+sismnH//xIhvaHIyZ883PC4cnliN7XgVilpp1kBxLjUHPWqW57B2esyV5Ub1B2t4CQ7Ool6UJEN5MlISleL6j0D4sAsLrzbr7gNivUWbODLUMAl3Sw==',key_name='tempest-keypair-test-1986375941',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5d0c78b7cd584e4a90592d8ea01ce4ad',ramdisk_id='',reservation_id='r-lfxagesn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='vi
rtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NetworkDefaultSecGroupTest-1876093813',owner_user_name='tempest-NetworkDefaultSecGroupTest-1876093813-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:53:31Z,user_data=None,user_id='988ebc31182f4c94813f94306e399a2d',uuid=bdc88e07-80b0-4781-8f76-fa751b8b7000,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "183afcdd-e728-4c4e-b370-a9d4517b3c30", "address": "fa:16:3e:3b:5f:95", "network": {"id": "cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6", "bridge": "br-int", "label": "tempest-test-network--874192606", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap183afcdd-e7", "ovs_interfaceid": "183afcdd-e728-4c4e-b370-a9d4517b3c30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.496 183087 DEBUG nova.network.os_vif_util [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Converting VIF {"id": "183afcdd-e728-4c4e-b370-a9d4517b3c30", "address": "fa:16:3e:3b:5f:95", "network": {"id": "cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6", "bridge": "br-int", "label": "tempest-test-network--874192606", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap183afcdd-e7", "ovs_interfaceid": "183afcdd-e728-4c4e-b370-a9d4517b3c30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.497 183087 DEBUG nova.network.os_vif_util [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3b:5f:95,bridge_name='br-int',has_traffic_filtering=True,id=183afcdd-e728-4c4e-b370-a9d4517b3c30,network=Network(cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap183afcdd-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.498 183087 DEBUG nova.objects.instance [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lazy-loading 'pci_devices' on Instance uuid bdc88e07-80b0-4781-8f76-fa751b8b7000 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.519 183087 DEBUG nova.virt.libvirt.driver [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] End _get_guest_xml xml=<domain type="kvm">
Jan 26 08:53:34 compute-1 nova_compute[183083]:   <uuid>bdc88e07-80b0-4781-8f76-fa751b8b7000</uuid>
Jan 26 08:53:34 compute-1 nova_compute[183083]:   <name>instance-0000002a</name>
Jan 26 08:53:34 compute-1 nova_compute[183083]:   <memory>131072</memory>
Jan 26 08:53:34 compute-1 nova_compute[183083]:   <vcpu>1</vcpu>
Jan 26 08:53:34 compute-1 nova_compute[183083]:   <metadata>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 08:53:34 compute-1 nova_compute[183083]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:       <nova:name>tempest-server-test-630575445</nova:name>
Jan 26 08:53:34 compute-1 nova_compute[183083]:       <nova:creationTime>2026-01-26 08:53:34</nova:creationTime>
Jan 26 08:53:34 compute-1 nova_compute[183083]:       <nova:flavor name="m1.nano">
Jan 26 08:53:34 compute-1 nova_compute[183083]:         <nova:memory>128</nova:memory>
Jan 26 08:53:34 compute-1 nova_compute[183083]:         <nova:disk>1</nova:disk>
Jan 26 08:53:34 compute-1 nova_compute[183083]:         <nova:swap>0</nova:swap>
Jan 26 08:53:34 compute-1 nova_compute[183083]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 08:53:34 compute-1 nova_compute[183083]:         <nova:vcpus>1</nova:vcpus>
Jan 26 08:53:34 compute-1 nova_compute[183083]:       </nova:flavor>
Jan 26 08:53:34 compute-1 nova_compute[183083]:       <nova:owner>
Jan 26 08:53:34 compute-1 nova_compute[183083]:         <nova:user uuid="988ebc31182f4c94813f94306e399a2d">tempest-NetworkDefaultSecGroupTest-1876093813-project-member</nova:user>
Jan 26 08:53:34 compute-1 nova_compute[183083]:         <nova:project uuid="5d0c78b7cd584e4a90592d8ea01ce4ad">tempest-NetworkDefaultSecGroupTest-1876093813</nova:project>
Jan 26 08:53:34 compute-1 nova_compute[183083]:       </nova:owner>
Jan 26 08:53:34 compute-1 nova_compute[183083]:       <nova:root type="image" uuid="13d1a20a-8003-4f19-aba7-ccbd9eff9b82"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:       <nova:ports>
Jan 26 08:53:34 compute-1 nova_compute[183083]:         <nova:port uuid="183afcdd-e728-4c4e-b370-a9d4517b3c30">
Jan 26 08:53:34 compute-1 nova_compute[183083]:           <nova:ip type="fixed" address="10.100.0.23" ipVersion="4"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:         </nova:port>
Jan 26 08:53:34 compute-1 nova_compute[183083]:       </nova:ports>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     </nova:instance>
Jan 26 08:53:34 compute-1 nova_compute[183083]:   </metadata>
Jan 26 08:53:34 compute-1 nova_compute[183083]:   <sysinfo type="smbios">
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <system>
Jan 26 08:53:34 compute-1 nova_compute[183083]:       <entry name="manufacturer">RDO</entry>
Jan 26 08:53:34 compute-1 nova_compute[183083]:       <entry name="product">OpenStack Compute</entry>
Jan 26 08:53:34 compute-1 nova_compute[183083]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 08:53:34 compute-1 nova_compute[183083]:       <entry name="serial">bdc88e07-80b0-4781-8f76-fa751b8b7000</entry>
Jan 26 08:53:34 compute-1 nova_compute[183083]:       <entry name="uuid">bdc88e07-80b0-4781-8f76-fa751b8b7000</entry>
Jan 26 08:53:34 compute-1 nova_compute[183083]:       <entry name="family">Virtual Machine</entry>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     </system>
Jan 26 08:53:34 compute-1 nova_compute[183083]:   </sysinfo>
Jan 26 08:53:34 compute-1 nova_compute[183083]:   <os>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <boot dev="hd"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <smbios mode="sysinfo"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:   </os>
Jan 26 08:53:34 compute-1 nova_compute[183083]:   <features>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <acpi/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <apic/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <vmcoreinfo/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:   </features>
Jan 26 08:53:34 compute-1 nova_compute[183083]:   <clock offset="utc">
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <timer name="hpet" present="no"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:   </clock>
Jan 26 08:53:34 compute-1 nova_compute[183083]:   <cpu mode="host-model" match="exact">
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:   </cpu>
Jan 26 08:53:34 compute-1 nova_compute[183083]:   <devices>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <disk type="file" device="disk">
Jan 26 08:53:34 compute-1 nova_compute[183083]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/bdc88e07-80b0-4781-8f76-fa751b8b7000/disk"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:       <target dev="vda" bus="virtio"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     </disk>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <disk type="file" device="cdrom">
Jan 26 08:53:34 compute-1 nova_compute[183083]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/bdc88e07-80b0-4781-8f76-fa751b8b7000/disk.config"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:       <target dev="sda" bus="sata"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     </disk>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <interface type="ethernet">
Jan 26 08:53:34 compute-1 nova_compute[183083]:       <mac address="fa:16:3e:3b:5f:95"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:       <mtu size="1342"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:       <target dev="tap183afcdd-e7"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     </interface>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <serial type="pty">
Jan 26 08:53:34 compute-1 nova_compute[183083]:       <log file="/var/lib/nova/instances/bdc88e07-80b0-4781-8f76-fa751b8b7000/console.log" append="off"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     </serial>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <video>
Jan 26 08:53:34 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     </video>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <input type="tablet" bus="usb"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <rng model="virtio">
Jan 26 08:53:34 compute-1 nova_compute[183083]:       <backend model="random">/dev/urandom</backend>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     </rng>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <controller type="usb" index="0"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     <memballoon model="virtio">
Jan 26 08:53:34 compute-1 nova_compute[183083]:       <stats period="10"/>
Jan 26 08:53:34 compute-1 nova_compute[183083]:     </memballoon>
Jan 26 08:53:34 compute-1 nova_compute[183083]:   </devices>
Jan 26 08:53:34 compute-1 nova_compute[183083]: </domain>
Jan 26 08:53:34 compute-1 nova_compute[183083]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.520 183087 DEBUG nova.compute.manager [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Preparing to wait for external event network-vif-plugged-183afcdd-e728-4c4e-b370-a9d4517b3c30 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.521 183087 DEBUG oslo_concurrency.lockutils [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "bdc88e07-80b0-4781-8f76-fa751b8b7000-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.521 183087 DEBUG oslo_concurrency.lockutils [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "bdc88e07-80b0-4781-8f76-fa751b8b7000-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.521 183087 DEBUG oslo_concurrency.lockutils [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "bdc88e07-80b0-4781-8f76-fa751b8b7000-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.522 183087 DEBUG nova.virt.libvirt.vif [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:53:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-630575445',display_name='tempest-server-test-630575445',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-630575445',id=42,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDY4V6+HgiJjlBp+sismnH//xIhvaHIyZ883PC4cnliN7XgVilpp1kBxLjUHPWqW57B2esyV5Ub1B2t4CQ7Ool6UJEN5MlISleL6j0D4sAsLrzbr7gNivUWbODLUMAl3Sw==',key_name='tempest-keypair-test-1986375941',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5d0c78b7cd584e4a90592d8ea01ce4ad',ramdisk_id='',reservation_id='r-lfxagesn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng
_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NetworkDefaultSecGroupTest-1876093813',owner_user_name='tempest-NetworkDefaultSecGroupTest-1876093813-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:53:31Z,user_data=None,user_id='988ebc31182f4c94813f94306e399a2d',uuid=bdc88e07-80b0-4781-8f76-fa751b8b7000,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "183afcdd-e728-4c4e-b370-a9d4517b3c30", "address": "fa:16:3e:3b:5f:95", "network": {"id": "cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6", "bridge": "br-int", "label": "tempest-test-network--874192606", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap183afcdd-e7", "ovs_interfaceid": "183afcdd-e728-4c4e-b370-a9d4517b3c30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.522 183087 DEBUG nova.network.os_vif_util [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Converting VIF {"id": "183afcdd-e728-4c4e-b370-a9d4517b3c30", "address": "fa:16:3e:3b:5f:95", "network": {"id": "cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6", "bridge": "br-int", "label": "tempest-test-network--874192606", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap183afcdd-e7", "ovs_interfaceid": "183afcdd-e728-4c4e-b370-a9d4517b3c30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.523 183087 DEBUG nova.network.os_vif_util [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3b:5f:95,bridge_name='br-int',has_traffic_filtering=True,id=183afcdd-e728-4c4e-b370-a9d4517b3c30,network=Network(cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap183afcdd-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.524 183087 DEBUG os_vif [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3b:5f:95,bridge_name='br-int',has_traffic_filtering=True,id=183afcdd-e728-4c4e-b370-a9d4517b3c30,network=Network(cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap183afcdd-e7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.524 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.525 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.525 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.529 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.529 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap183afcdd-e7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.530 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap183afcdd-e7, col_values=(('external_ids', {'iface-id': '183afcdd-e728-4c4e-b370-a9d4517b3c30', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3b:5f:95', 'vm-uuid': 'bdc88e07-80b0-4781-8f76-fa751b8b7000'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.532 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:34 compute-1 NetworkManager[55451]: <info>  [1769417614.5333] manager: (tap183afcdd-e7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/79)
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.534 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.542 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.544 183087 INFO os_vif [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3b:5f:95,bridge_name='br-int',has_traffic_filtering=True,id=183afcdd-e728-4c4e-b370-a9d4517b3c30,network=Network(cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap183afcdd-e7')
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.603 183087 DEBUG nova.virt.libvirt.driver [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.603 183087 DEBUG nova.virt.libvirt.driver [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.603 183087 DEBUG nova.virt.libvirt.driver [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] No VIF found with MAC fa:16:3e:3b:5f:95, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.604 183087 INFO nova.virt.libvirt.driver [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Using config drive
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.935 183087 INFO nova.virt.libvirt.driver [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Creating config drive at /var/lib/nova/instances/bdc88e07-80b0-4781-8f76-fa751b8b7000/disk.config
Jan 26 08:53:34 compute-1 nova_compute[183083]: 2026-01-26 08:53:34.941 183087 DEBUG oslo_concurrency.processutils [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bdc88e07-80b0-4781-8f76-fa751b8b7000/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3_umlfnw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.070 183087 DEBUG oslo_concurrency.processutils [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bdc88e07-80b0-4781-8f76-fa751b8b7000/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3_umlfnw" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:53:35 compute-1 kernel: tap183afcdd-e7: entered promiscuous mode
Jan 26 08:53:35 compute-1 NetworkManager[55451]: <info>  [1769417615.1579] manager: (tap183afcdd-e7): new Tun device (/org/freedesktop/NetworkManager/Devices/80)
Jan 26 08:53:35 compute-1 ovn_controller[95352]: 2026-01-26T08:53:35Z|00213|binding|INFO|Claiming lport 183afcdd-e728-4c4e-b370-a9d4517b3c30 for this chassis.
Jan 26 08:53:35 compute-1 ovn_controller[95352]: 2026-01-26T08:53:35Z|00214|binding|INFO|183afcdd-e728-4c4e-b370-a9d4517b3c30: Claiming fa:16:3e:3b:5f:95 10.100.0.23
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.156 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:35.167 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3b:5f:95 10.100.0.23'], port_security=['fa:16:3e:3b:5f:95 10.100.0.23'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.23/28', 'neutron:device_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a3ba4c82-55c4-4fcf-ba72-7b36eb733cd3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=506c646c-47ce-4c23-8b2d-329f437b8924, chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=183afcdd-e728-4c4e-b370-a9d4517b3c30) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:35.170 104632 INFO neutron.agent.ovn.metadata.agent [-] Port 183afcdd-e728-4c4e-b370-a9d4517b3c30 in datapath cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6 bound to our chassis
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:35.172 104632 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.173 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:35 compute-1 ovn_controller[95352]: 2026-01-26T08:53:35Z|00215|binding|INFO|Setting lport 183afcdd-e728-4c4e-b370-a9d4517b3c30 ovn-installed in OVS
Jan 26 08:53:35 compute-1 ovn_controller[95352]: 2026-01-26T08:53:35Z|00216|binding|INFO|Setting lport 183afcdd-e728-4c4e-b370-a9d4517b3c30 up in Southbound
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.176 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:35.190 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[bc43d6b5-dafb-4550-839c-279b3ad2462a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:35.191 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcdbb53a4-d1 in ovnmeta-cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 26 08:53:35 compute-1 systemd-udevd[217464]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:35.194 212483 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcdbb53a4-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:35.194 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[dc3efbfd-cca2-4fa5-8c3c-71839f190dda]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:35.195 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[6c440949-6fc3-4ae3-9a65-b66308e31c81]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:35 compute-1 systemd-machined[154360]: New machine qemu-13-instance-0000002a.
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:35.207 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[3a9ff25d-f3eb-4da3-a7eb-d1ede1e59e52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:35 compute-1 NetworkManager[55451]: <info>  [1769417615.2102] device (tap183afcdd-e7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 08:53:35 compute-1 NetworkManager[55451]: <info>  [1769417615.2108] device (tap183afcdd-e7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 08:53:35 compute-1 systemd[1]: Started Virtual Machine qemu-13-instance-0000002a.
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:35.232 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[cb15f8f5-f17f-4621-ac7c-7a3adb2ca860]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:35.272 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[b9e8d268-34a2-4942-9580-1c212cc59b4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:35 compute-1 NetworkManager[55451]: <info>  [1769417615.2779] manager: (tapcdbb53a4-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/81)
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:35.276 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[efb5e680-fd2f-44bb-b65f-3a9999948d00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:35 compute-1 systemd-udevd[217468]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:35.314 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[53b151b3-2efe-4cd6-bb22-a576eaf2b853]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:35.318 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[08844515-dcf3-47f4-9e2b-a3a95b6a735b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:35 compute-1 NetworkManager[55451]: <info>  [1769417615.3434] device (tapcdbb53a4-d0): carrier: link connected
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:35.350 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[65385a52-31a2-4c11-b52f-416d72696865]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:35.369 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[289300d9-5b5d-431b-9de4-ed7cb8ddeefe]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcdbb53a4-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:49:f4:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 395594, 'reachable_time': 37350, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217497, 'error': None, 'target': 'ovnmeta-cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:35.386 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[44507228-dbe1-4d31-9a3c-a2fddc5a82e6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe49:f4a4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 395594, 'tstamp': 395594}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217498, 'error': None, 'target': 'ovnmeta-cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:35.403 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[588f5691-0578-4fd2-b7f2-0f3235331ca7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcdbb53a4-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:49:f4:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 395594, 'reachable_time': 37350, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217499, 'error': None, 'target': 'ovnmeta-cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:35.440 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[889174af-7aa0-4955-981a-c7107974bf63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.450 183087 DEBUG nova.compute.manager [req-b93d86a3-21ec-49cb-a8e1-fa1db27e8cd4 req-f63b131c-1c2f-4b94-b8b5-3c6f3787c7fe 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Received event network-vif-plugged-183afcdd-e728-4c4e-b370-a9d4517b3c30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.452 183087 DEBUG oslo_concurrency.lockutils [req-b93d86a3-21ec-49cb-a8e1-fa1db27e8cd4 req-f63b131c-1c2f-4b94-b8b5-3c6f3787c7fe 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "bdc88e07-80b0-4781-8f76-fa751b8b7000-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.453 183087 DEBUG oslo_concurrency.lockutils [req-b93d86a3-21ec-49cb-a8e1-fa1db27e8cd4 req-f63b131c-1c2f-4b94-b8b5-3c6f3787c7fe 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "bdc88e07-80b0-4781-8f76-fa751b8b7000-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.453 183087 DEBUG oslo_concurrency.lockutils [req-b93d86a3-21ec-49cb-a8e1-fa1db27e8cd4 req-f63b131c-1c2f-4b94-b8b5-3c6f3787c7fe 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "bdc88e07-80b0-4781-8f76-fa751b8b7000-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.454 183087 DEBUG nova.compute.manager [req-b93d86a3-21ec-49cb-a8e1-fa1db27e8cd4 req-f63b131c-1c2f-4b94-b8b5-3c6f3787c7fe 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Processing event network-vif-plugged-183afcdd-e728-4c4e-b370-a9d4517b3c30 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:35.509 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[8bae71c8-f959-4e95-bec4-d7b541968027]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:35.511 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcdbb53a4-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:35.511 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:35.512 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcdbb53a4-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.538 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:35 compute-1 kernel: tapcdbb53a4-d0: entered promiscuous mode
Jan 26 08:53:35 compute-1 NetworkManager[55451]: <info>  [1769417615.5443] manager: (tapcdbb53a4-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:35.544 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcdbb53a4-d0, col_values=(('external_ids', {'iface-id': '4ea67d13-7ea2-496c-96b0-f4e26fe640b0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.545 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:35 compute-1 ovn_controller[95352]: 2026-01-26T08:53:35Z|00217|binding|INFO|Releasing lport 4ea67d13-7ea2-496c-96b0-f4e26fe640b0 from this chassis (sb_readonly=0)
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:35.548 104632 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:35.549 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[1522944e-2818-4165-9b07-5af513242f67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:35.550 104632 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]: global
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]:     log         /dev/log local0 debug
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]:     log-tag     haproxy-metadata-proxy-cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]:     user        root
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]:     group       root
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]:     maxconn     1024
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]:     pidfile     /var/lib/neutron/external/pids/cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6.pid.haproxy
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]:     daemon
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]: defaults
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]:     log global
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]:     mode http
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]:     option httplog
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]:     option dontlognull
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]:     option http-server-close
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]:     option forwardfor
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]:     retries                 3
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]:     timeout http-request    30s
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]:     timeout connect         30s
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]:     timeout client          32s
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]:     timeout server          32s
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]:     timeout http-keep-alive 30s
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]: listen listener
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]:     bind 169.254.169.254:80
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]:     http-request add-header X-OVN-Network-ID cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 26 08:53:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:53:35.550 104632 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6', 'env', 'PROCESS_TAG=haproxy-cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.559 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.577 183087 DEBUG nova.network.neutron [req-4e5d5dd3-3146-4bda-b415-4113cb46b208 req-6ce653b6-782d-484d-a86f-0de6cb55a042 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Updated VIF entry in instance network info cache for port 183afcdd-e728-4c4e-b370-a9d4517b3c30. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.577 183087 DEBUG nova.network.neutron [req-4e5d5dd3-3146-4bda-b415-4113cb46b208 req-6ce653b6-782d-484d-a86f-0de6cb55a042 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Updating instance_info_cache with network_info: [{"id": "183afcdd-e728-4c4e-b370-a9d4517b3c30", "address": "fa:16:3e:3b:5f:95", "network": {"id": "cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6", "bridge": "br-int", "label": "tempest-test-network--874192606", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap183afcdd-e7", "ovs_interfaceid": "183afcdd-e728-4c4e-b370-a9d4517b3c30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.596 183087 DEBUG oslo_concurrency.lockutils [req-4e5d5dd3-3146-4bda-b415-4113cb46b208 req-6ce653b6-782d-484d-a86f-0de6cb55a042 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-bdc88e07-80b0-4781-8f76-fa751b8b7000" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.758 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417615.7583654, bdc88e07-80b0-4781-8f76-fa751b8b7000 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.759 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] VM Started (Lifecycle Event)
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.761 183087 DEBUG nova.compute.manager [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.765 183087 DEBUG nova.virt.libvirt.driver [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.768 183087 INFO nova.virt.libvirt.driver [-] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Instance spawned successfully.
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.768 183087 DEBUG nova.virt.libvirt.driver [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.780 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.786 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.790 183087 DEBUG nova.virt.libvirt.driver [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.791 183087 DEBUG nova.virt.libvirt.driver [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.791 183087 DEBUG nova.virt.libvirt.driver [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.792 183087 DEBUG nova.virt.libvirt.driver [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.792 183087 DEBUG nova.virt.libvirt.driver [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.793 183087 DEBUG nova.virt.libvirt.driver [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.803 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.804 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417615.758621, bdc88e07-80b0-4781-8f76-fa751b8b7000 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.804 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] VM Paused (Lifecycle Event)
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.826 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.829 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417615.7639709, bdc88e07-80b0-4781-8f76-fa751b8b7000 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.829 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] VM Resumed (Lifecycle Event)
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.846 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.850 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.854 183087 INFO nova.compute.manager [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Took 4.25 seconds to spawn the instance on the hypervisor.
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.854 183087 DEBUG nova.compute.manager [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.880 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.918 183087 INFO nova.compute.manager [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Took 4.73 seconds to build instance.
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.919 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:35 compute-1 podman[217535]: 2026-01-26 08:53:35.940763729 +0000 UTC m=+0.064744168 container create 70ad2cfa53eb5b335ce07a80b07510da63eb82f125ac33829ef1fa1775492876 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 26 08:53:35 compute-1 nova_compute[183083]: 2026-01-26 08:53:35.943 183087 DEBUG oslo_concurrency.lockutils [None req-eb181cde-3739-47e9-962d-12263f9005b0 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "bdc88e07-80b0-4781-8f76-fa751b8b7000" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.814s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:53:35 compute-1 systemd[1]: Started libpod-conmon-70ad2cfa53eb5b335ce07a80b07510da63eb82f125ac33829ef1fa1775492876.scope.
Jan 26 08:53:35 compute-1 podman[217535]: 2026-01-26 08:53:35.901340402 +0000 UTC m=+0.025320871 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 08:53:36 compute-1 systemd[1]: Started libcrun container.
Jan 26 08:53:36 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74c80c5cf0b6ef9e4c685b6160f265885588e27c6ccd9e683f2a9dd59b2ccf40/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 08:53:36 compute-1 podman[217535]: 2026-01-26 08:53:36.037765491 +0000 UTC m=+0.161745960 container init 70ad2cfa53eb5b335ce07a80b07510da63eb82f125ac33829ef1fa1775492876 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 26 08:53:36 compute-1 podman[217551]: 2026-01-26 08:53:36.043245903 +0000 UTC m=+0.062801130 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 26 08:53:36 compute-1 podman[217535]: 2026-01-26 08:53:36.046023006 +0000 UTC m=+0.170003445 container start 70ad2cfa53eb5b335ce07a80b07510da63eb82f125ac33829ef1fa1775492876 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Jan 26 08:53:36 compute-1 podman[217552]: 2026-01-26 08:53:36.060788463 +0000 UTC m=+0.072645682 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 08:53:36 compute-1 neutron-haproxy-ovnmeta-cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6[217558]: [NOTICE]   (217609) : New worker (217624) forked
Jan 26 08:53:36 compute-1 neutron-haproxy-ovnmeta-cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6[217558]: [NOTICE]   (217609) : Loading success.
Jan 26 08:53:36 compute-1 podman[217548]: 2026-01-26 08:53:36.093909553 +0000 UTC m=+0.112151731 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2)
Jan 26 08:53:36 compute-1 nova_compute[183083]: 2026-01-26 08:53:36.357 183087 INFO nova.compute.manager [None req-da10633d-c3a5-44fa-8e1b-39aa6596d6b8 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Get console output
Jan 26 08:53:36 compute-1 nova_compute[183083]: 2026-01-26 08:53:36.363 212375 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 26 08:53:37 compute-1 nova_compute[183083]: 2026-01-26 08:53:37.549 183087 DEBUG nova.compute.manager [req-93131fc0-1f9b-4b6a-bdad-dc20f8e6157f req-64fe10c0-fa79-4edf-8dce-590880de11aa 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Received event network-vif-plugged-183afcdd-e728-4c4e-b370-a9d4517b3c30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:53:37 compute-1 nova_compute[183083]: 2026-01-26 08:53:37.550 183087 DEBUG oslo_concurrency.lockutils [req-93131fc0-1f9b-4b6a-bdad-dc20f8e6157f req-64fe10c0-fa79-4edf-8dce-590880de11aa 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "bdc88e07-80b0-4781-8f76-fa751b8b7000-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:53:37 compute-1 nova_compute[183083]: 2026-01-26 08:53:37.550 183087 DEBUG oslo_concurrency.lockutils [req-93131fc0-1f9b-4b6a-bdad-dc20f8e6157f req-64fe10c0-fa79-4edf-8dce-590880de11aa 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "bdc88e07-80b0-4781-8f76-fa751b8b7000-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:53:37 compute-1 nova_compute[183083]: 2026-01-26 08:53:37.551 183087 DEBUG oslo_concurrency.lockutils [req-93131fc0-1f9b-4b6a-bdad-dc20f8e6157f req-64fe10c0-fa79-4edf-8dce-590880de11aa 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "bdc88e07-80b0-4781-8f76-fa751b8b7000-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:53:37 compute-1 nova_compute[183083]: 2026-01-26 08:53:37.551 183087 DEBUG nova.compute.manager [req-93131fc0-1f9b-4b6a-bdad-dc20f8e6157f req-64fe10c0-fa79-4edf-8dce-590880de11aa 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] No waiting events found dispatching network-vif-plugged-183afcdd-e728-4c4e-b370-a9d4517b3c30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:53:37 compute-1 nova_compute[183083]: 2026-01-26 08:53:37.552 183087 WARNING nova.compute.manager [req-93131fc0-1f9b-4b6a-bdad-dc20f8e6157f req-64fe10c0-fa79-4edf-8dce-590880de11aa 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Received unexpected event network-vif-plugged-183afcdd-e728-4c4e-b370-a9d4517b3c30 for instance with vm_state active and task_state None.
Jan 26 08:53:39 compute-1 nova_compute[183083]: 2026-01-26 08:53:39.533 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:40 compute-1 nova_compute[183083]: 2026-01-26 08:53:40.922 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:41 compute-1 nova_compute[183083]: 2026-01-26 08:53:41.548 183087 INFO nova.compute.manager [None req-511bbb85-266e-4678-9856-8f244f32da82 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Get console output
Jan 26 08:53:41 compute-1 nova_compute[183083]: 2026-01-26 08:53:41.556 212375 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 26 08:53:44 compute-1 nova_compute[183083]: 2026-01-26 08:53:44.537 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:45 compute-1 nova_compute[183083]: 2026-01-26 08:53:45.925 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:46 compute-1 nova_compute[183083]: 2026-01-26 08:53:46.701 183087 INFO nova.compute.manager [None req-76ffd8a1-ba43-45f1-bf0e-b22d3271f823 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Get console output
Jan 26 08:53:46 compute-1 nova_compute[183083]: 2026-01-26 08:53:46.705 212375 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 26 08:53:46 compute-1 podman[217649]: 2026-01-26 08:53:46.797340063 +0000 UTC m=+0.064174241 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 26 08:53:47 compute-1 ovn_controller[95352]: 2026-01-26T08:53:47Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3b:5f:95 10.100.0.23
Jan 26 08:53:47 compute-1 ovn_controller[95352]: 2026-01-26T08:53:47Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3b:5f:95 10.100.0.23
Jan 26 08:53:49 compute-1 nova_compute[183083]: 2026-01-26 08:53:49.540 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:50 compute-1 sshd-session[217674]: Invalid user admin from 159.223.236.81 port 59928
Jan 26 08:53:50 compute-1 sshd-session[217674]: Connection closed by invalid user admin 159.223.236.81 port 59928 [preauth]
Jan 26 08:53:50 compute-1 nova_compute[183083]: 2026-01-26 08:53:50.927 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:51 compute-1 nova_compute[183083]: 2026-01-26 08:53:51.815 183087 INFO nova.compute.manager [None req-b98191df-43d2-4806-b09a-d323ca03f13f 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Get console output
Jan 26 08:53:51 compute-1 nova_compute[183083]: 2026-01-26 08:53:51.819 212375 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 26 08:53:54 compute-1 nova_compute[183083]: 2026-01-26 08:53:54.543 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:54 compute-1 nova_compute[183083]: 2026-01-26 08:53:54.820 183087 DEBUG nova.compute.manager [req-9674d14d-fe11-42d3-bd7c-4c5b49ee5877 req-617b3aed-7c59-43a6-9e39-67b8830184a5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Received event network-changed-3fca56be-1386-4c85-9cee-d17f63594484 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:53:54 compute-1 nova_compute[183083]: 2026-01-26 08:53:54.820 183087 DEBUG nova.compute.manager [req-9674d14d-fe11-42d3-bd7c-4c5b49ee5877 req-617b3aed-7c59-43a6-9e39-67b8830184a5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Refreshing instance network info cache due to event network-changed-3fca56be-1386-4c85-9cee-d17f63594484. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:53:54 compute-1 nova_compute[183083]: 2026-01-26 08:53:54.820 183087 DEBUG oslo_concurrency.lockutils [req-9674d14d-fe11-42d3-bd7c-4c5b49ee5877 req-617b3aed-7c59-43a6-9e39-67b8830184a5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-ecac5cdf-f0c1-4352-aed6-69098da46fcd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:53:54 compute-1 nova_compute[183083]: 2026-01-26 08:53:54.825 183087 DEBUG oslo_concurrency.lockutils [req-9674d14d-fe11-42d3-bd7c-4c5b49ee5877 req-617b3aed-7c59-43a6-9e39-67b8830184a5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-ecac5cdf-f0c1-4352-aed6-69098da46fcd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:53:54 compute-1 nova_compute[183083]: 2026-01-26 08:53:54.826 183087 DEBUG nova.network.neutron [req-9674d14d-fe11-42d3-bd7c-4c5b49ee5877 req-617b3aed-7c59-43a6-9e39-67b8830184a5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Refreshing network info cache for port 3fca56be-1386-4c85-9cee-d17f63594484 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:53:55 compute-1 nova_compute[183083]: 2026-01-26 08:53:55.929 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:53:56 compute-1 nova_compute[183083]: 2026-01-26 08:53:56.483 183087 DEBUG nova.network.neutron [req-9674d14d-fe11-42d3-bd7c-4c5b49ee5877 req-617b3aed-7c59-43a6-9e39-67b8830184a5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Updated VIF entry in instance network info cache for port 3fca56be-1386-4c85-9cee-d17f63594484. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:53:56 compute-1 nova_compute[183083]: 2026-01-26 08:53:56.484 183087 DEBUG nova.network.neutron [req-9674d14d-fe11-42d3-bd7c-4c5b49ee5877 req-617b3aed-7c59-43a6-9e39-67b8830184a5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Updating instance_info_cache with network_info: [{"id": "3fca56be-1386-4c85-9cee-d17f63594484", "address": "fa:16:3e:c2:c2:1f", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fca56be-13", "ovs_interfaceid": "3fca56be-1386-4c85-9cee-d17f63594484", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:53:56 compute-1 nova_compute[183083]: 2026-01-26 08:53:56.769 183087 DEBUG oslo_concurrency.lockutils [req-9674d14d-fe11-42d3-bd7c-4c5b49ee5877 req-617b3aed-7c59-43a6-9e39-67b8830184a5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-ecac5cdf-f0c1-4352-aed6-69098da46fcd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:53:56 compute-1 nova_compute[183083]: 2026-01-26 08:53:56.896 183087 DEBUG nova.compute.manager [req-3526d13a-8873-4535-ba8e-bb44db8aa8c7 req-634026a0-1aec-48ad-898d-236a6a5745a2 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Received event network-changed-183afcdd-e728-4c4e-b370-a9d4517b3c30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:53:56 compute-1 nova_compute[183083]: 2026-01-26 08:53:56.897 183087 DEBUG nova.compute.manager [req-3526d13a-8873-4535-ba8e-bb44db8aa8c7 req-634026a0-1aec-48ad-898d-236a6a5745a2 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Refreshing instance network info cache due to event network-changed-183afcdd-e728-4c4e-b370-a9d4517b3c30. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:53:56 compute-1 nova_compute[183083]: 2026-01-26 08:53:56.897 183087 DEBUG oslo_concurrency.lockutils [req-3526d13a-8873-4535-ba8e-bb44db8aa8c7 req-634026a0-1aec-48ad-898d-236a6a5745a2 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-bdc88e07-80b0-4781-8f76-fa751b8b7000" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:53:56 compute-1 nova_compute[183083]: 2026-01-26 08:53:56.898 183087 DEBUG oslo_concurrency.lockutils [req-3526d13a-8873-4535-ba8e-bb44db8aa8c7 req-634026a0-1aec-48ad-898d-236a6a5745a2 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-bdc88e07-80b0-4781-8f76-fa751b8b7000" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:53:56 compute-1 nova_compute[183083]: 2026-01-26 08:53:56.898 183087 DEBUG nova.network.neutron [req-3526d13a-8873-4535-ba8e-bb44db8aa8c7 req-634026a0-1aec-48ad-898d-236a6a5745a2 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Refreshing network info cache for port 183afcdd-e728-4c4e-b370-a9d4517b3c30 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:53:57 compute-1 ovn_controller[95352]: 2026-01-26T08:53:57Z|00033|pinctrl(ovn_pinctrl0)|WARN|truncated dns packet
Jan 26 08:53:57 compute-1 ovn_controller[95352]: 2026-01-26T08:53:57Z|00034|pinctrl(ovn_pinctrl0)|WARN|truncated dns packet
Jan 26 08:53:59 compute-1 nova_compute[183083]: 2026-01-26 08:53:59.298 183087 DEBUG nova.network.neutron [req-3526d13a-8873-4535-ba8e-bb44db8aa8c7 req-634026a0-1aec-48ad-898d-236a6a5745a2 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Updated VIF entry in instance network info cache for port 183afcdd-e728-4c4e-b370-a9d4517b3c30. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:53:59 compute-1 nova_compute[183083]: 2026-01-26 08:53:59.298 183087 DEBUG nova.network.neutron [req-3526d13a-8873-4535-ba8e-bb44db8aa8c7 req-634026a0-1aec-48ad-898d-236a6a5745a2 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Updating instance_info_cache with network_info: [{"id": "183afcdd-e728-4c4e-b370-a9d4517b3c30", "address": "fa:16:3e:3b:5f:95", "network": {"id": "cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6", "bridge": "br-int", "label": "tempest-test-network--874192606", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap183afcdd-e7", "ovs_interfaceid": "183afcdd-e728-4c4e-b370-a9d4517b3c30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:53:59 compute-1 nova_compute[183083]: 2026-01-26 08:53:59.510 183087 DEBUG oslo_concurrency.lockutils [req-3526d13a-8873-4535-ba8e-bb44db8aa8c7 req-634026a0-1aec-48ad-898d-236a6a5745a2 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-bdc88e07-80b0-4781-8f76-fa751b8b7000" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:53:59 compute-1 nova_compute[183083]: 2026-01-26 08:53:59.606 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:54:00 compute-1 nova_compute[183083]: 2026-01-26 08:54:00.967 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:54:01 compute-1 podman[217678]: 2026-01-26 08:54:01.825112404 +0000 UTC m=+0.079573717 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 08:54:01 compute-1 podman[217679]: 2026-01-26 08:54:01.83409868 +0000 UTC m=+0.083424741 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_id=openstack_network_exporter, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=edpm_ansible)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.743 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd', 'name': 'tempest-server-test-2131100295', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000029', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'user_id': '988ebc31182f4c94813f94306e399a2d', 'hostId': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.745 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000', 'name': 'tempest-server-test-630575445', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000002a', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'user_id': '988ebc31182f4c94813f94306e399a2d', 'hostId': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.746 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.751 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for ecac5cdf-f0c1-4352-aed6-69098da46fcd / tap3fca56be-13 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.751 12 DEBUG ceilometer.compute.pollsters [-] ecac5cdf-f0c1-4352-aed6-69098da46fcd/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.754 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for bdc88e07-80b0-4781-8f76-fa751b8b7000 / tap183afcdd-e7 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.755 12 DEBUG ceilometer.compute.pollsters [-] bdc88e07-80b0-4781-8f76-fa751b8b7000/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24edb4f9-2f84-46ad-93f3-8fc052cfaa5f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-00000029-ecac5cdf-f0c1-4352-aed6-69098da46fcd-tap3fca56be-13', 'timestamp': '2026-01-26T08:54:03.746485', 'resource_metadata': {'display_name': 'tempest-server-test-2131100295', 'name': 'tap3fca56be-13', 'instance_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c2:c2:1f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3fca56be-13'}, 'message_id': '90a81170-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.408461612, 'message_signature': '440283e98bbeb4dee0a9eb9f0c1bd7d34ee3b9b30cf96362c3d8b159e659e348'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-0000002a-bdc88e07-80b0-4781-8f76-fa751b8b7000-tap183afcdd-e7', 'timestamp': '2026-01-26T08:54:03.746485', 'resource_metadata': {'display_name': 'tempest-server-test-630575445', 'name': 'tap183afcdd-e7', 'instance_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3b:5f:95', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap183afcdd-e7'}, 'message_id': '90a88470-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.41447286, 'message_signature': '7ca1d016b952d978c5f21bf42f04f501f70f51b1196d0569c60e37234a03f0a8'}]}, 'timestamp': '2026-01-26 08:54:03.755385', '_unique_id': 'e38369c1735d4ebbb8c4d938e8520ad2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.756 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.757 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.791 12 DEBUG ceilometer.compute.pollsters [-] ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk.device.write.requests volume: 295 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.792 12 DEBUG ceilometer.compute.pollsters [-] ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.835 12 DEBUG ceilometer.compute.pollsters [-] bdc88e07-80b0-4781-8f76-fa751b8b7000/disk.device.write.requests volume: 306 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.836 12 DEBUG ceilometer.compute.pollsters [-] bdc88e07-80b0-4781-8f76-fa751b8b7000/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e579ee99-d061-4332-8a13-bf2016b58faa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 295, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd-vda', 'timestamp': '2026-01-26T08:54:03.757545', 'resource_metadata': {'display_name': 'tempest-server-test-2131100295', 'name': 'instance-00000029', 'instance_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '90ae1f20-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.41952749, 'message_signature': 'ebafbeae3a6da33f68eeb2da7c20a3a748cdbd2e07bfd7d405de9c5ec795e9ab'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 
'ecac5cdf-f0c1-4352-aed6-69098da46fcd-sda', 'timestamp': '2026-01-26T08:54:03.757545', 'resource_metadata': {'display_name': 'tempest-server-test-2131100295', 'name': 'instance-00000029', 'instance_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '90ae38f2-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.41952749, 'message_signature': 'fb29b257cd73e36cd0fd42c9382d529cffbd2257820b2e6d5a85bee6babe4da8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 306, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000-vda', 'timestamp': '2026-01-26T08:54:03.757545', 'resource_metadata': {'display_name': 'tempest-server-test-630575445', 'name': 'instance-0000002a', 'instance_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': 
'13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '90b4d626-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.454841474, 'message_signature': '3347c6747200fd6e52e2ae56d35e45b6fdce66c27872a54e0f08fce4638d9ba3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000-sda', 'timestamp': '2026-01-26T08:54:03.757545', 'resource_metadata': {'display_name': 'tempest-server-test-630575445', 'name': 'instance-0000002a', 'instance_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '90b4f30e-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.454841474, 'message_signature': 'ad1eec5c6de32e8e35b9d96740ae3ebd0372deaf0a65a7e57bef5ece092d39d7'}]}, 'timestamp': '2026-01-26 08:54:03.836939', '_unique_id': '9ae9a504ee2a4ef78c5c6745a0652463'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.838 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.839 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.840 12 DEBUG ceilometer.compute.pollsters [-] ecac5cdf-f0c1-4352-aed6-69098da46fcd/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.841 12 DEBUG ceilometer.compute.pollsters [-] bdc88e07-80b0-4781-8f76-fa751b8b7000/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5c701f7f-3f2e-40e1-9062-7596626c62d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-00000029-ecac5cdf-f0c1-4352-aed6-69098da46fcd-tap3fca56be-13', 'timestamp': '2026-01-26T08:54:03.840822', 'resource_metadata': {'display_name': 'tempest-server-test-2131100295', 'name': 'tap3fca56be-13', 'instance_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c2:c2:1f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3fca56be-13'}, 'message_id': '90b5a312-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.408461612, 'message_signature': '06da6415a09b53a7c9f925fb1b2491fb60d3b9fad341a2260033ac8480c85df7'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-0000002a-bdc88e07-80b0-4781-8f76-fa751b8b7000-tap183afcdd-e7', 'timestamp': '2026-01-26T08:54:03.840822', 'resource_metadata': {'display_name': 'tempest-server-test-630575445', 'name': 'tap183afcdd-e7', 'instance_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3b:5f:95', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap183afcdd-e7'}, 'message_id': '90b5b5b4-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.41447286, 'message_signature': '56c6aab124e728ebe7c030fd8ed100e5ccacc499ba9f7f1d3ba9070c859c4f8c'}]}, 'timestamp': '2026-01-26 08:54:03.841909', '_unique_id': 'aa01d2ca4789411f9e14f0fbfc227695'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.843 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.844 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.844 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.844 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-2131100295>, <NovaLikeServer: tempest-server-test-630575445>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-2131100295>, <NovaLikeServer: tempest-server-test-630575445>]
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.844 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.845 12 DEBUG ceilometer.compute.pollsters [-] ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk.device.write.bytes volume: 72986624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.845 12 DEBUG ceilometer.compute.pollsters [-] ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.846 12 DEBUG ceilometer.compute.pollsters [-] bdc88e07-80b0-4781-8f76-fa751b8b7000/disk.device.write.bytes volume: 72986624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.846 12 DEBUG ceilometer.compute.pollsters [-] bdc88e07-80b0-4781-8f76-fa751b8b7000/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5991137-2883-4c07-aa5e-89e83ffe18ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72986624, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd-vda', 'timestamp': '2026-01-26T08:54:03.845100', 'resource_metadata': {'display_name': 'tempest-server-test-2131100295', 'name': 'instance-00000029', 'instance_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '90b64556-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.41952749, 'message_signature': '716f554124221509c4fd08e52adf17066f8fca67396c5c4b20d498d034fbcb6c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 
'ecac5cdf-f0c1-4352-aed6-69098da46fcd-sda', 'timestamp': '2026-01-26T08:54:03.845100', 'resource_metadata': {'display_name': 'tempest-server-test-2131100295', 'name': 'instance-00000029', 'instance_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '90b6565e-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.41952749, 'message_signature': 'cf152f913f5bd52f826fa85341b4713a1108c89732dfbbdb28b9b41e260f9fe4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72986624, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000-vda', 'timestamp': '2026-01-26T08:54:03.845100', 'resource_metadata': {'display_name': 'tempest-server-test-630575445', 'name': 'instance-0000002a', 'instance_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': 
'13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '90b66978-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.454841474, 'message_signature': '165056e7aab16be60718cf7cad20c1180cf8b63bd549143e5a4e4ecf6afc5564'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000-sda', 'timestamp': '2026-01-26T08:54:03.845100', 'resource_metadata': {'display_name': 'tempest-server-test-630575445', 'name': 'instance-0000002a', 'instance_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '90b67940-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.454841474, 'message_signature': '8b6b58444d067a5dd113f036fb6912798f7a3e5e8c2b96cf6fa65c3f441f78b2'}]}, 'timestamp': '2026-01-26 08:54:03.846888', '_unique_id': '8e5abb38048f498084aa9108b01c7857'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.847 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.849 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.849 12 DEBUG ceilometer.compute.pollsters [-] ecac5cdf-f0c1-4352-aed6-69098da46fcd/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.850 12 DEBUG ceilometer.compute.pollsters [-] bdc88e07-80b0-4781-8f76-fa751b8b7000/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c66b49c2-08bd-483b-a239-b4aff7827d41', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-00000029-ecac5cdf-f0c1-4352-aed6-69098da46fcd-tap3fca56be-13', 'timestamp': '2026-01-26T08:54:03.849656', 'resource_metadata': {'display_name': 'tempest-server-test-2131100295', 'name': 'tap3fca56be-13', 'instance_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c2:c2:1f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3fca56be-13'}, 'message_id': '90b6f712-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.408461612, 'message_signature': '2c72cc354da94fc7fed5cdb4f612f7ac05cd2dec74a728aa7ab358c0357de067'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-0000002a-bdc88e07-80b0-4781-8f76-fa751b8b7000-tap183afcdd-e7', 'timestamp': '2026-01-26T08:54:03.849656', 'resource_metadata': {'display_name': 'tempest-server-test-630575445', 'name': 'tap183afcdd-e7', 'instance_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3b:5f:95', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap183afcdd-e7'}, 'message_id': '90b70e46-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.41447286, 'message_signature': 'c4f8057686af71f014b40b38050b1d38be097c652c5add3edbe0726e7122b27b'}]}, 'timestamp': '2026-01-26 08:54:03.850723', '_unique_id': 'fd6505bbd0fe420f93952db9dd214a73'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.851 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.852 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.853 12 DEBUG ceilometer.compute.pollsters [-] ecac5cdf-f0c1-4352-aed6-69098da46fcd/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.853 12 DEBUG ceilometer.compute.pollsters [-] bdc88e07-80b0-4781-8f76-fa751b8b7000/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1f3559d0-6985-4529-9743-455f2e94a19d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-00000029-ecac5cdf-f0c1-4352-aed6-69098da46fcd-tap3fca56be-13', 'timestamp': '2026-01-26T08:54:03.853162', 'resource_metadata': {'display_name': 'tempest-server-test-2131100295', 'name': 'tap3fca56be-13', 'instance_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c2:c2:1f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3fca56be-13'}, 'message_id': '90b78074-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.408461612, 'message_signature': '7c055e7600d0f61e719c08a25042ba60a86ef62d84f4ae0890cc864aab411c2f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-0000002a-bdc88e07-80b0-4781-8f76-fa751b8b7000-tap183afcdd-e7', 'timestamp': '2026-01-26T08:54:03.853162', 'resource_metadata': {'display_name': 'tempest-server-test-630575445', 'name': 'tap183afcdd-e7', 'instance_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3b:5f:95', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap183afcdd-e7'}, 'message_id': '90b7924e-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.41447286, 'message_signature': 'ba9689f1785dbb9e604cd50257808ae0d4b3296b27798e33a45842c399805e73'}]}, 'timestamp': '2026-01-26 08:54:03.854162', '_unique_id': '87498f4064084370ba92c2a6c0a5da95'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.855 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.856 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.856 12 DEBUG ceilometer.compute.pollsters [-] ecac5cdf-f0c1-4352-aed6-69098da46fcd/network.outgoing.bytes volume: 3679 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.857 12 DEBUG ceilometer.compute.pollsters [-] bdc88e07-80b0-4781-8f76-fa751b8b7000/network.outgoing.bytes volume: 9160 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a92368c-0861-4e63-8c4f-ec2c1fe26179', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3679, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-00000029-ecac5cdf-f0c1-4352-aed6-69098da46fcd-tap3fca56be-13', 'timestamp': '2026-01-26T08:54:03.856616', 'resource_metadata': {'display_name': 'tempest-server-test-2131100295', 'name': 'tap3fca56be-13', 'instance_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c2:c2:1f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3fca56be-13'}, 'message_id': '90b8072e-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.408461612, 'message_signature': '3e4ae3662007c66fe4bcc2074c2984eb323229ca588e006b82d204eb31ab7111'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9160, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-0000002a-bdc88e07-80b0-4781-8f76-fa751b8b7000-tap183afcdd-e7', 'timestamp': '2026-01-26T08:54:03.856616', 'resource_metadata': {'display_name': 'tempest-server-test-630575445', 'name': 'tap183afcdd-e7', 'instance_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3b:5f:95', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap183afcdd-e7'}, 'message_id': '90b81b4c-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.41447286, 'message_signature': 'a3462fb7c21df17502471f57ef65ae48e1b29fe815de703f7581a5fc1354c6a7'}]}, 'timestamp': '2026-01-26 08:54:03.857613', '_unique_id': '12d304b41de64b1b906948c40ef86423'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.858 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.860 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.860 12 DEBUG ceilometer.compute.pollsters [-] ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk.device.read.requests volume: 1101 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.860 12 DEBUG ceilometer.compute.pollsters [-] ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.861 12 DEBUG ceilometer.compute.pollsters [-] bdc88e07-80b0-4781-8f76-fa751b8b7000/disk.device.read.requests volume: 1063 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.861 12 DEBUG ceilometer.compute.pollsters [-] bdc88e07-80b0-4781-8f76-fa751b8b7000/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '78537f3e-1f57-4b86-be37-2cae0d91b38d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1101, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd-vda', 'timestamp': '2026-01-26T08:54:03.860470', 'resource_metadata': {'display_name': 'tempest-server-test-2131100295', 'name': 'instance-00000029', 'instance_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '90b89d7e-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.41952749, 'message_signature': '26ff05ef1d2907b64c1d408832c1a7828b61e0e50449de7493319dc7572c324d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 
'resource_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd-sda', 'timestamp': '2026-01-26T08:54:03.860470', 'resource_metadata': {'display_name': 'tempest-server-test-2131100295', 'name': 'instance-00000029', 'instance_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '90b8afee-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.41952749, 'message_signature': 'd042d81daa3afce08e4260651a83e1d30f5a192eb837d9aed3b15ae162ef9d06'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1063, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000-vda', 'timestamp': '2026-01-26T08:54:03.860470', 'resource_metadata': {'display_name': 'tempest-server-test-630575445', 'name': 'instance-0000002a', 'instance_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 
'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '90b8c056-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.454841474, 'message_signature': '63f72792428e19b179fe0ac58a54559299b767c81d9397482a9f880acb14a352'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000-sda', 'timestamp': '2026-01-26T08:54:03.860470', 'resource_metadata': {'display_name': 'tempest-server-test-630575445', 'name': 'instance-0000002a', 'instance_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '90b8d17c-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.454841474, 'message_signature': '12e0fd7eef318a31375b9cbd448826f5f73b5134448e0a40b62bef14d1192556'}]}, 'timestamp': '2026-01-26 08:54:03.862285', '_unique_id': '450d45fc0d0d4e24bfbd4d6d5ce2f225'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.863 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.864 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.882 12 DEBUG ceilometer.compute.pollsters [-] ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.883 12 DEBUG ceilometer.compute.pollsters [-] ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.899 12 DEBUG ceilometer.compute.pollsters [-] bdc88e07-80b0-4781-8f76-fa751b8b7000/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.900 12 DEBUG ceilometer.compute.pollsters [-] bdc88e07-80b0-4781-8f76-fa751b8b7000/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '58363ec7-fe45-4e00-9cf2-947347d1f182', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd-vda', 'timestamp': '2026-01-26T08:54:03.864842', 'resource_metadata': {'display_name': 'tempest-server-test-2131100295', 'name': 'instance-00000029', 'instance_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '90bc025c-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.526865017, 'message_signature': 'b89f1c28cd3520bcc6211f54ce4011067588eaa527819e4ff76cd67b33a5fb8c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 
'ecac5cdf-f0c1-4352-aed6-69098da46fcd-sda', 'timestamp': '2026-01-26T08:54:03.864842', 'resource_metadata': {'display_name': 'tempest-server-test-2131100295', 'name': 'instance-00000029', 'instance_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '90bc1b48-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.526865017, 'message_signature': 'b8ef7211d44f450205c8f8d9ac446c92f8f54b3ffb242c309fb9406c83c5cdaf'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000-vda', 'timestamp': '2026-01-26T08:54:03.864842', 'resource_metadata': {'display_name': 'tempest-server-test-630575445', 'name': 'instance-0000002a', 'instance_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': 
'13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '90bea1f6-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.545859149, 'message_signature': '682f326e385b9d0b627bf2bfa3b35439e5c56c60a9977d67cd447da242c03176'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000-sda', 'timestamp': '2026-01-26T08:54:03.864842', 'resource_metadata': {'display_name': 'tempest-server-test-630575445', 'name': 'instance-0000002a', 'instance_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '90beb916-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.545859149, 'message_signature': '6cd638de82deb0b48fbee831d26922b040f4a95721a92aa027b4f4b4eba3bf14'}]}, 'timestamp': '2026-01-26 08:54:03.901035', '_unique_id': '679b49dc26a741009c5a948e74a4bc17'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.902 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.904 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.904 12 DEBUG ceilometer.compute.pollsters [-] ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk.device.allocation volume: 30679040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.904 12 DEBUG ceilometer.compute.pollsters [-] ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.906 12 DEBUG ceilometer.compute.pollsters [-] bdc88e07-80b0-4781-8f76-fa751b8b7000/disk.device.allocation volume: 30482432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.906 12 DEBUG ceilometer.compute.pollsters [-] bdc88e07-80b0-4781-8f76-fa751b8b7000/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd69ebbc2-5158-4c60-a93f-7ac6f42d4778', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30679040, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd-vda', 'timestamp': '2026-01-26T08:54:03.904231', 'resource_metadata': {'display_name': 'tempest-server-test-2131100295', 'name': 'instance-00000029', 'instance_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '90bf4f98-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.526865017, 'message_signature': '9264748eea64603773eaa2b333d52a6b20ce2cb2f0401268775c7d970638f2aa'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 
'ecac5cdf-f0c1-4352-aed6-69098da46fcd-sda', 'timestamp': '2026-01-26T08:54:03.904231', 'resource_metadata': {'display_name': 'tempest-server-test-2131100295', 'name': 'instance-00000029', 'instance_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '90bf763a-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.526865017, 'message_signature': 'a0323068a5a35e4dd629c87559b4c95c1ce2dee9963defd71efd44aecb8da9ad'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30482432, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000-vda', 'timestamp': '2026-01-26T08:54:03.904231', 'resource_metadata': {'display_name': 'tempest-server-test-630575445', 'name': 'instance-0000002a', 'instance_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': 
'13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '90bf94d0-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.545859149, 'message_signature': '75ee4de62bbb673526ca6f249a7303fab0f7636236346292f4547d9a02cb3f11'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000-sda', 'timestamp': '2026-01-26T08:54:03.904231', 'resource_metadata': {'display_name': 'tempest-server-test-630575445', 'name': 'instance-0000002a', 'instance_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '90bfacfe-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.545859149, 'message_signature': '0229d4ef0362e1f44528e65ca40ce477957383526fa4b3ac223e4c6c1091d3bb'}]}, 'timestamp': '2026-01-26 08:54:03.907344', '_unique_id': '42e64d71f39c48798c92e840637a6d7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.908 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.910 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.910 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.911 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-2131100295>, <NovaLikeServer: tempest-server-test-630575445>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-2131100295>, <NovaLikeServer: tempest-server-test-630575445>]
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.911 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.911 12 DEBUG ceilometer.compute.pollsters [-] ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk.device.write.latency volume: 2460783839 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.912 12 DEBUG ceilometer.compute.pollsters [-] ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.913 12 DEBUG ceilometer.compute.pollsters [-] bdc88e07-80b0-4781-8f76-fa751b8b7000/disk.device.write.latency volume: 2348280106 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.913 12 DEBUG ceilometer.compute.pollsters [-] bdc88e07-80b0-4781-8f76-fa751b8b7000/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c944fcfa-901c-4898-bcd9-f7de4f4cc7d0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2460783839, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd-vda', 'timestamp': '2026-01-26T08:54:03.911873', 'resource_metadata': {'display_name': 'tempest-server-test-2131100295', 'name': 'instance-00000029', 'instance_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '90c07b16-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.41952749, 'message_signature': '447e8bb40242514838909384b49250afbc53561b4867571a5f32b09785403766'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 
'ecac5cdf-f0c1-4352-aed6-69098da46fcd-sda', 'timestamp': '2026-01-26T08:54:03.911873', 'resource_metadata': {'display_name': 'tempest-server-test-2131100295', 'name': 'instance-00000029', 'instance_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '90c09416-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.41952749, 'message_signature': 'cbb9dc345433d5991ef8c3483e3297cff6edd1fd79d3bbf16146485a83278526'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2348280106, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000-vda', 'timestamp': '2026-01-26T08:54:03.911873', 'resource_metadata': {'display_name': 'tempest-server-test-630575445', 'name': 'instance-0000002a', 'instance_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': 
'13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '90c0ae10-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.454841474, 'message_signature': 'd1eb2b8ec347816e55c902c68bf8b33e5ba178a4300cb3932d3a87e6052654eb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000-sda', 'timestamp': '2026-01-26T08:54:03.911873', 'resource_metadata': {'display_name': 'tempest-server-test-630575445', 'name': 'instance-0000002a', 'instance_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '90c0cb98-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.454841474, 'message_signature': 'f2dac070794675e5080f311a10656ef68aab5f42be6abf0b9b63af4f49547471'}]}, 'timestamp': '2026-01-26 08:54:03.914644', '_unique_id': 'd6a811d1e2ab4cae93a3c019e10af73e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.916 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.918 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.918 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.918 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-2131100295>, <NovaLikeServer: tempest-server-test-630575445>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-2131100295>, <NovaLikeServer: tempest-server-test-630575445>]
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.918 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.942 12 DEBUG ceilometer.compute.pollsters [-] ecac5cdf-f0c1-4352-aed6-69098da46fcd/cpu volume: 10650000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.966 12 DEBUG ceilometer.compute.pollsters [-] bdc88e07-80b0-4781-8f76-fa751b8b7000/cpu volume: 10340000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '000679fa-4841-408d-beb4-d8ce18ecaa7c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10650000000, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd', 'timestamp': '2026-01-26T08:54:03.919243', 'resource_metadata': {'display_name': 'tempest-server-test-2131100295', 'name': 'instance-00000029', 'instance_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '90c52ab2-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.60433778, 'message_signature': 'b3799c5d20f8b4e26ae42285979621baf6d20e104ce757f92a67dd007332e87b'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10340000000, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000', 
'timestamp': '2026-01-26T08:54:03.919243', 'resource_metadata': {'display_name': 'tempest-server-test-630575445', 'name': 'instance-0000002a', 'instance_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '90c8d98c-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.628462105, 'message_signature': '476be62c5dc7ba0b9029915e34a7c3e7c309be59046f0ed103e06cc1ffcddc97'}]}, 'timestamp': '2026-01-26 08:54:03.967402', '_unique_id': 'b6e0cbc1a61049dc95416f84a26d5461'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.968 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.970 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.970 12 DEBUG ceilometer.compute.pollsters [-] ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk.device.read.latency volume: 232760037 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.971 12 DEBUG ceilometer.compute.pollsters [-] ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk.device.read.latency volume: 26077635 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.971 12 DEBUG ceilometer.compute.pollsters [-] bdc88e07-80b0-4781-8f76-fa751b8b7000/disk.device.read.latency volume: 472628261 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.972 12 DEBUG ceilometer.compute.pollsters [-] bdc88e07-80b0-4781-8f76-fa751b8b7000/disk.device.read.latency volume: 25239977 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fae6035e-8680-4890-affd-aad0838ddece', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 232760037, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd-vda', 'timestamp': '2026-01-26T08:54:03.970707', 'resource_metadata': {'display_name': 'tempest-server-test-2131100295', 'name': 'instance-00000029', 'instance_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '90c97554-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.41952749, 'message_signature': '6f211f6e379e9f0d1468266fbb478bf6cc5c01f2d2c4a0e8e4538fd79bb6d181'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 26077635, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 
'ecac5cdf-f0c1-4352-aed6-69098da46fcd-sda', 'timestamp': '2026-01-26T08:54:03.970707', 'resource_metadata': {'display_name': 'tempest-server-test-2131100295', 'name': 'instance-00000029', 'instance_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '90c98774-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.41952749, 'message_signature': 'da53dfad20530970de01d54d4a4751d22ae3ff87e5034f877e293d1552e14c5d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 472628261, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000-vda', 'timestamp': '2026-01-26T08:54:03.970707', 'resource_metadata': {'display_name': 'tempest-server-test-630575445', 'name': 'instance-0000002a', 'instance_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': 
'13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '90c99ade-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.454841474, 'message_signature': 'd4f409b102196f571c3a48c8a3465425d213b4a50197edb4c421e9f2ffc93358'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 25239977, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000-sda', 'timestamp': '2026-01-26T08:54:03.970707', 'resource_metadata': {'display_name': 'tempest-server-test-630575445', 'name': 'instance-0000002a', 'instance_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '90c9ac54-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.454841474, 'message_signature': 'e42da575ccf63bfb0fcc6d32f3cd809a70cce915501e6badd5070d3defe4166b'}]}, 'timestamp': '2026-01-26 08:54:03.972714', '_unique_id': '4882a8e12d0d4b5b8d695c49aa5ba9fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.974 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.975 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.975 12 DEBUG ceilometer.compute.pollsters [-] ecac5cdf-f0c1-4352-aed6-69098da46fcd/network.incoming.packets volume: 21 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.976 12 DEBUG ceilometer.compute.pollsters [-] bdc88e07-80b0-4781-8f76-fa751b8b7000/network.incoming.packets volume: 48 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '386fa775-8883-4102-838b-0d1d49e09fb8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 21, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-00000029-ecac5cdf-f0c1-4352-aed6-69098da46fcd-tap3fca56be-13', 'timestamp': '2026-01-26T08:54:03.975644', 'resource_metadata': {'display_name': 'tempest-server-test-2131100295', 'name': 'tap3fca56be-13', 'instance_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c2:c2:1f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3fca56be-13'}, 'message_id': '90ca308e-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.408461612, 'message_signature': 'cedfa7bd33f45a3fd2374f3fa28cb0cb965e518050234a4c37d1be99fdde7021'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 48, 'user_id': 
'988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-0000002a-bdc88e07-80b0-4781-8f76-fa751b8b7000-tap183afcdd-e7', 'timestamp': '2026-01-26T08:54:03.975644', 'resource_metadata': {'display_name': 'tempest-server-test-630575445', 'name': 'tap183afcdd-e7', 'instance_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3b:5f:95', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap183afcdd-e7'}, 'message_id': '90ca4c7c-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.41447286, 'message_signature': 'f89fe3989e2dd37adc0dbfd10fb7f78c329aa7583b4ec909a01c5fa6177417e0'}]}, 'timestamp': '2026-01-26 08:54:03.976885', '_unique_id': 'f51e19075cba40eb92a52bbe993ec4f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.978 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.979 12 DEBUG ceilometer.compute.pollsters [-] ecac5cdf-f0c1-4352-aed6-69098da46fcd/network.outgoing.packets volume: 31 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.979 12 DEBUG ceilometer.compute.pollsters [-] bdc88e07-80b0-4781-8f76-fa751b8b7000/network.outgoing.packets volume: 68 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c364813a-87b0-4255-8a63-1186bb4e177d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 31, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-00000029-ecac5cdf-f0c1-4352-aed6-69098da46fcd-tap3fca56be-13', 'timestamp': '2026-01-26T08:54:03.979138', 'resource_metadata': {'display_name': 'tempest-server-test-2131100295', 'name': 'tap3fca56be-13', 'instance_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c2:c2:1f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3fca56be-13'}, 'message_id': '90cab914-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.408461612, 'message_signature': '14a4116ecc6a577224f0a0054a6e6e358130b3aa758fcdab74b35f2c70e1d85f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 68, 'user_id': 
'988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-0000002a-bdc88e07-80b0-4781-8f76-fa751b8b7000-tap183afcdd-e7', 'timestamp': '2026-01-26T08:54:03.979138', 'resource_metadata': {'display_name': 'tempest-server-test-630575445', 'name': 'tap183afcdd-e7', 'instance_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3b:5f:95', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap183afcdd-e7'}, 'message_id': '90cac904-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.41447286, 'message_signature': '9ae541a0707d51d660f996fbbb61b68e3b8a74e929c565868138413d8c4d5d1f'}]}, 'timestamp': '2026-01-26 08:54:03.980004', '_unique_id': 'c6edf213f2e94b8e9bb50680021b1625'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.981 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.982 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.982 12 DEBUG ceilometer.compute.pollsters [-] ecac5cdf-f0c1-4352-aed6-69098da46fcd/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.982 12 DEBUG ceilometer.compute.pollsters [-] bdc88e07-80b0-4781-8f76-fa751b8b7000/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '56c07972-9c94-4079-a746-5771d3c5a1c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-00000029-ecac5cdf-f0c1-4352-aed6-69098da46fcd-tap3fca56be-13', 'timestamp': '2026-01-26T08:54:03.982182', 'resource_metadata': {'display_name': 'tempest-server-test-2131100295', 'name': 'tap3fca56be-13', 'instance_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c2:c2:1f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3fca56be-13'}, 'message_id': '90cb2ce6-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.408461612, 'message_signature': '4fdb65965bcc39384f614e75cb48306a6f4a67619f7c191f4d29da3ba49646cc'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-0000002a-bdc88e07-80b0-4781-8f76-fa751b8b7000-tap183afcdd-e7', 'timestamp': '2026-01-26T08:54:03.982182', 'resource_metadata': {'display_name': 'tempest-server-test-630575445', 'name': 'tap183afcdd-e7', 'instance_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3b:5f:95', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap183afcdd-e7'}, 'message_id': '90cb3f6a-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.41447286, 'message_signature': '6594d7cd6b6514ee1e79634a5c0025c361eaff010bd4464d1fca6fa06cc543c9'}]}, 'timestamp': '2026-01-26 08:54:03.982987', '_unique_id': '47cd57d18a99427db9cc644d0c84dc4c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.983 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.985 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.985 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.985 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-2131100295>, <NovaLikeServer: tempest-server-test-630575445>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-2131100295>, <NovaLikeServer: tempest-server-test-630575445>]
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.985 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.986 12 DEBUG ceilometer.compute.pollsters [-] ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk.device.read.bytes volume: 30534144 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.986 12 DEBUG ceilometer.compute.pollsters [-] ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.987 12 DEBUG ceilometer.compute.pollsters [-] bdc88e07-80b0-4781-8f76-fa751b8b7000/disk.device.read.bytes volume: 29567488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.987 12 DEBUG ceilometer.compute.pollsters [-] bdc88e07-80b0-4781-8f76-fa751b8b7000/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '77d31470-71ba-4738-9e8d-27a97e6db236', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30534144, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd-vda', 'timestamp': '2026-01-26T08:54:03.986125', 'resource_metadata': {'display_name': 'tempest-server-test-2131100295', 'name': 'instance-00000029', 'instance_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '90cbca34-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.41952749, 'message_signature': '4ccdc680d4a07802d35f42aaeaceef48953a22270f1d96351153ef40d149f831'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 
'ecac5cdf-f0c1-4352-aed6-69098da46fcd-sda', 'timestamp': '2026-01-26T08:54:03.986125', 'resource_metadata': {'display_name': 'tempest-server-test-2131100295', 'name': 'instance-00000029', 'instance_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '90cbde7a-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.41952749, 'message_signature': '7d6708959b1877e0957d73e1678eb8474bdefbaa6e53e4d5127beeff98ace3f7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29567488, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000-vda', 'timestamp': '2026-01-26T08:54:03.986125', 'resource_metadata': {'display_name': 'tempest-server-test-630575445', 'name': 'instance-0000002a', 'instance_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': 
'13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '90cbef82-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.454841474, 'message_signature': 'd8b1dab192cccf723175e0660c766136ee756272f49bffb4af74602ebeaf8381'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000-sda', 'timestamp': '2026-01-26T08:54:03.986125', 'resource_metadata': {'display_name': 'tempest-server-test-630575445', 'name': 'instance-0000002a', 'instance_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '90cc0364-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.454841474, 'message_signature': 'e0e710c1df1d9c080e1f5b520c38efb38963a5c6ed86376c8840b9c6b4c4d5b6'}]}, 'timestamp': '2026-01-26 08:54:03.988064', '_unique_id': 'daa8299c885244d0be6b4814d928a61c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.989 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.990 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.990 12 DEBUG ceilometer.compute.pollsters [-] ecac5cdf-f0c1-4352-aed6-69098da46fcd/network.incoming.bytes volume: 4049 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.990 12 DEBUG ceilometer.compute.pollsters [-] bdc88e07-80b0-4781-8f76-fa751b8b7000/network.incoming.bytes volume: 8648 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bcbe9ec4-5535-4b62-9478-63a46a4de0b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4049, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-00000029-ecac5cdf-f0c1-4352-aed6-69098da46fcd-tap3fca56be-13', 'timestamp': '2026-01-26T08:54:03.990189', 'resource_metadata': {'display_name': 'tempest-server-test-2131100295', 'name': 'tap3fca56be-13', 'instance_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c2:c2:1f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3fca56be-13'}, 'message_id': '90cc65b6-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.408461612, 'message_signature': '55b824f9a1a7e184e807557c414b80619665dbd9058780580b8875e1f190e317'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8648, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-0000002a-bdc88e07-80b0-4781-8f76-fa751b8b7000-tap183afcdd-e7', 'timestamp': '2026-01-26T08:54:03.990189', 'resource_metadata': {'display_name': 'tempest-server-test-630575445', 'name': 'tap183afcdd-e7', 'instance_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3b:5f:95', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap183afcdd-e7'}, 'message_id': '90cc7268-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.41447286, 'message_signature': 'e60ef024981d9b822a47d5fbe0b898ef3b2e33ae9942a05ae82cb1e309cb434e'}]}, 'timestamp': '2026-01-26 08:54:03.990913', '_unique_id': 'edbf992c0632433aaa81e4ba704744aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.991 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.992 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.992 12 DEBUG ceilometer.compute.pollsters [-] ecac5cdf-f0c1-4352-aed6-69098da46fcd/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.993 12 DEBUG ceilometer.compute.pollsters [-] bdc88e07-80b0-4781-8f76-fa751b8b7000/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd8c695dd-38a6-412e-bc41-99536f24923c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-00000029-ecac5cdf-f0c1-4352-aed6-69098da46fcd-tap3fca56be-13', 'timestamp': '2026-01-26T08:54:03.992716', 'resource_metadata': {'display_name': 'tempest-server-test-2131100295', 'name': 'tap3fca56be-13', 'instance_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c2:c2:1f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3fca56be-13'}, 'message_id': '90cccbb4-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.408461612, 'message_signature': '09b1eabfe0e93a5f6511847759f151a8016480a8c78ae5cf78e3376b3b0930fd'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'instance-0000002a-bdc88e07-80b0-4781-8f76-fa751b8b7000-tap183afcdd-e7', 'timestamp': '2026-01-26T08:54:03.992716', 'resource_metadata': {'display_name': 'tempest-server-test-630575445', 'name': 'tap183afcdd-e7', 'instance_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3b:5f:95', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap183afcdd-e7'}, 'message_id': '90ccf652-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.41447286, 'message_signature': '59591c46bf6ef895cb55570e5dd8f73d809ef2f717cf47c7ef9be75fe4583777'}]}, 'timestamp': '2026-01-26 08:54:03.994442', '_unique_id': '2d3926b636fa40289385109dd7ffd993'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.995 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.996 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.996 12 DEBUG ceilometer.compute.pollsters [-] ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.996 12 DEBUG ceilometer.compute.pollsters [-] ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.997 12 DEBUG ceilometer.compute.pollsters [-] bdc88e07-80b0-4781-8f76-fa751b8b7000/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.997 12 DEBUG ceilometer.compute.pollsters [-] bdc88e07-80b0-4781-8f76-fa751b8b7000/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c44aab0-0606-424f-a17f-10d11438d6e5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd-vda', 'timestamp': '2026-01-26T08:54:03.996488', 'resource_metadata': {'display_name': 'tempest-server-test-2131100295', 'name': 'instance-00000029', 'instance_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '90cd5c46-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.526865017, 'message_signature': 'd37de72bb4d5396caf0d68d592e67aacdca1483828697e840a1b9a5aebeb030c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd-sda', 'timestamp': '2026-01-26T08:54:03.996488', 'resource_metadata': {'display_name': 'tempest-server-test-2131100295', 'name': 'instance-00000029', 'instance_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '90cd6a60-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.526865017, 'message_signature': '7b31c469fe8b745323157dd3ca60140859a20010215b02fa0956cd668c1412bc'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000-vda', 'timestamp': '2026-01-26T08:54:03.996488', 'resource_metadata': {'display_name': 'tempest-server-test-630575445', 'name': 'instance-0000002a', 'instance_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '90cd78c0-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.545859149, 'message_signature': '3c0e3e60731f44d5bc292e13ca4cfdb007b38d400012e62d54c9ce59e40e367a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000-sda', 'timestamp': '2026-01-26T08:54:03.996488', 'resource_metadata': {'display_name': 'tempest-server-test-630575445', 'name': 'instance-0000002a', 'instance_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '90cd872a-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.545859149, 'message_signature': '427121a48c638c57e15adf5d2e2a7ec573a6466e91df0c3e09788d94cd105b17'}]}, 'timestamp': '2026-01-26 08:54:03.997937', '_unique_id': '312af356e3364df18f0da611b293cd8a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:54:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.998 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.999 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:03.999 12 DEBUG ceilometer.compute.pollsters [-] ecac5cdf-f0c1-4352-aed6-69098da46fcd/memory.usage volume: 46.47265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.000 12 DEBUG ceilometer.compute.pollsters [-] bdc88e07-80b0-4781-8f76-fa751b8b7000/memory.usage volume: 46.546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '09571fe9-ef18-4dbc-87c1-8164f6999049', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 46.47265625, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd', 'timestamp': '2026-01-26T08:54:03.999963', 'resource_metadata': {'display_name': 'tempest-server-test-2131100295', 'name': 'instance-00000029', 'instance_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '90cde670-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.60433778, 'message_signature': '23badb69a0acd36020fb688846c6ce8b08f18ee62c7d01404a481ebf155ff46a'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 46.546875, 'user_id': '988ebc31182f4c94813f94306e399a2d', 'user_name': None, 'project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'project_name': None, 'resource_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000', 'timestamp': '2026-01-26T08:54:03.999963', 'resource_metadata': {'display_name': 'tempest-server-test-630575445', 'name': 'instance-0000002a', 'instance_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000', 'instance_type': 'm1.nano', 'host': '8269a62f8e815674a8d12f02d34d3c890184cb4cf17fc1f5b416a9dd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '90cdf2f0-fa94-11f0-b28a-fa163efc69df', 'monotonic_time': 3984.628462105, 'message_signature': 'fa34b88b9e47f768c7566d5091f85b5ee04b4bf167a9cbdf3fe770ae77726045'}]}, 'timestamp': '2026-01-26 08:54:04.000678', '_unique_id': 'a434e732413947e89f22eec7a27e65a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 08:54:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:54:04.001 12 ERROR oslo_messaging.notify.messaging 
Jan 26 08:54:04 compute-1 nova_compute[183083]: 2026-01-26 08:54:04.609 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:54:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:54:05.305 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:54:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:54:05.306 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:54:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:54:05.307 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:54:05 compute-1 nova_compute[183083]: 2026-01-26 08:54:05.318 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:54:05 compute-1 nova_compute[183083]: 2026-01-26 08:54:05.318 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 08:54:05 compute-1 nova_compute[183083]: 2026-01-26 08:54:05.319 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 08:54:05 compute-1 nova_compute[183083]: 2026-01-26 08:54:05.970 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:54:06 compute-1 nova_compute[183083]: 2026-01-26 08:54:06.270 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "refresh_cache-ecac5cdf-f0c1-4352-aed6-69098da46fcd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:54:06 compute-1 nova_compute[183083]: 2026-01-26 08:54:06.271 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquired lock "refresh_cache-ecac5cdf-f0c1-4352-aed6-69098da46fcd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:54:06 compute-1 nova_compute[183083]: 2026-01-26 08:54:06.271 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 26 08:54:06 compute-1 nova_compute[183083]: 2026-01-26 08:54:06.271 183087 DEBUG nova.objects.instance [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lazy-loading 'info_cache' on Instance uuid ecac5cdf-f0c1-4352-aed6-69098da46fcd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:54:06 compute-1 ovn_controller[95352]: 2026-01-26T08:54:06Z|00218|pinctrl|WARN|Dropped 479 log messages in last 62 seconds (most recently, 8 seconds ago) due to excessive rate
Jan 26 08:54:06 compute-1 ovn_controller[95352]: 2026-01-26T08:54:06Z|00219|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 08:54:06 compute-1 podman[217717]: 2026-01-26 08:54:06.80617817 +0000 UTC m=+0.068380575 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Jan 26 08:54:06 compute-1 podman[217718]: 2026-01-26 08:54:06.812441926 +0000 UTC m=+0.065150530 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 08:54:06 compute-1 podman[217716]: 2026-01-26 08:54:06.878535253 +0000 UTC m=+0.140317086 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 08:54:08 compute-1 nova_compute[183083]: 2026-01-26 08:54:08.091 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Updating instance_info_cache with network_info: [{"id": "3fca56be-1386-4c85-9cee-d17f63594484", "address": "fa:16:3e:c2:c2:1f", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fca56be-13", "ovs_interfaceid": "3fca56be-1386-4c85-9cee-d17f63594484", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:54:08 compute-1 nova_compute[183083]: 2026-01-26 08:54:08.290 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Releasing lock "refresh_cache-ecac5cdf-f0c1-4352-aed6-69098da46fcd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:54:08 compute-1 nova_compute[183083]: 2026-01-26 08:54:08.291 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 26 08:54:08 compute-1 nova_compute[183083]: 2026-01-26 08:54:08.291 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:54:08 compute-1 nova_compute[183083]: 2026-01-26 08:54:08.292 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:54:08 compute-1 nova_compute[183083]: 2026-01-26 08:54:08.292 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:54:08 compute-1 nova_compute[183083]: 2026-01-26 08:54:08.920 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:54:08 compute-1 nova_compute[183083]: 2026-01-26 08:54:08.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:54:08 compute-1 nova_compute[183083]: 2026-01-26 08:54:08.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:54:08 compute-1 nova_compute[183083]: 2026-01-26 08:54:08.951 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 08:54:09 compute-1 nova_compute[183083]: 2026-01-26 08:54:09.612 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:54:09 compute-1 nova_compute[183083]: 2026-01-26 08:54:09.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:54:10 compute-1 nova_compute[183083]: 2026-01-26 08:54:10.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:54:10 compute-1 nova_compute[183083]: 2026-01-26 08:54:10.972 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:54:10 compute-1 nova_compute[183083]: 2026-01-26 08:54:10.991 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:54:10 compute-1 nova_compute[183083]: 2026-01-26 08:54:10.991 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:54:10 compute-1 nova_compute[183083]: 2026-01-26 08:54:10.991 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:54:10 compute-1 nova_compute[183083]: 2026-01-26 08:54:10.992 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 08:54:11 compute-1 nova_compute[183083]: 2026-01-26 08:54:11.136 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:54:11 compute-1 nova_compute[183083]: 2026-01-26 08:54:11.194 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:54:11 compute-1 nova_compute[183083]: 2026-01-26 08:54:11.195 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:54:11 compute-1 nova_compute[183083]: 2026-01-26 08:54:11.244 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:54:11 compute-1 nova_compute[183083]: 2026-01-26 08:54:11.250 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bdc88e07-80b0-4781-8f76-fa751b8b7000/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:54:11 compute-1 nova_compute[183083]: 2026-01-26 08:54:11.302 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bdc88e07-80b0-4781-8f76-fa751b8b7000/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:54:11 compute-1 nova_compute[183083]: 2026-01-26 08:54:11.303 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bdc88e07-80b0-4781-8f76-fa751b8b7000/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:54:11 compute-1 nova_compute[183083]: 2026-01-26 08:54:11.361 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bdc88e07-80b0-4781-8f76-fa751b8b7000/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:54:11 compute-1 nova_compute[183083]: 2026-01-26 08:54:11.578 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:54:11 compute-1 nova_compute[183083]: 2026-01-26 08:54:11.580 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13407MB free_disk=113.03545761108398GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 08:54:11 compute-1 nova_compute[183083]: 2026-01-26 08:54:11.580 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:54:11 compute-1 nova_compute[183083]: 2026-01-26 08:54:11.581 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:54:11 compute-1 nova_compute[183083]: 2026-01-26 08:54:11.670 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Instance ecac5cdf-f0c1-4352-aed6-69098da46fcd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 08:54:11 compute-1 nova_compute[183083]: 2026-01-26 08:54:11.671 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Instance bdc88e07-80b0-4781-8f76-fa751b8b7000 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 08:54:11 compute-1 nova_compute[183083]: 2026-01-26 08:54:11.672 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 08:54:11 compute-1 nova_compute[183083]: 2026-01-26 08:54:11.672 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=768MB phys_disk=119GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 08:54:11 compute-1 nova_compute[183083]: 2026-01-26 08:54:11.727 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:54:11 compute-1 nova_compute[183083]: 2026-01-26 08:54:11.822 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:54:11 compute-1 nova_compute[183083]: 2026-01-26 08:54:11.845 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 08:54:11 compute-1 nova_compute[183083]: 2026-01-26 08:54:11.846 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.265s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:54:14 compute-1 nova_compute[183083]: 2026-01-26 08:54:14.615 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:54:15 compute-1 nova_compute[183083]: 2026-01-26 08:54:15.975 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:54:17 compute-1 podman[217794]: 2026-01-26 08:54:17.803836091 +0000 UTC m=+0.061399759 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 08:54:17 compute-1 ovn_controller[95352]: 2026-01-26T08:54:17Z|00035|pinctrl(ovn_pinctrl0)|WARN|truncated dns packet
Jan 26 08:54:17 compute-1 ovn_controller[95352]: 2026-01-26T08:54:17Z|00036|pinctrl(ovn_pinctrl0)|WARN|truncated dns packet
Jan 26 08:54:19 compute-1 nova_compute[183083]: 2026-01-26 08:54:19.618 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:54:20 compute-1 nova_compute[183083]: 2026-01-26 08:54:20.977 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:54:24 compute-1 nova_compute[183083]: 2026-01-26 08:54:24.669 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:54:25 compute-1 nova_compute[183083]: 2026-01-26 08:54:25.980 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:54:26 compute-1 ovn_controller[95352]: 2026-01-26T08:54:26Z|00220|memory_trim|INFO|Detected inactivity (last active 30020 ms ago): trimming memory
Jan 26 08:54:29 compute-1 nova_compute[183083]: 2026-01-26 08:54:29.672 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:54:30 compute-1 nova_compute[183083]: 2026-01-26 08:54:30.981 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:54:32 compute-1 podman[217819]: 2026-01-26 08:54:32.811321749 +0000 UTC m=+0.065978074 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9)
Jan 26 08:54:32 compute-1 podman[217818]: 2026-01-26 08:54:32.813295168 +0000 UTC m=+0.063784400 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 26 08:54:34 compute-1 nova_compute[183083]: 2026-01-26 08:54:34.675 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:54:35 compute-1 nova_compute[183083]: 2026-01-26 08:54:35.983 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:54:37 compute-1 podman[217866]: 2026-01-26 08:54:37.800559946 +0000 UTC m=+0.060580555 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 26 08:54:37 compute-1 podman[217867]: 2026-01-26 08:54:37.822771673 +0000 UTC m=+0.069193699 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 08:54:37 compute-1 podman[217865]: 2026-01-26 08:54:37.850170925 +0000 UTC m=+0.110764171 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 08:54:39 compute-1 nova_compute[183083]: 2026-01-26 08:54:39.679 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:54:40 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:54:40.335 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:54:40 compute-1 nova_compute[183083]: 2026-01-26 08:54:40.335 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:54:40 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:54:40.337 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 08:54:40 compute-1 ovn_controller[95352]: 2026-01-26T08:54:40Z|00037|pinctrl(ovn_pinctrl0)|WARN|truncated dns packet
Jan 26 08:54:40 compute-1 nova_compute[183083]: 2026-01-26 08:54:40.986 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:54:43 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:54:43.341 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:54:44 compute-1 nova_compute[183083]: 2026-01-26 08:54:44.716 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:54:45 compute-1 nova_compute[183083]: 2026-01-26 08:54:45.988 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:54:48 compute-1 podman[217938]: 2026-01-26 08:54:48.807326416 +0000 UTC m=+0.063425619 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 08:54:49 compute-1 sshd-session[217936]: Invalid user admin from 159.223.236.81 port 59992
Jan 26 08:54:49 compute-1 nova_compute[183083]: 2026-01-26 08:54:49.719 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:54:49 compute-1 sshd-session[217936]: Connection closed by invalid user admin 159.223.236.81 port 59992 [preauth]
Jan 26 08:54:50 compute-1 nova_compute[183083]: 2026-01-26 08:54:50.990 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:54:54 compute-1 nova_compute[183083]: 2026-01-26 08:54:54.722 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:54:56 compute-1 nova_compute[183083]: 2026-01-26 08:54:56.043 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:54:59 compute-1 nova_compute[183083]: 2026-01-26 08:54:59.780 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:01 compute-1 nova_compute[183083]: 2026-01-26 08:55:01.085 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:01 compute-1 ovn_controller[95352]: 2026-01-26T08:55:01Z|00038|pinctrl(ovn_pinctrl0)|WARN|Dropped 1 log messages in last 21 seconds (most recently, 21 seconds ago) due to excessive rate
Jan 26 08:55:01 compute-1 ovn_controller[95352]: 2026-01-26T08:55:01Z|00039|pinctrl(ovn_pinctrl0)|WARN|truncated dns packet
Jan 26 08:55:03 compute-1 podman[217967]: 2026-01-26 08:55:03.830517669 +0000 UTC m=+0.089223362 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 26 08:55:03 compute-1 podman[217968]: 2026-01-26 08:55:03.844470592 +0000 UTC m=+0.102651230 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, architecture=x86_64, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-type=git)
Jan 26 08:55:04 compute-1 nova_compute[183083]: 2026-01-26 08:55:04.858 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:55:05.306 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:55:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:55:05.307 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:55:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:55:05.308 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:55:05 compute-1 nova_compute[183083]: 2026-01-26 08:55:05.848 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:55:05 compute-1 nova_compute[183083]: 2026-01-26 08:55:05.848 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 08:55:06 compute-1 nova_compute[183083]: 2026-01-26 08:55:06.088 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:06 compute-1 nova_compute[183083]: 2026-01-26 08:55:06.289 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "refresh_cache-bdc88e07-80b0-4781-8f76-fa751b8b7000" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:55:06 compute-1 nova_compute[183083]: 2026-01-26 08:55:06.289 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquired lock "refresh_cache-bdc88e07-80b0-4781-8f76-fa751b8b7000" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:55:06 compute-1 nova_compute[183083]: 2026-01-26 08:55:06.289 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 26 08:55:06 compute-1 sshd-session[218007]: Invalid user solana from 2.57.122.238 port 49018
Jan 26 08:55:06 compute-1 sshd-session[218007]: Connection closed by invalid user solana 2.57.122.238 port 49018 [preauth]
Jan 26 08:55:08 compute-1 nova_compute[183083]: 2026-01-26 08:55:08.603 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Updating instance_info_cache with network_info: [{"id": "183afcdd-e728-4c4e-b370-a9d4517b3c30", "address": "fa:16:3e:3b:5f:95", "network": {"id": "cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6", "bridge": "br-int", "label": "tempest-test-network--874192606", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap183afcdd-e7", "ovs_interfaceid": "183afcdd-e728-4c4e-b370-a9d4517b3c30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:55:08 compute-1 nova_compute[183083]: 2026-01-26 08:55:08.630 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Releasing lock "refresh_cache-bdc88e07-80b0-4781-8f76-fa751b8b7000" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:55:08 compute-1 nova_compute[183083]: 2026-01-26 08:55:08.630 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 26 08:55:08 compute-1 nova_compute[183083]: 2026-01-26 08:55:08.631 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:55:08 compute-1 nova_compute[183083]: 2026-01-26 08:55:08.631 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:55:08 compute-1 nova_compute[183083]: 2026-01-26 08:55:08.729 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:55:08 compute-1 nova_compute[183083]: 2026-01-26 08:55:08.730 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:55:08 compute-1 ovn_controller[95352]: 2026-01-26T08:55:08Z|00221|pinctrl|WARN|Dropped 127 log messages in last 62 seconds (most recently, 10 seconds ago) due to excessive rate
Jan 26 08:55:08 compute-1 ovn_controller[95352]: 2026-01-26T08:55:08Z|00222|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 08:55:08 compute-1 podman[218010]: 2026-01-26 08:55:08.814882022 +0000 UTC m=+0.075142915 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 26 08:55:08 compute-1 podman[218011]: 2026-01-26 08:55:08.830280168 +0000 UTC m=+0.092775587 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 08:55:08 compute-1 podman[218009]: 2026-01-26 08:55:08.832417952 +0000 UTC m=+0.103478295 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 26 08:55:08 compute-1 nova_compute[183083]: 2026-01-26 08:55:08.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:55:08 compute-1 nova_compute[183083]: 2026-01-26 08:55:08.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:55:08 compute-1 nova_compute[183083]: 2026-01-26 08:55:08.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:55:08 compute-1 nova_compute[183083]: 2026-01-26 08:55:08.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 08:55:09 compute-1 nova_compute[183083]: 2026-01-26 08:55:09.862 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:09 compute-1 nova_compute[183083]: 2026-01-26 08:55:09.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:55:10 compute-1 nova_compute[183083]: 2026-01-26 08:55:10.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:55:10 compute-1 nova_compute[183083]: 2026-01-26 08:55:10.974 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:55:10 compute-1 nova_compute[183083]: 2026-01-26 08:55:10.975 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:55:10 compute-1 nova_compute[183083]: 2026-01-26 08:55:10.975 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:55:10 compute-1 nova_compute[183083]: 2026-01-26 08:55:10.975 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 08:55:11 compute-1 nova_compute[183083]: 2026-01-26 08:55:11.045 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:55:11 compute-1 nova_compute[183083]: 2026-01-26 08:55:11.090 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:11 compute-1 nova_compute[183083]: 2026-01-26 08:55:11.135 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:55:11 compute-1 nova_compute[183083]: 2026-01-26 08:55:11.137 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:55:11 compute-1 nova_compute[183083]: 2026-01-26 08:55:11.207 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ecac5cdf-f0c1-4352-aed6-69098da46fcd/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:55:11 compute-1 nova_compute[183083]: 2026-01-26 08:55:11.217 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bdc88e07-80b0-4781-8f76-fa751b8b7000/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:55:11 compute-1 nova_compute[183083]: 2026-01-26 08:55:11.282 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bdc88e07-80b0-4781-8f76-fa751b8b7000/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:55:11 compute-1 nova_compute[183083]: 2026-01-26 08:55:11.284 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bdc88e07-80b0-4781-8f76-fa751b8b7000/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:55:11 compute-1 nova_compute[183083]: 2026-01-26 08:55:11.342 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bdc88e07-80b0-4781-8f76-fa751b8b7000/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:55:11 compute-1 nova_compute[183083]: 2026-01-26 08:55:11.562 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:55:11 compute-1 nova_compute[183083]: 2026-01-26 08:55:11.563 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13454MB free_disk=113.03545761108398GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 08:55:11 compute-1 nova_compute[183083]: 2026-01-26 08:55:11.564 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:55:11 compute-1 nova_compute[183083]: 2026-01-26 08:55:11.564 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:55:11 compute-1 nova_compute[183083]: 2026-01-26 08:55:11.659 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Instance ecac5cdf-f0c1-4352-aed6-69098da46fcd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 08:55:11 compute-1 nova_compute[183083]: 2026-01-26 08:55:11.660 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Instance bdc88e07-80b0-4781-8f76-fa751b8b7000 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 08:55:11 compute-1 nova_compute[183083]: 2026-01-26 08:55:11.660 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 08:55:11 compute-1 nova_compute[183083]: 2026-01-26 08:55:11.660 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=768MB phys_disk=119GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 08:55:11 compute-1 nova_compute[183083]: 2026-01-26 08:55:11.677 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Refreshing inventories for resource provider 5203935e-446c-4e03-93fa-4c60d651e045 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 26 08:55:11 compute-1 nova_compute[183083]: 2026-01-26 08:55:11.696 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Updating ProviderTree inventory for provider 5203935e-446c-4e03-93fa-4c60d651e045 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 26 08:55:11 compute-1 nova_compute[183083]: 2026-01-26 08:55:11.696 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Updating inventory in ProviderTree for provider 5203935e-446c-4e03-93fa-4c60d651e045 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 26 08:55:11 compute-1 nova_compute[183083]: 2026-01-26 08:55:11.715 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Refreshing aggregate associations for resource provider 5203935e-446c-4e03-93fa-4c60d651e045, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 26 08:55:11 compute-1 nova_compute[183083]: 2026-01-26 08:55:11.734 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Refreshing trait associations for resource provider 5203935e-446c-4e03-93fa-4c60d651e045, traits: COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE4A,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SVM,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AVX,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX2,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_FMA3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,HW_CPU_X86_SSE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE41,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_BMI,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AESNI,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 26 08:55:11 compute-1 nova_compute[183083]: 2026-01-26 08:55:11.792 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:55:11 compute-1 nova_compute[183083]: 2026-01-26 08:55:11.807 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:55:11 compute-1 nova_compute[183083]: 2026-01-26 08:55:11.810 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 08:55:11 compute-1 nova_compute[183083]: 2026-01-26 08:55:11.810 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.246s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:55:14 compute-1 nova_compute[183083]: 2026-01-26 08:55:14.864 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:16 compute-1 nova_compute[183083]: 2026-01-26 08:55:16.092 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:19 compute-1 podman[218095]: 2026-01-26 08:55:19.815906025 +0000 UTC m=+0.074972230 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 26 08:55:19 compute-1 nova_compute[183083]: 2026-01-26 08:55:19.866 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:21 compute-1 nova_compute[183083]: 2026-01-26 08:55:21.125 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.086 183087 DEBUG oslo_concurrency.lockutils [None req-120cc92a-0d8f-4fea-8101-1dbf64f8a233 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "bdc88e07-80b0-4781-8f76-fa751b8b7000" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.087 183087 DEBUG oslo_concurrency.lockutils [None req-120cc92a-0d8f-4fea-8101-1dbf64f8a233 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "bdc88e07-80b0-4781-8f76-fa751b8b7000" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.087 183087 DEBUG oslo_concurrency.lockutils [None req-120cc92a-0d8f-4fea-8101-1dbf64f8a233 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "bdc88e07-80b0-4781-8f76-fa751b8b7000-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.088 183087 DEBUG oslo_concurrency.lockutils [None req-120cc92a-0d8f-4fea-8101-1dbf64f8a233 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "bdc88e07-80b0-4781-8f76-fa751b8b7000-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.088 183087 DEBUG oslo_concurrency.lockutils [None req-120cc92a-0d8f-4fea-8101-1dbf64f8a233 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "bdc88e07-80b0-4781-8f76-fa751b8b7000-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.090 183087 INFO nova.compute.manager [None req-120cc92a-0d8f-4fea-8101-1dbf64f8a233 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Terminating instance
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.092 183087 DEBUG nova.compute.manager [None req-120cc92a-0d8f-4fea-8101-1dbf64f8a233 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:55:22 compute-1 kernel: tap183afcdd-e7 (unregistering): left promiscuous mode
Jan 26 08:55:22 compute-1 NetworkManager[55451]: <info>  [1769417722.1286] device (tap183afcdd-e7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 08:55:22 compute-1 ovn_controller[95352]: 2026-01-26T08:55:22Z|00223|binding|INFO|Releasing lport 183afcdd-e728-4c4e-b370-a9d4517b3c30 from this chassis (sb_readonly=0)
Jan 26 08:55:22 compute-1 ovn_controller[95352]: 2026-01-26T08:55:22Z|00224|binding|INFO|Setting lport 183afcdd-e728-4c4e-b370-a9d4517b3c30 down in Southbound
Jan 26 08:55:22 compute-1 ovn_controller[95352]: 2026-01-26T08:55:22Z|00225|binding|INFO|Removing iface tap183afcdd-e7 ovn-installed in OVS
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.139 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:22 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:55:22.148 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3b:5f:95 10.100.0.23'], port_security=['fa:16:3e:3b:5f:95 10.100.0.23'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.23/28', 'neutron:device_id': 'bdc88e07-80b0-4781-8f76-fa751b8b7000', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a3ba4c82-55c4-4fcf-ba72-7b36eb733cd3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.203'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=506c646c-47ce-4c23-8b2d-329f437b8924, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=183afcdd-e728-4c4e-b370-a9d4517b3c30) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:55:22 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:55:22.151 104632 INFO neutron.agent.ovn.metadata.agent [-] Port 183afcdd-e728-4c4e-b370-a9d4517b3c30 in datapath cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6 unbound from our chassis
Jan 26 08:55:22 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:55:22.153 104632 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 26 08:55:22 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:55:22.155 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[b979e52f-ac5a-4b10-93f5-eb49dc655a8d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:55:22 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:55:22.156 104632 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6 namespace which is not needed anymore
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.201 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:22 compute-1 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000002a.scope: Deactivated successfully.
Jan 26 08:55:22 compute-1 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000002a.scope: Consumed 16.674s CPU time.
Jan 26 08:55:22 compute-1 systemd-machined[154360]: Machine qemu-13-instance-0000002a terminated.
Jan 26 08:55:22 compute-1 neutron-haproxy-ovnmeta-cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6[217558]: [NOTICE]   (217609) : haproxy version is 2.8.14-c23fe91
Jan 26 08:55:22 compute-1 neutron-haproxy-ovnmeta-cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6[217558]: [NOTICE]   (217609) : path to executable is /usr/sbin/haproxy
Jan 26 08:55:22 compute-1 neutron-haproxy-ovnmeta-cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6[217558]: [WARNING]  (217609) : Exiting Master process...
Jan 26 08:55:22 compute-1 neutron-haproxy-ovnmeta-cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6[217558]: [WARNING]  (217609) : Exiting Master process...
Jan 26 08:55:22 compute-1 neutron-haproxy-ovnmeta-cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6[217558]: [ALERT]    (217609) : Current worker (217624) exited with code 143 (Terminated)
Jan 26 08:55:22 compute-1 neutron-haproxy-ovnmeta-cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6[217558]: [WARNING]  (217609) : All workers exited. Exiting... (0)
Jan 26 08:55:22 compute-1 systemd[1]: libpod-70ad2cfa53eb5b335ce07a80b07510da63eb82f125ac33829ef1fa1775492876.scope: Deactivated successfully.
Jan 26 08:55:22 compute-1 podman[218143]: 2026-01-26 08:55:22.371212902 +0000 UTC m=+0.068887101 container died 70ad2cfa53eb5b335ce07a80b07510da63eb82f125ac33829ef1fa1775492876 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.374 183087 INFO nova.virt.libvirt.driver [-] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Instance destroyed successfully.
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.376 183087 DEBUG nova.objects.instance [None req-120cc92a-0d8f-4fea-8101-1dbf64f8a233 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lazy-loading 'resources' on Instance uuid bdc88e07-80b0-4781-8f76-fa751b8b7000 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.394 183087 DEBUG nova.virt.libvirt.vif [None req-120cc92a-0d8f-4fea-8101-1dbf64f8a233 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T08:53:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-630575445',display_name='tempest-server-test-630575445',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-630575445',id=42,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDY4V6+HgiJjlBp+sismnH//xIhvaHIyZ883PC4cnliN7XgVilpp1kBxLjUHPWqW57B2esyV5Ub1B2t4CQ7Ool6UJEN5MlISleL6j0D4sAsLrzbr7gNivUWbODLUMAl3Sw==',key_name='tempest-keypair-test-1986375941',keypairs=<?>,launch_index=0,launched_at=2026-01-26T08:53:35Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5d0c78b7cd584e4a90592d8ea01ce4ad',ramdisk_id='',reservation_id='r-lfxagesn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_
hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-NetworkDefaultSecGroupTest-1876093813',owner_user_name='tempest-NetworkDefaultSecGroupTest-1876093813-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T08:53:35Z,user_data=None,user_id='988ebc31182f4c94813f94306e399a2d',uuid=bdc88e07-80b0-4781-8f76-fa751b8b7000,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "183afcdd-e728-4c4e-b370-a9d4517b3c30", "address": "fa:16:3e:3b:5f:95", "network": {"id": "cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6", "bridge": "br-int", "label": "tempest-test-network--874192606", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap183afcdd-e7", "ovs_interfaceid": "183afcdd-e728-4c4e-b370-a9d4517b3c30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.394 183087 DEBUG nova.network.os_vif_util [None req-120cc92a-0d8f-4fea-8101-1dbf64f8a233 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Converting VIF {"id": "183afcdd-e728-4c4e-b370-a9d4517b3c30", "address": "fa:16:3e:3b:5f:95", "network": {"id": "cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6", "bridge": "br-int", "label": "tempest-test-network--874192606", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap183afcdd-e7", "ovs_interfaceid": "183afcdd-e728-4c4e-b370-a9d4517b3c30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.396 183087 DEBUG nova.network.os_vif_util [None req-120cc92a-0d8f-4fea-8101-1dbf64f8a233 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3b:5f:95,bridge_name='br-int',has_traffic_filtering=True,id=183afcdd-e728-4c4e-b370-a9d4517b3c30,network=Network(cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap183afcdd-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.396 183087 DEBUG os_vif [None req-120cc92a-0d8f-4fea-8101-1dbf64f8a233 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3b:5f:95,bridge_name='br-int',has_traffic_filtering=True,id=183afcdd-e728-4c4e-b370-a9d4517b3c30,network=Network(cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap183afcdd-e7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.400 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.401 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap183afcdd-e7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.403 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.406 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.410 183087 INFO os_vif [None req-120cc92a-0d8f-4fea-8101-1dbf64f8a233 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3b:5f:95,bridge_name='br-int',has_traffic_filtering=True,id=183afcdd-e728-4c4e-b370-a9d4517b3c30,network=Network(cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap183afcdd-e7')
Jan 26 08:55:22 compute-1 systemd[1]: var-lib-containers-storage-overlay-74c80c5cf0b6ef9e4c685b6160f265885588e27c6ccd9e683f2a9dd59b2ccf40-merged.mount: Deactivated successfully.
Jan 26 08:55:22 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-70ad2cfa53eb5b335ce07a80b07510da63eb82f125ac33829ef1fa1775492876-userdata-shm.mount: Deactivated successfully.
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.411 183087 INFO nova.virt.libvirt.driver [None req-120cc92a-0d8f-4fea-8101-1dbf64f8a233 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Deleting instance files /var/lib/nova/instances/bdc88e07-80b0-4781-8f76-fa751b8b7000_del
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.413 183087 INFO nova.virt.libvirt.driver [None req-120cc92a-0d8f-4fea-8101-1dbf64f8a233 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Deletion of /var/lib/nova/instances/bdc88e07-80b0-4781-8f76-fa751b8b7000_del complete
Jan 26 08:55:22 compute-1 podman[218143]: 2026-01-26 08:55:22.41740541 +0000 UTC m=+0.115079619 container cleanup 70ad2cfa53eb5b335ce07a80b07510da63eb82f125ac33829ef1fa1775492876 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 08:55:22 compute-1 systemd[1]: libpod-conmon-70ad2cfa53eb5b335ce07a80b07510da63eb82f125ac33829ef1fa1775492876.scope: Deactivated successfully.
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.481 183087 INFO nova.compute.manager [None req-120cc92a-0d8f-4fea-8101-1dbf64f8a233 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Took 0.39 seconds to destroy the instance on the hypervisor.
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.482 183087 DEBUG oslo.service.loopingcall [None req-120cc92a-0d8f-4fea-8101-1dbf64f8a233 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.482 183087 DEBUG nova.compute.manager [-] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.482 183087 DEBUG nova.network.neutron [-] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.489 183087 DEBUG nova.compute.manager [req-42336e36-a396-4aa3-af88-266d36c90cfc req-25602003-5497-460b-ad8a-1e3ad440cd2e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Received event network-vif-unplugged-183afcdd-e728-4c4e-b370-a9d4517b3c30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.489 183087 DEBUG oslo_concurrency.lockutils [req-42336e36-a396-4aa3-af88-266d36c90cfc req-25602003-5497-460b-ad8a-1e3ad440cd2e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "bdc88e07-80b0-4781-8f76-fa751b8b7000-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.490 183087 DEBUG oslo_concurrency.lockutils [req-42336e36-a396-4aa3-af88-266d36c90cfc req-25602003-5497-460b-ad8a-1e3ad440cd2e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "bdc88e07-80b0-4781-8f76-fa751b8b7000-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.490 183087 DEBUG oslo_concurrency.lockutils [req-42336e36-a396-4aa3-af88-266d36c90cfc req-25602003-5497-460b-ad8a-1e3ad440cd2e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "bdc88e07-80b0-4781-8f76-fa751b8b7000-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.490 183087 DEBUG nova.compute.manager [req-42336e36-a396-4aa3-af88-266d36c90cfc req-25602003-5497-460b-ad8a-1e3ad440cd2e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] No waiting events found dispatching network-vif-unplugged-183afcdd-e728-4c4e-b370-a9d4517b3c30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.490 183087 DEBUG nova.compute.manager [req-42336e36-a396-4aa3-af88-266d36c90cfc req-25602003-5497-460b-ad8a-1e3ad440cd2e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Received event network-vif-unplugged-183afcdd-e728-4c4e-b370-a9d4517b3c30 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 26 08:55:22 compute-1 podman[218185]: 2026-01-26 08:55:22.496177872 +0000 UTC m=+0.053541746 container remove 70ad2cfa53eb5b335ce07a80b07510da63eb82f125ac33829ef1fa1775492876 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 26 08:55:22 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:55:22.504 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[099f708d-df32-4d12-ba53-e291e5a98776]: (4, ('Mon Jan 26 08:55:22 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6 (70ad2cfa53eb5b335ce07a80b07510da63eb82f125ac33829ef1fa1775492876)\n70ad2cfa53eb5b335ce07a80b07510da63eb82f125ac33829ef1fa1775492876\nMon Jan 26 08:55:22 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6 (70ad2cfa53eb5b335ce07a80b07510da63eb82f125ac33829ef1fa1775492876)\n70ad2cfa53eb5b335ce07a80b07510da63eb82f125ac33829ef1fa1775492876\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:55:22 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:55:22.506 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[cbb7cce8-471b-4b68-90f3-ac723e125c63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:55:22 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:55:22.508 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcdbb53a4-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.510 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:22 compute-1 kernel: tapcdbb53a4-d0: left promiscuous mode
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.513 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:22 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:55:22.517 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[9458c7e0-4b03-4a06-b32a-0b1226fe75ee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.526 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:22 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:55:22.541 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[9fb8950e-df5c-4107-8d50-f11b9c95cb25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:55:22 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:55:22.543 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[d07b4a7f-afb2-4032-9bc8-b98a00c1f490]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:55:22 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:55:22.565 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[6de03fc5-2855-4c35-a067-6718fef99952]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 395587, 'reachable_time': 38181, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218203, 'error': None, 'target': 'ovnmeta-cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:55:22 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:55:22.569 105024 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cdbb53a4-d6e2-4117-a1d9-17121bfaf8b6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 26 08:55:22 compute-1 systemd[1]: run-netns-ovnmeta\x2dcdbb53a4\x2dd6e2\x2d4117\x2da1d9\x2d17121bfaf8b6.mount: Deactivated successfully.
Jan 26 08:55:22 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:55:22.569 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[c2fc8fdd-676d-430d-85a1-00cfcaeebbcf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:55:22 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:55:22.851 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:55:22 compute-1 nova_compute[183083]: 2026-01-26 08:55:22.851 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:22 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:55:22.853 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 08:55:23 compute-1 nova_compute[183083]: 2026-01-26 08:55:23.806 183087 DEBUG nova.network.neutron [-] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:55:23 compute-1 nova_compute[183083]: 2026-01-26 08:55:23.832 183087 INFO nova.compute.manager [-] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Took 1.35 seconds to deallocate network for instance.
Jan 26 08:55:23 compute-1 nova_compute[183083]: 2026-01-26 08:55:23.878 183087 DEBUG oslo_concurrency.lockutils [None req-120cc92a-0d8f-4fea-8101-1dbf64f8a233 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:55:23 compute-1 nova_compute[183083]: 2026-01-26 08:55:23.879 183087 DEBUG oslo_concurrency.lockutils [None req-120cc92a-0d8f-4fea-8101-1dbf64f8a233 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:55:23 compute-1 nova_compute[183083]: 2026-01-26 08:55:23.897 183087 DEBUG nova.compute.manager [req-7647a218-1811-4b64-a0f9-636153b73b15 req-d6aa6122-330e-4cff-8ee3-8331bd523287 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Received event network-vif-deleted-183afcdd-e728-4c4e-b370-a9d4517b3c30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:55:23 compute-1 nova_compute[183083]: 2026-01-26 08:55:23.964 183087 DEBUG nova.compute.provider_tree [None req-120cc92a-0d8f-4fea-8101-1dbf64f8a233 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:55:23 compute-1 nova_compute[183083]: 2026-01-26 08:55:23.981 183087 DEBUG nova.scheduler.client.report [None req-120cc92a-0d8f-4fea-8101-1dbf64f8a233 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:55:24 compute-1 nova_compute[183083]: 2026-01-26 08:55:24.004 183087 DEBUG oslo_concurrency.lockutils [None req-120cc92a-0d8f-4fea-8101-1dbf64f8a233 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:55:24 compute-1 nova_compute[183083]: 2026-01-26 08:55:24.036 183087 INFO nova.scheduler.client.report [None req-120cc92a-0d8f-4fea-8101-1dbf64f8a233 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Deleted allocations for instance bdc88e07-80b0-4781-8f76-fa751b8b7000
Jan 26 08:55:24 compute-1 nova_compute[183083]: 2026-01-26 08:55:24.106 183087 DEBUG oslo_concurrency.lockutils [None req-120cc92a-0d8f-4fea-8101-1dbf64f8a233 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "bdc88e07-80b0-4781-8f76-fa751b8b7000" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.019s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:55:24 compute-1 nova_compute[183083]: 2026-01-26 08:55:24.579 183087 DEBUG nova.compute.manager [req-91565911-491a-41a6-83f0-9d00d9e30a4d req-2a88796c-6409-4c0b-a2f9-c60fe8e1afce 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Received event network-vif-plugged-183afcdd-e728-4c4e-b370-a9d4517b3c30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:55:24 compute-1 nova_compute[183083]: 2026-01-26 08:55:24.580 183087 DEBUG oslo_concurrency.lockutils [req-91565911-491a-41a6-83f0-9d00d9e30a4d req-2a88796c-6409-4c0b-a2f9-c60fe8e1afce 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "bdc88e07-80b0-4781-8f76-fa751b8b7000-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:55:24 compute-1 nova_compute[183083]: 2026-01-26 08:55:24.580 183087 DEBUG oslo_concurrency.lockutils [req-91565911-491a-41a6-83f0-9d00d9e30a4d req-2a88796c-6409-4c0b-a2f9-c60fe8e1afce 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "bdc88e07-80b0-4781-8f76-fa751b8b7000-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:55:24 compute-1 nova_compute[183083]: 2026-01-26 08:55:24.581 183087 DEBUG oslo_concurrency.lockutils [req-91565911-491a-41a6-83f0-9d00d9e30a4d req-2a88796c-6409-4c0b-a2f9-c60fe8e1afce 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "bdc88e07-80b0-4781-8f76-fa751b8b7000-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:55:24 compute-1 nova_compute[183083]: 2026-01-26 08:55:24.581 183087 DEBUG nova.compute.manager [req-91565911-491a-41a6-83f0-9d00d9e30a4d req-2a88796c-6409-4c0b-a2f9-c60fe8e1afce 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] No waiting events found dispatching network-vif-plugged-183afcdd-e728-4c4e-b370-a9d4517b3c30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:55:24 compute-1 nova_compute[183083]: 2026-01-26 08:55:24.581 183087 WARNING nova.compute.manager [req-91565911-491a-41a6-83f0-9d00d9e30a4d req-2a88796c-6409-4c0b-a2f9-c60fe8e1afce 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Received unexpected event network-vif-plugged-183afcdd-e728-4c4e-b370-a9d4517b3c30 for instance with vm_state deleted and task_state None.
Jan 26 08:55:24 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:55:24.856 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:55:25 compute-1 nova_compute[183083]: 2026-01-26 08:55:25.313 183087 DEBUG oslo_concurrency.lockutils [None req-62ae567b-0114-4767-9009-b1dbea81399c 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "ecac5cdf-f0c1-4352-aed6-69098da46fcd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:55:25 compute-1 nova_compute[183083]: 2026-01-26 08:55:25.314 183087 DEBUG oslo_concurrency.lockutils [None req-62ae567b-0114-4767-9009-b1dbea81399c 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "ecac5cdf-f0c1-4352-aed6-69098da46fcd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:55:25 compute-1 nova_compute[183083]: 2026-01-26 08:55:25.314 183087 DEBUG oslo_concurrency.lockutils [None req-62ae567b-0114-4767-9009-b1dbea81399c 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "ecac5cdf-f0c1-4352-aed6-69098da46fcd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:55:25 compute-1 nova_compute[183083]: 2026-01-26 08:55:25.314 183087 DEBUG oslo_concurrency.lockutils [None req-62ae567b-0114-4767-9009-b1dbea81399c 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "ecac5cdf-f0c1-4352-aed6-69098da46fcd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:55:25 compute-1 nova_compute[183083]: 2026-01-26 08:55:25.315 183087 DEBUG oslo_concurrency.lockutils [None req-62ae567b-0114-4767-9009-b1dbea81399c 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "ecac5cdf-f0c1-4352-aed6-69098da46fcd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:55:25 compute-1 nova_compute[183083]: 2026-01-26 08:55:25.316 183087 INFO nova.compute.manager [None req-62ae567b-0114-4767-9009-b1dbea81399c 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Terminating instance
Jan 26 08:55:25 compute-1 nova_compute[183083]: 2026-01-26 08:55:25.317 183087 DEBUG nova.compute.manager [None req-62ae567b-0114-4767-9009-b1dbea81399c 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:55:25 compute-1 kernel: tap3fca56be-13 (unregistering): left promiscuous mode
Jan 26 08:55:25 compute-1 NetworkManager[55451]: <info>  [1769417725.3441] device (tap3fca56be-13): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 08:55:25 compute-1 nova_compute[183083]: 2026-01-26 08:55:25.357 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:25 compute-1 ovn_controller[95352]: 2026-01-26T08:55:25Z|00226|binding|INFO|Releasing lport 3fca56be-1386-4c85-9cee-d17f63594484 from this chassis (sb_readonly=0)
Jan 26 08:55:25 compute-1 ovn_controller[95352]: 2026-01-26T08:55:25Z|00227|binding|INFO|Setting lport 3fca56be-1386-4c85-9cee-d17f63594484 down in Southbound
Jan 26 08:55:25 compute-1 ovn_controller[95352]: 2026-01-26T08:55:25Z|00228|binding|INFO|Removing iface tap3fca56be-13 ovn-installed in OVS
Jan 26 08:55:25 compute-1 nova_compute[183083]: 2026-01-26 08:55:25.361 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:25 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:55:25.370 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c2:c2:1f 10.100.0.5'], port_security=['fa:16:3e:c2:c2:1f 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'ecac5cdf-f0c1-4352-aed6-69098da46fcd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ce3cd186-bdaf-40d4-a276-e9139fe3dfec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5d0c78b7cd584e4a90592d8ea01ce4ad', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a3ba4c82-55c4-4fcf-ba72-7b36eb733cd3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.189'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=97b37269-bb63-498a-80e2-0f1154aea97c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=3fca56be-1386-4c85-9cee-d17f63594484) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:55:25 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:55:25.372 104632 INFO neutron.agent.ovn.metadata.agent [-] Port 3fca56be-1386-4c85-9cee-d17f63594484 in datapath ce3cd186-bdaf-40d4-a276-e9139fe3dfec unbound from our chassis
Jan 26 08:55:25 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:55:25.375 104632 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ce3cd186-bdaf-40d4-a276-e9139fe3dfec, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 26 08:55:25 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:55:25.376 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[b6731827-2181-4be4-985b-51b255bbb884]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:55:25 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:55:25.377 104632 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec namespace which is not needed anymore
Jan 26 08:55:25 compute-1 nova_compute[183083]: 2026-01-26 08:55:25.431 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:25 compute-1 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000029.scope: Deactivated successfully.
Jan 26 08:55:25 compute-1 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000029.scope: Consumed 17.849s CPU time.
Jan 26 08:55:25 compute-1 systemd-machined[154360]: Machine qemu-12-instance-00000029 terminated.
Jan 26 08:55:25 compute-1 NetworkManager[55451]: <info>  [1769417725.5388] manager: (tap3fca56be-13): new Tun device (/org/freedesktop/NetworkManager/Devices/83)
Jan 26 08:55:25 compute-1 nova_compute[183083]: 2026-01-26 08:55:25.541 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:25 compute-1 nova_compute[183083]: 2026-01-26 08:55:25.548 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:25 compute-1 neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec[217336]: [NOTICE]   (217340) : haproxy version is 2.8.14-c23fe91
Jan 26 08:55:25 compute-1 neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec[217336]: [NOTICE]   (217340) : path to executable is /usr/sbin/haproxy
Jan 26 08:55:25 compute-1 neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec[217336]: [WARNING]  (217340) : Exiting Master process...
Jan 26 08:55:25 compute-1 neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec[217336]: [WARNING]  (217340) : Exiting Master process...
Jan 26 08:55:25 compute-1 neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec[217336]: [ALERT]    (217340) : Current worker (217342) exited with code 143 (Terminated)
Jan 26 08:55:25 compute-1 neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec[217336]: [WARNING]  (217340) : All workers exited. Exiting... (0)
Jan 26 08:55:25 compute-1 systemd[1]: libpod-b72e0b7eadad308ec7240c1f5ee90e45645d7b7d245e25873689ed168f46e6b4.scope: Deactivated successfully.
Jan 26 08:55:25 compute-1 podman[218228]: 2026-01-26 08:55:25.560507981 +0000 UTC m=+0.064694860 container died b72e0b7eadad308ec7240c1f5ee90e45645d7b7d245e25873689ed168f46e6b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 08:55:25 compute-1 nova_compute[183083]: 2026-01-26 08:55:25.579 183087 INFO nova.virt.libvirt.driver [-] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Instance destroyed successfully.
Jan 26 08:55:25 compute-1 nova_compute[183083]: 2026-01-26 08:55:25.581 183087 DEBUG nova.objects.instance [None req-62ae567b-0114-4767-9009-b1dbea81399c 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lazy-loading 'resources' on Instance uuid ecac5cdf-f0c1-4352-aed6-69098da46fcd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:55:25 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b72e0b7eadad308ec7240c1f5ee90e45645d7b7d245e25873689ed168f46e6b4-userdata-shm.mount: Deactivated successfully.
Jan 26 08:55:25 compute-1 systemd[1]: var-lib-containers-storage-overlay-2be71656cc2bd3a1fa2aae6a101e1095114b35e78ac0c31978658c67b61d80a2-merged.mount: Deactivated successfully.
Jan 26 08:55:25 compute-1 nova_compute[183083]: 2026-01-26 08:55:25.597 183087 DEBUG nova.virt.libvirt.vif [None req-62ae567b-0114-4767-9009-b1dbea81399c 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T08:53:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-2131100295',display_name='tempest-server-test-2131100295',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-2131100295',id=41,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDY4V6+HgiJjlBp+sismnH//xIhvaHIyZ883PC4cnliN7XgVilpp1kBxLjUHPWqW57B2esyV5Ub1B2t4CQ7Ool6UJEN5MlISleL6j0D4sAsLrzbr7gNivUWbODLUMAl3Sw==',key_name='tempest-keypair-test-1986375941',keypairs=<?>,launch_index=0,launched_at=2026-01-26T08:53:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5d0c78b7cd584e4a90592d8ea01ce4ad',ramdisk_id='',reservation_id='r-v0teuk06',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',ima
ge_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-NetworkDefaultSecGroupTest-1876093813',owner_user_name='tempest-NetworkDefaultSecGroupTest-1876093813-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T08:53:12Z,user_data=None,user_id='988ebc31182f4c94813f94306e399a2d',uuid=ecac5cdf-f0c1-4352-aed6-69098da46fcd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3fca56be-1386-4c85-9cee-d17f63594484", "address": "fa:16:3e:c2:c2:1f", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fca56be-13", "ovs_interfaceid": "3fca56be-1386-4c85-9cee-d17f63594484", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:55:25 compute-1 nova_compute[183083]: 2026-01-26 08:55:25.598 183087 DEBUG nova.network.os_vif_util [None req-62ae567b-0114-4767-9009-b1dbea81399c 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Converting VIF {"id": "3fca56be-1386-4c85-9cee-d17f63594484", "address": "fa:16:3e:c2:c2:1f", "network": {"id": "ce3cd186-bdaf-40d4-a276-e9139fe3dfec", "bridge": "br-int", "label": "tempest-test-network--1457851416", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d0c78b7cd584e4a90592d8ea01ce4ad", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fca56be-13", "ovs_interfaceid": "3fca56be-1386-4c85-9cee-d17f63594484", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:55:25 compute-1 podman[218228]: 2026-01-26 08:55:25.59869172 +0000 UTC m=+0.102878599 container cleanup b72e0b7eadad308ec7240c1f5ee90e45645d7b7d245e25873689ed168f46e6b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 08:55:25 compute-1 nova_compute[183083]: 2026-01-26 08:55:25.599 183087 DEBUG nova.network.os_vif_util [None req-62ae567b-0114-4767-9009-b1dbea81399c 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c2:c2:1f,bridge_name='br-int',has_traffic_filtering=True,id=3fca56be-1386-4c85-9cee-d17f63594484,network=Network(ce3cd186-bdaf-40d4-a276-e9139fe3dfec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3fca56be-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:55:25 compute-1 nova_compute[183083]: 2026-01-26 08:55:25.600 183087 DEBUG os_vif [None req-62ae567b-0114-4767-9009-b1dbea81399c 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c2:c2:1f,bridge_name='br-int',has_traffic_filtering=True,id=3fca56be-1386-4c85-9cee-d17f63594484,network=Network(ce3cd186-bdaf-40d4-a276-e9139fe3dfec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3fca56be-13') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:55:25 compute-1 nova_compute[183083]: 2026-01-26 08:55:25.604 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:25 compute-1 nova_compute[183083]: 2026-01-26 08:55:25.605 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3fca56be-13, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:55:25 compute-1 nova_compute[183083]: 2026-01-26 08:55:25.609 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:25 compute-1 nova_compute[183083]: 2026-01-26 08:55:25.611 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:25 compute-1 systemd[1]: libpod-conmon-b72e0b7eadad308ec7240c1f5ee90e45645d7b7d245e25873689ed168f46e6b4.scope: Deactivated successfully.
Jan 26 08:55:25 compute-1 nova_compute[183083]: 2026-01-26 08:55:25.614 183087 INFO os_vif [None req-62ae567b-0114-4767-9009-b1dbea81399c 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c2:c2:1f,bridge_name='br-int',has_traffic_filtering=True,id=3fca56be-1386-4c85-9cee-d17f63594484,network=Network(ce3cd186-bdaf-40d4-a276-e9139fe3dfec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3fca56be-13')
Jan 26 08:55:25 compute-1 nova_compute[183083]: 2026-01-26 08:55:25.615 183087 INFO nova.virt.libvirt.driver [None req-62ae567b-0114-4767-9009-b1dbea81399c 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Deleting instance files /var/lib/nova/instances/ecac5cdf-f0c1-4352-aed6-69098da46fcd_del
Jan 26 08:55:25 compute-1 nova_compute[183083]: 2026-01-26 08:55:25.616 183087 INFO nova.virt.libvirt.driver [None req-62ae567b-0114-4767-9009-b1dbea81399c 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Deletion of /var/lib/nova/instances/ecac5cdf-f0c1-4352-aed6-69098da46fcd_del complete
Jan 26 08:55:25 compute-1 nova_compute[183083]: 2026-01-26 08:55:25.662 183087 INFO nova.compute.manager [None req-62ae567b-0114-4767-9009-b1dbea81399c 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Took 0.35 seconds to destroy the instance on the hypervisor.
Jan 26 08:55:25 compute-1 nova_compute[183083]: 2026-01-26 08:55:25.663 183087 DEBUG oslo.service.loopingcall [None req-62ae567b-0114-4767-9009-b1dbea81399c 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 26 08:55:25 compute-1 nova_compute[183083]: 2026-01-26 08:55:25.664 183087 DEBUG nova.compute.manager [-] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:55:25 compute-1 nova_compute[183083]: 2026-01-26 08:55:25.664 183087 DEBUG nova.network.neutron [-] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:55:25 compute-1 podman[218274]: 2026-01-26 08:55:25.673522164 +0000 UTC m=+0.046909996 container remove b72e0b7eadad308ec7240c1f5ee90e45645d7b7d245e25873689ed168f46e6b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 26 08:55:25 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:55:25.681 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[ee2a9285-72e5-4e89-9ed8-8f853fcec958]: (4, ('Mon Jan 26 08:55:25 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec (b72e0b7eadad308ec7240c1f5ee90e45645d7b7d245e25873689ed168f46e6b4)\nb72e0b7eadad308ec7240c1f5ee90e45645d7b7d245e25873689ed168f46e6b4\nMon Jan 26 08:55:25 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec (b72e0b7eadad308ec7240c1f5ee90e45645d7b7d245e25873689ed168f46e6b4)\nb72e0b7eadad308ec7240c1f5ee90e45645d7b7d245e25873689ed168f46e6b4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:55:25 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:55:25.683 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[b4e9d44a-d1dd-4560-ae32-b5a2fd20b104]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:55:25 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:55:25.684 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapce3cd186-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:55:25 compute-1 nova_compute[183083]: 2026-01-26 08:55:25.686 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:25 compute-1 kernel: tapce3cd186-b0: left promiscuous mode
Jan 26 08:55:25 compute-1 nova_compute[183083]: 2026-01-26 08:55:25.710 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:25 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:55:25.713 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[66638846-5138-4fb4-899c-bbc54b5c60a6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:55:25 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:55:25.731 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[e03cb990-e852-4c57-a943-959ce45a569a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:55:25 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:55:25.732 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[284cf9e7-db05-4cdb-8b48-aa9c7186566d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:55:25 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:55:25.750 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[cabede02-1091-4a61-95e0-87734344c207]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 393236, 'reachable_time': 16687, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218289, 'error': None, 'target': 'ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:55:25 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:55:25.753 105024 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ce3cd186-bdaf-40d4-a276-e9139fe3dfec deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 26 08:55:25 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:55:25.753 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[69f0cadf-2c22-4036-aa05-bdd030b896ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:55:25 compute-1 systemd[1]: run-netns-ovnmeta\x2dce3cd186\x2dbdaf\x2d40d4\x2da276\x2de9139fe3dfec.mount: Deactivated successfully.
Jan 26 08:55:26 compute-1 nova_compute[183083]: 2026-01-26 08:55:26.128 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:27 compute-1 nova_compute[183083]: 2026-01-26 08:55:27.112 183087 DEBUG nova.compute.manager [req-809f5d6c-3e6a-40fd-a519-bac3b612db82 req-24c35837-1e53-4202-9605-1779782303f2 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Received event network-vif-unplugged-3fca56be-1386-4c85-9cee-d17f63594484 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:55:27 compute-1 nova_compute[183083]: 2026-01-26 08:55:27.112 183087 DEBUG oslo_concurrency.lockutils [req-809f5d6c-3e6a-40fd-a519-bac3b612db82 req-24c35837-1e53-4202-9605-1779782303f2 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "ecac5cdf-f0c1-4352-aed6-69098da46fcd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:55:27 compute-1 nova_compute[183083]: 2026-01-26 08:55:27.113 183087 DEBUG oslo_concurrency.lockutils [req-809f5d6c-3e6a-40fd-a519-bac3b612db82 req-24c35837-1e53-4202-9605-1779782303f2 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "ecac5cdf-f0c1-4352-aed6-69098da46fcd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:55:27 compute-1 nova_compute[183083]: 2026-01-26 08:55:27.113 183087 DEBUG oslo_concurrency.lockutils [req-809f5d6c-3e6a-40fd-a519-bac3b612db82 req-24c35837-1e53-4202-9605-1779782303f2 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "ecac5cdf-f0c1-4352-aed6-69098da46fcd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:55:27 compute-1 nova_compute[183083]: 2026-01-26 08:55:27.114 183087 DEBUG nova.compute.manager [req-809f5d6c-3e6a-40fd-a519-bac3b612db82 req-24c35837-1e53-4202-9605-1779782303f2 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] No waiting events found dispatching network-vif-unplugged-3fca56be-1386-4c85-9cee-d17f63594484 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:55:27 compute-1 nova_compute[183083]: 2026-01-26 08:55:27.114 183087 DEBUG nova.compute.manager [req-809f5d6c-3e6a-40fd-a519-bac3b612db82 req-24c35837-1e53-4202-9605-1779782303f2 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Received event network-vif-unplugged-3fca56be-1386-4c85-9cee-d17f63594484 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 26 08:55:27 compute-1 nova_compute[183083]: 2026-01-26 08:55:27.783 183087 DEBUG nova.network.neutron [-] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:55:27 compute-1 nova_compute[183083]: 2026-01-26 08:55:27.809 183087 INFO nova.compute.manager [-] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Took 2.15 seconds to deallocate network for instance.
Jan 26 08:55:27 compute-1 nova_compute[183083]: 2026-01-26 08:55:27.879 183087 DEBUG oslo_concurrency.lockutils [None req-62ae567b-0114-4767-9009-b1dbea81399c 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:55:27 compute-1 nova_compute[183083]: 2026-01-26 08:55:27.879 183087 DEBUG oslo_concurrency.lockutils [None req-62ae567b-0114-4767-9009-b1dbea81399c 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:55:27 compute-1 nova_compute[183083]: 2026-01-26 08:55:27.921 183087 DEBUG nova.compute.provider_tree [None req-62ae567b-0114-4767-9009-b1dbea81399c 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:55:27 compute-1 nova_compute[183083]: 2026-01-26 08:55:27.935 183087 DEBUG nova.scheduler.client.report [None req-62ae567b-0114-4767-9009-b1dbea81399c 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:55:27 compute-1 nova_compute[183083]: 2026-01-26 08:55:27.960 183087 DEBUG oslo_concurrency.lockutils [None req-62ae567b-0114-4767-9009-b1dbea81399c 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.081s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:55:27 compute-1 nova_compute[183083]: 2026-01-26 08:55:27.992 183087 INFO nova.scheduler.client.report [None req-62ae567b-0114-4767-9009-b1dbea81399c 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Deleted allocations for instance ecac5cdf-f0c1-4352-aed6-69098da46fcd
Jan 26 08:55:28 compute-1 nova_compute[183083]: 2026-01-26 08:55:28.077 183087 DEBUG oslo_concurrency.lockutils [None req-62ae567b-0114-4767-9009-b1dbea81399c 988ebc31182f4c94813f94306e399a2d 5d0c78b7cd584e4a90592d8ea01ce4ad - - default default] Lock "ecac5cdf-f0c1-4352-aed6-69098da46fcd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.763s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:55:29 compute-1 nova_compute[183083]: 2026-01-26 08:55:29.203 183087 DEBUG nova.compute.manager [req-d9391df8-24b5-49c3-a6d4-73eb7e9782f9 req-e70d718c-646d-4096-b970-65387b7a137a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Received event network-vif-plugged-3fca56be-1386-4c85-9cee-d17f63594484 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:55:29 compute-1 nova_compute[183083]: 2026-01-26 08:55:29.203 183087 DEBUG oslo_concurrency.lockutils [req-d9391df8-24b5-49c3-a6d4-73eb7e9782f9 req-e70d718c-646d-4096-b970-65387b7a137a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "ecac5cdf-f0c1-4352-aed6-69098da46fcd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:55:29 compute-1 nova_compute[183083]: 2026-01-26 08:55:29.204 183087 DEBUG oslo_concurrency.lockutils [req-d9391df8-24b5-49c3-a6d4-73eb7e9782f9 req-e70d718c-646d-4096-b970-65387b7a137a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "ecac5cdf-f0c1-4352-aed6-69098da46fcd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:55:29 compute-1 nova_compute[183083]: 2026-01-26 08:55:29.204 183087 DEBUG oslo_concurrency.lockutils [req-d9391df8-24b5-49c3-a6d4-73eb7e9782f9 req-e70d718c-646d-4096-b970-65387b7a137a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "ecac5cdf-f0c1-4352-aed6-69098da46fcd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:55:29 compute-1 nova_compute[183083]: 2026-01-26 08:55:29.205 183087 DEBUG nova.compute.manager [req-d9391df8-24b5-49c3-a6d4-73eb7e9782f9 req-e70d718c-646d-4096-b970-65387b7a137a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] No waiting events found dispatching network-vif-plugged-3fca56be-1386-4c85-9cee-d17f63594484 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:55:29 compute-1 nova_compute[183083]: 2026-01-26 08:55:29.205 183087 WARNING nova.compute.manager [req-d9391df8-24b5-49c3-a6d4-73eb7e9782f9 req-e70d718c-646d-4096-b970-65387b7a137a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Received unexpected event network-vif-plugged-3fca56be-1386-4c85-9cee-d17f63594484 for instance with vm_state deleted and task_state None.
Jan 26 08:55:29 compute-1 nova_compute[183083]: 2026-01-26 08:55:29.206 183087 DEBUG nova.compute.manager [req-d9391df8-24b5-49c3-a6d4-73eb7e9782f9 req-e70d718c-646d-4096-b970-65387b7a137a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Received event network-vif-deleted-3fca56be-1386-4c85-9cee-d17f63594484 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:55:30 compute-1 nova_compute[183083]: 2026-01-26 08:55:30.612 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:31 compute-1 nova_compute[183083]: 2026-01-26 08:55:31.130 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:34 compute-1 podman[218292]: 2026-01-26 08:55:34.801854415 +0000 UTC m=+0.060709406 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 08:55:34 compute-1 podman[218291]: 2026-01-26 08:55:34.838257843 +0000 UTC m=+0.103461753 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 26 08:55:35 compute-1 nova_compute[183083]: 2026-01-26 08:55:35.656 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:36 compute-1 nova_compute[183083]: 2026-01-26 08:55:36.131 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:37 compute-1 nova_compute[183083]: 2026-01-26 08:55:37.371 183087 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769417722.3692527, bdc88e07-80b0-4781-8f76-fa751b8b7000 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:55:37 compute-1 nova_compute[183083]: 2026-01-26 08:55:37.371 183087 INFO nova.compute.manager [-] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] VM Stopped (Lifecycle Event)
Jan 26 08:55:37 compute-1 nova_compute[183083]: 2026-01-26 08:55:37.614 183087 DEBUG nova.compute.manager [None req-fd2ea03b-29fe-4839-ba5f-40481031ab5b - - - - - -] [instance: bdc88e07-80b0-4781-8f76-fa751b8b7000] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:55:38 compute-1 nova_compute[183083]: 2026-01-26 08:55:38.180 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:38 compute-1 nova_compute[183083]: 2026-01-26 08:55:38.307 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:39 compute-1 podman[218336]: 2026-01-26 08:55:39.802801534 +0000 UTC m=+0.063919326 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 08:55:39 compute-1 podman[218335]: 2026-01-26 08:55:39.823000105 +0000 UTC m=+0.089265653 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 26 08:55:39 compute-1 podman[218334]: 2026-01-26 08:55:39.835876619 +0000 UTC m=+0.105981985 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, 
io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 26 08:55:40 compute-1 nova_compute[183083]: 2026-01-26 08:55:40.577 183087 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769417725.5762258, ecac5cdf-f0c1-4352-aed6-69098da46fcd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:55:40 compute-1 nova_compute[183083]: 2026-01-26 08:55:40.578 183087 INFO nova.compute.manager [-] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] VM Stopped (Lifecycle Event)
Jan 26 08:55:40 compute-1 nova_compute[183083]: 2026-01-26 08:55:40.598 183087 DEBUG nova.compute.manager [None req-2f35eb29-b627-41b0-b2f6-fdaf0a2ae389 - - - - - -] [instance: ecac5cdf-f0c1-4352-aed6-69098da46fcd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:55:40 compute-1 nova_compute[183083]: 2026-01-26 08:55:40.706 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:41 compute-1 nova_compute[183083]: 2026-01-26 08:55:41.131 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:45 compute-1 nova_compute[183083]: 2026-01-26 08:55:45.752 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:46 compute-1 nova_compute[183083]: 2026-01-26 08:55:46.133 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:46 compute-1 sshd-session[218403]: Invalid user admin from 159.223.236.81 port 54208
Jan 26 08:55:46 compute-1 sshd-session[218403]: Connection closed by invalid user admin 159.223.236.81 port 54208 [preauth]
Jan 26 08:55:50 compute-1 nova_compute[183083]: 2026-01-26 08:55:50.755 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:50 compute-1 podman[218405]: 2026-01-26 08:55:50.805740537 +0000 UTC m=+0.069051641 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 26 08:55:51 compute-1 nova_compute[183083]: 2026-01-26 08:55:51.163 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:55 compute-1 nova_compute[183083]: 2026-01-26 08:55:55.758 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:55:56 compute-1 nova_compute[183083]: 2026-01-26 08:55:56.164 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:56:00 compute-1 nova_compute[183083]: 2026-01-26 08:56:00.775 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:56:01 compute-1 nova_compute[183083]: 2026-01-26 08:56:01.166 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:56:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:56:03.742 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:56:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:56:03.742 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:56:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:56:03.742 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:56:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:56:03.743 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:56:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:56:03.743 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:56:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:56:03.743 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:56:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:56:03.743 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:56:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:56:03.743 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:56:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:56:03.743 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:56:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:56:03.743 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:56:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:56:03.743 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:56:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:56:03.743 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:56:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:56:03.743 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:56:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:56:03.743 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:56:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:56:03.743 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:56:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:56:03.744 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:56:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:56:03.744 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:56:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:56:03.744 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:56:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:56:03.744 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:56:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:56:03.744 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:56:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:56:03.744 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:56:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:56:03.744 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:56:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:56:03.744 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:56:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:56:03.744 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:56:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:56:03.744 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:56:04 compute-1 sshd-session[218431]: userauth_pubkey: signature algorithm ssh-rsa not in PubkeyAcceptedAlgorithms [preauth]
Jan 26 08:56:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:56:05.307 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:56:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:56:05.307 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:56:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:56:05.308 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:56:05 compute-1 nova_compute[183083]: 2026-01-26 08:56:05.777 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:56:05 compute-1 podman[218433]: 2026-01-26 08:56:05.811753013 +0000 UTC m=+0.070217365 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Jan 26 08:56:05 compute-1 podman[218434]: 2026-01-26 08:56:05.836600245 +0000 UTC m=+0.098318438 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, name=ubi9-minimal, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1755695350)
Jan 26 08:56:06 compute-1 nova_compute[183083]: 2026-01-26 08:56:06.177 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:56:07 compute-1 nova_compute[183083]: 2026-01-26 08:56:07.811 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:56:07 compute-1 nova_compute[183083]: 2026-01-26 08:56:07.812 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 08:56:07 compute-1 nova_compute[183083]: 2026-01-26 08:56:07.812 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 08:56:07 compute-1 nova_compute[183083]: 2026-01-26 08:56:07.826 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 08:56:07 compute-1 nova_compute[183083]: 2026-01-26 08:56:07.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:56:07 compute-1 nova_compute[183083]: 2026-01-26 08:56:07.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:56:08 compute-1 nova_compute[183083]: 2026-01-26 08:56:08.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:56:09 compute-1 ovn_controller[95352]: 2026-01-26T08:56:09Z|00229|pinctrl|WARN|Dropped 531 log messages in last 61 seconds (most recently, 11 seconds ago) due to excessive rate
Jan 26 08:56:09 compute-1 ovn_controller[95352]: 2026-01-26T08:56:09Z|00230|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 08:56:09 compute-1 ovn_controller[95352]: 2026-01-26T08:56:09Z|00231|memory_trim|INFO|Detected inactivity (last active 30013 ms ago): trimming memory
Jan 26 08:56:09 compute-1 nova_compute[183083]: 2026-01-26 08:56:09.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:56:09 compute-1 nova_compute[183083]: 2026-01-26 08:56:09.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:56:10 compute-1 nova_compute[183083]: 2026-01-26 08:56:10.803 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:56:10 compute-1 podman[218475]: 2026-01-26 08:56:10.842110463 +0000 UTC m=+0.091932428 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 08:56:10 compute-1 podman[218474]: 2026-01-26 08:56:10.873450148 +0000 UTC m=+0.124195599 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_controller)
Jan 26 08:56:10 compute-1 podman[218476]: 2026-01-26 08:56:10.878563503 +0000 UTC m=+0.125006923 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 08:56:10 compute-1 nova_compute[183083]: 2026-01-26 08:56:10.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:56:10 compute-1 nova_compute[183083]: 2026-01-26 08:56:10.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:56:10 compute-1 nova_compute[183083]: 2026-01-26 08:56:10.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 08:56:11 compute-1 nova_compute[183083]: 2026-01-26 08:56:11.180 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:56:11 compute-1 nova_compute[183083]: 2026-01-26 08:56:11.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:56:11 compute-1 nova_compute[183083]: 2026-01-26 08:56:11.986 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:56:11 compute-1 nova_compute[183083]: 2026-01-26 08:56:11.986 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:56:11 compute-1 nova_compute[183083]: 2026-01-26 08:56:11.987 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:56:11 compute-1 nova_compute[183083]: 2026-01-26 08:56:11.987 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 08:56:12 compute-1 nova_compute[183083]: 2026-01-26 08:56:12.208 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:56:12 compute-1 nova_compute[183083]: 2026-01-26 08:56:12.209 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13773MB free_disk=113.09375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 08:56:12 compute-1 nova_compute[183083]: 2026-01-26 08:56:12.210 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:56:12 compute-1 nova_compute[183083]: 2026-01-26 08:56:12.210 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:56:12 compute-1 nova_compute[183083]: 2026-01-26 08:56:12.273 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 08:56:12 compute-1 nova_compute[183083]: 2026-01-26 08:56:12.274 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 08:56:12 compute-1 nova_compute[183083]: 2026-01-26 08:56:12.297 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:56:12 compute-1 nova_compute[183083]: 2026-01-26 08:56:12.312 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:56:12 compute-1 nova_compute[183083]: 2026-01-26 08:56:12.334 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 08:56:12 compute-1 nova_compute[183083]: 2026-01-26 08:56:12.335 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:56:13 compute-1 sshd-session[218431]: Connection closed by authenticating user root 139.19.117.130 port 37400 [preauth]
Jan 26 08:56:15 compute-1 nova_compute[183083]: 2026-01-26 08:56:15.806 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:56:16 compute-1 nova_compute[183083]: 2026-01-26 08:56:16.182 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:56:20 compute-1 nova_compute[183083]: 2026-01-26 08:56:20.810 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:56:21 compute-1 nova_compute[183083]: 2026-01-26 08:56:21.184 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:56:21 compute-1 podman[218538]: 2026-01-26 08:56:21.819755361 +0000 UTC m=+0.069389231 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 26 08:56:24 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:56:24.305 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:56:24 compute-1 nova_compute[183083]: 2026-01-26 08:56:24.305 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:56:24 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:56:24.307 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 08:56:25 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:56:25.311 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:56:25 compute-1 nova_compute[183083]: 2026-01-26 08:56:25.812 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:56:26 compute-1 nova_compute[183083]: 2026-01-26 08:56:26.186 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:56:30 compute-1 nova_compute[183083]: 2026-01-26 08:56:30.817 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:56:31 compute-1 nova_compute[183083]: 2026-01-26 08:56:31.224 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:56:32 compute-1 sshd-session[218562]: Accepted publickey for zuul from 38.102.83.66 port 58870 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 08:56:32 compute-1 systemd-logind[788]: New session 39 of user zuul.
Jan 26 08:56:32 compute-1 systemd[1]: Started Session 39 of User zuul.
Jan 26 08:56:32 compute-1 sshd-session[218562]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 08:56:32 compute-1 sshd-session[218565]: Connection closed by 38.102.83.66 port 58870
Jan 26 08:56:32 compute-1 sshd-session[218562]: pam_unix(sshd:session): session closed for user zuul
Jan 26 08:56:33 compute-1 systemd[1]: session-39.scope: Deactivated successfully.
Jan 26 08:56:33 compute-1 systemd-logind[788]: Session 39 logged out. Waiting for processes to exit.
Jan 26 08:56:33 compute-1 systemd-logind[788]: Removed session 39.
Jan 26 08:56:35 compute-1 nova_compute[183083]: 2026-01-26 08:56:35.819 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:56:36 compute-1 nova_compute[183083]: 2026-01-26 08:56:36.225 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:56:36 compute-1 podman[218589]: 2026-01-26 08:56:36.840245498 +0000 UTC m=+0.095107437 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 26 08:56:36 compute-1 podman[218590]: 2026-01-26 08:56:36.853277316 +0000 UTC m=+0.103372521 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, io.buildah.version=1.33.7, release=1755695350)
Jan 26 08:56:40 compute-1 nova_compute[183083]: 2026-01-26 08:56:40.822 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:56:41 compute-1 nova_compute[183083]: 2026-01-26 08:56:41.260 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:56:41 compute-1 sshd-session[218629]: Invalid user admin from 159.223.236.81 port 38808
Jan 26 08:56:41 compute-1 podman[218633]: 2026-01-26 08:56:41.46641369 +0000 UTC m=+0.078897270 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 08:56:41 compute-1 podman[218632]: 2026-01-26 08:56:41.481804565 +0000 UTC m=+0.088778800 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 26 08:56:41 compute-1 podman[218631]: 2026-01-26 08:56:41.526753664 +0000 UTC m=+0.137388582 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 26 08:56:41 compute-1 sshd-session[218629]: Connection closed by invalid user admin 159.223.236.81 port 38808 [preauth]
Jan 26 08:56:45 compute-1 nova_compute[183083]: 2026-01-26 08:56:45.862 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:56:46 compute-1 nova_compute[183083]: 2026-01-26 08:56:46.262 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:56:50 compute-1 nova_compute[183083]: 2026-01-26 08:56:50.865 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:56:51 compute-1 nova_compute[183083]: 2026-01-26 08:56:51.264 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:56:52 compute-1 podman[218697]: 2026-01-26 08:56:52.808641757 +0000 UTC m=+0.066410987 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 26 08:56:55 compute-1 nova_compute[183083]: 2026-01-26 08:56:55.869 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:56:56 compute-1 nova_compute[183083]: 2026-01-26 08:56:56.280 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:56:57 compute-1 nova_compute[183083]: 2026-01-26 08:56:57.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:56:57 compute-1 nova_compute[183083]: 2026-01-26 08:56:57.951 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 26 08:56:57 compute-1 nova_compute[183083]: 2026-01-26 08:56:57.968 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 26 08:57:00 compute-1 nova_compute[183083]: 2026-01-26 08:57:00.873 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:57:01 compute-1 nova_compute[183083]: 2026-01-26 08:57:01.325 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:57:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:57:05.308 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:57:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:57:05.309 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:57:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:57:05.309 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:57:05 compute-1 nova_compute[183083]: 2026-01-26 08:57:05.876 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:57:06 compute-1 nova_compute[183083]: 2026-01-26 08:57:06.327 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:57:07 compute-1 podman[218722]: 2026-01-26 08:57:07.840221085 +0000 UTC m=+0.092410942 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vendor=Red Hat, Inc., release=1755695350, vcs-type=git, config_id=openstack_network_exporter, container_name=openstack_network_exporter, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 08:57:07 compute-1 podman[218721]: 2026-01-26 08:57:07.844529727 +0000 UTC m=+0.101585451 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3)
Jan 26 08:57:07 compute-1 nova_compute[183083]: 2026-01-26 08:57:07.964 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:57:07 compute-1 nova_compute[183083]: 2026-01-26 08:57:07.982 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:57:07 compute-1 nova_compute[183083]: 2026-01-26 08:57:07.982 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 08:57:07 compute-1 nova_compute[183083]: 2026-01-26 08:57:07.982 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 08:57:08 compute-1 nova_compute[183083]: 2026-01-26 08:57:08.008 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 08:57:08 compute-1 nova_compute[183083]: 2026-01-26 08:57:08.008 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:57:08 compute-1 nova_compute[183083]: 2026-01-26 08:57:08.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:57:08 compute-1 nova_compute[183083]: 2026-01-26 08:57:08.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:57:09 compute-1 nova_compute[183083]: 2026-01-26 08:57:09.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:57:10 compute-1 nova_compute[183083]: 2026-01-26 08:57:10.902 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:57:10 compute-1 nova_compute[183083]: 2026-01-26 08:57:10.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:57:11 compute-1 nova_compute[183083]: 2026-01-26 08:57:11.330 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:57:11 compute-1 ovn_controller[95352]: 2026-01-26T08:57:11Z|00232|pinctrl|WARN|Dropped 43 log messages in last 62 seconds (most recently, 13 seconds ago) due to excessive rate
Jan 26 08:57:11 compute-1 ovn_controller[95352]: 2026-01-26T08:57:11Z|00233|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 08:57:11 compute-1 podman[218761]: 2026-01-26 08:57:11.804535273 +0000 UTC m=+0.065256424 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 08:57:11 compute-1 podman[218762]: 2026-01-26 08:57:11.811959273 +0000 UTC m=+0.066444858 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 08:57:11 compute-1 podman[218760]: 2026-01-26 08:57:11.830798625 +0000 UTC m=+0.095546470 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.build-date=20251202)
Jan 26 08:57:11 compute-1 nova_compute[183083]: 2026-01-26 08:57:11.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:57:11 compute-1 nova_compute[183083]: 2026-01-26 08:57:11.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:57:11 compute-1 nova_compute[183083]: 2026-01-26 08:57:11.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 26 08:57:12 compute-1 nova_compute[183083]: 2026-01-26 08:57:12.548 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:57:12 compute-1 nova_compute[183083]: 2026-01-26 08:57:12.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:57:12 compute-1 nova_compute[183083]: 2026-01-26 08:57:12.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 08:57:13 compute-1 nova_compute[183083]: 2026-01-26 08:57:13.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:57:13 compute-1 nova_compute[183083]: 2026-01-26 08:57:13.972 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:57:13 compute-1 nova_compute[183083]: 2026-01-26 08:57:13.973 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:57:13 compute-1 nova_compute[183083]: 2026-01-26 08:57:13.973 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:57:13 compute-1 nova_compute[183083]: 2026-01-26 08:57:13.973 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 08:57:14 compute-1 nova_compute[183083]: 2026-01-26 08:57:14.153 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:57:14 compute-1 nova_compute[183083]: 2026-01-26 08:57:14.156 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13770MB free_disk=113.09380340576172GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 08:57:14 compute-1 nova_compute[183083]: 2026-01-26 08:57:14.156 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:57:14 compute-1 nova_compute[183083]: 2026-01-26 08:57:14.156 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:57:14 compute-1 nova_compute[183083]: 2026-01-26 08:57:14.353 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 08:57:14 compute-1 nova_compute[183083]: 2026-01-26 08:57:14.354 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 08:57:14 compute-1 nova_compute[183083]: 2026-01-26 08:57:14.441 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:57:14 compute-1 nova_compute[183083]: 2026-01-26 08:57:14.459 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:57:14 compute-1 nova_compute[183083]: 2026-01-26 08:57:14.462 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 08:57:14 compute-1 nova_compute[183083]: 2026-01-26 08:57:14.462 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.306s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:57:15 compute-1 nova_compute[183083]: 2026-01-26 08:57:15.905 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:57:16 compute-1 nova_compute[183083]: 2026-01-26 08:57:16.385 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:57:19 compute-1 nova_compute[183083]: 2026-01-26 08:57:19.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:57:20 compute-1 sshd-session[218827]: Invalid user solv from 2.57.122.238 port 37630
Jan 26 08:57:20 compute-1 nova_compute[183083]: 2026-01-26 08:57:20.909 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:57:20 compute-1 sshd-session[218827]: Connection closed by invalid user solv 2.57.122.238 port 37630 [preauth]
Jan 26 08:57:21 compute-1 nova_compute[183083]: 2026-01-26 08:57:21.431 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:57:23 compute-1 podman[218829]: 2026-01-26 08:57:23.824020101 +0000 UTC m=+0.080757123 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 26 08:57:25 compute-1 nova_compute[183083]: 2026-01-26 08:57:25.912 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:57:26 compute-1 nova_compute[183083]: 2026-01-26 08:57:26.433 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:57:30 compute-1 nova_compute[183083]: 2026-01-26 08:57:30.915 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:57:31 compute-1 nova_compute[183083]: 2026-01-26 08:57:31.434 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:57:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:57:35.680 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:57:35 compute-1 nova_compute[183083]: 2026-01-26 08:57:35.680 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:57:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:57:35.682 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 08:57:35 compute-1 nova_compute[183083]: 2026-01-26 08:57:35.917 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:57:36 compute-1 nova_compute[183083]: 2026-01-26 08:57:36.437 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:57:38 compute-1 podman[218853]: 2026-01-26 08:57:38.799836454 +0000 UTC m=+0.063176962 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 26 08:57:38 compute-1 podman[218854]: 2026-01-26 08:57:38.822523507 +0000 UTC m=+0.072595339 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, config_id=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=)
Jan 26 08:57:40 compute-1 sshd-session[218891]: Invalid user admin from 159.223.236.81 port 36634
Jan 26 08:57:40 compute-1 sshd-session[218891]: Connection closed by invalid user admin 159.223.236.81 port 36634 [preauth]
Jan 26 08:57:40 compute-1 nova_compute[183083]: 2026-01-26 08:57:40.971 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:57:41 compute-1 nova_compute[183083]: 2026-01-26 08:57:41.439 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:57:42 compute-1 podman[218895]: 2026-01-26 08:57:42.823811144 +0000 UTC m=+0.077405354 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 08:57:42 compute-1 podman[218894]: 2026-01-26 08:57:42.838295355 +0000 UTC m=+0.097816023 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 26 08:57:42 compute-1 podman[218893]: 2026-01-26 08:57:42.864810166 +0000 UTC m=+0.119968260 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 08:57:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:57:44.685 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:57:45 compute-1 nova_compute[183083]: 2026-01-26 08:57:45.976 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:57:46 compute-1 nova_compute[183083]: 2026-01-26 08:57:46.440 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:57:50 compute-1 sshd-session[218956]: Accepted publickey for zuul from 38.102.83.66 port 46486 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 08:57:50 compute-1 systemd-logind[788]: New session 40 of user zuul.
Jan 26 08:57:50 compute-1 systemd[1]: Started Session 40 of User zuul.
Jan 26 08:57:50 compute-1 sshd-session[218956]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 08:57:50 compute-1 sshd-session[218959]: Connection closed by 38.102.83.66 port 46486
Jan 26 08:57:50 compute-1 sshd-session[218956]: pam_unix(sshd:session): session closed for user zuul
Jan 26 08:57:50 compute-1 systemd[1]: session-40.scope: Deactivated successfully.
Jan 26 08:57:50 compute-1 systemd-logind[788]: Session 40 logged out. Waiting for processes to exit.
Jan 26 08:57:50 compute-1 systemd-logind[788]: Removed session 40.
Jan 26 08:57:50 compute-1 nova_compute[183083]: 2026-01-26 08:57:50.980 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:57:51 compute-1 nova_compute[183083]: 2026-01-26 08:57:51.442 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:57:54 compute-1 podman[218983]: 2026-01-26 08:57:54.852523852 +0000 UTC m=+0.103718820 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 26 08:57:55 compute-1 nova_compute[183083]: 2026-01-26 08:57:55.983 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:57:56 compute-1 nova_compute[183083]: 2026-01-26 08:57:56.480 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:58:00 compute-1 nova_compute[183083]: 2026-01-26 08:58:00.988 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:58:01 compute-1 nova_compute[183083]: 2026-01-26 08:58:01.481 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:58:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:58:03.742 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:58:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:58:03.743 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:58:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:58:03.743 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:58:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:58:03.743 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:58:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:58:03.744 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:58:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:58:03.744 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:58:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:58:03.744 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:58:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:58:03.744 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:58:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:58:03.745 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:58:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:58:03.745 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:58:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:58:03.745 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:58:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:58:03.745 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:58:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:58:03.746 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:58:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:58:03.746 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:58:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:58:03.746 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:58:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:58:03.746 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:58:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:58:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:58:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:58:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:58:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:58:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:58:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:58:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:58:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:58:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:58:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:58:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:58:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:58:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:58:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:58:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:58:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 08:58:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 08:58:03 compute-1 ovn_controller[95352]: 2026-01-26T08:58:03Z|00234|pinctrl|WARN|Dropped 149 log messages in last 52 seconds (most recently, 1 seconds ago) due to excessive rate
Jan 26 08:58:03 compute-1 ovn_controller[95352]: 2026-01-26T08:58:03Z|00235|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 08:58:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:58:05.310 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:58:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:58:05.311 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:58:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:58:05.311 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:58:05 compute-1 nova_compute[183083]: 2026-01-26 08:58:05.991 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:58:06 compute-1 nova_compute[183083]: 2026-01-26 08:58:06.483 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:58:07 compute-1 nova_compute[183083]: 2026-01-26 08:58:07.840 183087 DEBUG oslo_concurrency.lockutils [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Acquiring lock "d266aeea-8f31-46fb-8006-4e97165d270b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:58:07 compute-1 nova_compute[183083]: 2026-01-26 08:58:07.841 183087 DEBUG oslo_concurrency.lockutils [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Lock "d266aeea-8f31-46fb-8006-4e97165d270b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:58:07 compute-1 nova_compute[183083]: 2026-01-26 08:58:07.859 183087 DEBUG nova.compute.manager [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:58:07 compute-1 nova_compute[183083]: 2026-01-26 08:58:07.977 183087 DEBUG oslo_concurrency.lockutils [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:58:07 compute-1 nova_compute[183083]: 2026-01-26 08:58:07.978 183087 DEBUG oslo_concurrency.lockutils [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:58:07 compute-1 nova_compute[183083]: 2026-01-26 08:58:07.984 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:58:07 compute-1 nova_compute[183083]: 2026-01-26 08:58:07.985 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 08:58:07 compute-1 nova_compute[183083]: 2026-01-26 08:58:07.985 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 08:58:07 compute-1 nova_compute[183083]: 2026-01-26 08:58:07.992 183087 DEBUG nova.virt.hardware [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:58:07 compute-1 nova_compute[183083]: 2026-01-26 08:58:07.992 183087 INFO nova.compute.claims [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:58:08 compute-1 nova_compute[183083]: 2026-01-26 08:58:08.033 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 08:58:08 compute-1 nova_compute[183083]: 2026-01-26 08:58:08.129 183087 DEBUG nova.compute.provider_tree [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:58:08 compute-1 nova_compute[183083]: 2026-01-26 08:58:08.146 183087 DEBUG nova.scheduler.client.report [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:58:08 compute-1 nova_compute[183083]: 2026-01-26 08:58:08.171 183087 DEBUG oslo_concurrency.lockutils [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.193s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:58:08 compute-1 nova_compute[183083]: 2026-01-26 08:58:08.172 183087 DEBUG nova.compute.manager [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:58:08 compute-1 nova_compute[183083]: 2026-01-26 08:58:08.222 183087 DEBUG nova.compute.manager [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:58:08 compute-1 nova_compute[183083]: 2026-01-26 08:58:08.222 183087 DEBUG nova.network.neutron [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:58:08 compute-1 nova_compute[183083]: 2026-01-26 08:58:08.244 183087 INFO nova.virt.libvirt.driver [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:58:08 compute-1 nova_compute[183083]: 2026-01-26 08:58:08.265 183087 DEBUG nova.compute.manager [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:58:08 compute-1 nova_compute[183083]: 2026-01-26 08:58:08.362 183087 DEBUG nova.compute.manager [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:58:08 compute-1 nova_compute[183083]: 2026-01-26 08:58:08.364 183087 DEBUG nova.virt.libvirt.driver [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:58:08 compute-1 nova_compute[183083]: 2026-01-26 08:58:08.365 183087 INFO nova.virt.libvirt.driver [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Creating image(s)
Jan 26 08:58:08 compute-1 nova_compute[183083]: 2026-01-26 08:58:08.366 183087 DEBUG oslo_concurrency.lockutils [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Acquiring lock "/var/lib/nova/instances/d266aeea-8f31-46fb-8006-4e97165d270b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:58:08 compute-1 nova_compute[183083]: 2026-01-26 08:58:08.366 183087 DEBUG oslo_concurrency.lockutils [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Lock "/var/lib/nova/instances/d266aeea-8f31-46fb-8006-4e97165d270b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:58:08 compute-1 nova_compute[183083]: 2026-01-26 08:58:08.367 183087 DEBUG oslo_concurrency.lockutils [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Lock "/var/lib/nova/instances/d266aeea-8f31-46fb-8006-4e97165d270b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:58:08 compute-1 nova_compute[183083]: 2026-01-26 08:58:08.368 183087 DEBUG oslo_concurrency.lockutils [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:58:08 compute-1 nova_compute[183083]: 2026-01-26 08:58:08.369 183087 DEBUG oslo_concurrency.lockutils [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:58:08 compute-1 nova_compute[183083]: 2026-01-26 08:58:08.607 183087 DEBUG nova.policy [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6a6aecbc792d4a2c89d40ccdc353a3a8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '957ef341e91c400e8ab292571f04481e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.410 183087 DEBUG oslo_concurrency.lockutils [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.041s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Instance failed to spawn: nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Traceback (most recent call last):
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 170, in do_image_deep_inspection
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b]     raise exception.ImageUnacceptable(
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image content does not match disk_format
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b] 
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b] During handling of the above exception, another exception occurred:
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b] 
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Traceback (most recent call last):
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b]     yield resources
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b]     self.driver.spawn(context, instance, image_meta,
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4384, in spawn
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b]     created_instance_dir, created_disks = self._create_image(
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4786, in _create_image
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b]     created_disks = self._create_and_inject_local_root(
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4917, in _create_and_inject_local_root
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b]     self._try_fetch_image_cache(backend, fetch_func, context,
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 10963, in _try_fetch_image_cache
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b]     image.cache(fetch_func=fetch_func,
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 288, in cache
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b]     self.create_image(fetch_func_sync, base, size,
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 663, in create_image
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b]     prepare_template(target=base, *args, **kwargs)
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b]   File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b]     return f(*args, **kwargs)
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 285, in fetch_func_sync
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b]     fetch_func(target=target, *args, **kwargs)
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/utils.py", line 482, in fetch_image
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b]     images.fetch_to_raw(context, image_id, target, trusted_certs)
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 199, in fetch_to_raw
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b]     force_format = do_image_deep_inspection(img, image_href, path_tmp)
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 180, in do_image_deep_inspection
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b]     raise exception.ImageUnacceptable(
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.411 183087 ERROR nova.compute.manager [instance: d266aeea-8f31-46fb-8006-4e97165d270b] 
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.615 183087 DEBUG nova.network.neutron [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Successfully created port: 6cb9afe8-7993-4016-b9e9-e55b779a6645 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 08:58:09 compute-1 podman[219007]: 2026-01-26 08:58:09.817127419 +0000 UTC m=+0.071361823 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 26 08:58:09 compute-1 podman[219008]: 2026-01-26 08:58:09.827862113 +0000 UTC m=+0.083133527 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, architecture=x86_64, release=1755695350, version=9.6, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:58:09 compute-1 nova_compute[183083]: 2026-01-26 08:58:09.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:58:10 compute-1 nova_compute[183083]: 2026-01-26 08:58:10.359 183087 DEBUG nova.network.neutron [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Successfully updated port: 6cb9afe8-7993-4016-b9e9-e55b779a6645 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:58:10 compute-1 nova_compute[183083]: 2026-01-26 08:58:10.376 183087 DEBUG oslo_concurrency.lockutils [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Acquiring lock "refresh_cache-d266aeea-8f31-46fb-8006-4e97165d270b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:58:10 compute-1 nova_compute[183083]: 2026-01-26 08:58:10.376 183087 DEBUG oslo_concurrency.lockutils [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Acquired lock "refresh_cache-d266aeea-8f31-46fb-8006-4e97165d270b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:58:10 compute-1 nova_compute[183083]: 2026-01-26 08:58:10.376 183087 DEBUG nova.network.neutron [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:58:10 compute-1 nova_compute[183083]: 2026-01-26 08:58:10.536 183087 DEBUG nova.network.neutron [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:58:10 compute-1 nova_compute[183083]: 2026-01-26 08:58:10.788 183087 DEBUG nova.compute.manager [req-96b7f0fb-1ac6-46e7-97bc-174933604db5 req-7f0e4782-670a-4a23-aadf-ae80c6f3b270 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Received event network-changed-6cb9afe8-7993-4016-b9e9-e55b779a6645 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:58:10 compute-1 nova_compute[183083]: 2026-01-26 08:58:10.789 183087 DEBUG nova.compute.manager [req-96b7f0fb-1ac6-46e7-97bc-174933604db5 req-7f0e4782-670a-4a23-aadf-ae80c6f3b270 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Refreshing instance network info cache due to event network-changed-6cb9afe8-7993-4016-b9e9-e55b779a6645. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:58:10 compute-1 nova_compute[183083]: 2026-01-26 08:58:10.789 183087 DEBUG oslo_concurrency.lockutils [req-96b7f0fb-1ac6-46e7-97bc-174933604db5 req-7f0e4782-670a-4a23-aadf-ae80c6f3b270 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-d266aeea-8f31-46fb-8006-4e97165d270b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:58:10 compute-1 nova_compute[183083]: 2026-01-26 08:58:10.947 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:58:10 compute-1 nova_compute[183083]: 2026-01-26 08:58:10.993 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.206 183087 DEBUG nova.network.neutron [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Updating instance_info_cache with network_info: [{"id": "6cb9afe8-7993-4016-b9e9-e55b779a6645", "address": "fa:16:3e:3a:d7:dd", "network": {"id": "f3a30777-6ba2-469d-81b9-dd6d28260557", "bridge": "br-int", "label": "tempest-test-network--1085282533", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.42", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "957ef341e91c400e8ab292571f04481e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cb9afe8-79", "ovs_interfaceid": "6cb9afe8-7993-4016-b9e9-e55b779a6645", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.225 183087 DEBUG oslo_concurrency.lockutils [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Releasing lock "refresh_cache-d266aeea-8f31-46fb-8006-4e97165d270b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.226 183087 DEBUG nova.compute.manager [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Instance network_info: |[{"id": "6cb9afe8-7993-4016-b9e9-e55b779a6645", "address": "fa:16:3e:3a:d7:dd", "network": {"id": "f3a30777-6ba2-469d-81b9-dd6d28260557", "bridge": "br-int", "label": "tempest-test-network--1085282533", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.42", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "957ef341e91c400e8ab292571f04481e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cb9afe8-79", "ovs_interfaceid": "6cb9afe8-7993-4016-b9e9-e55b779a6645", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.226 183087 DEBUG oslo_concurrency.lockutils [req-96b7f0fb-1ac6-46e7-97bc-174933604db5 req-7f0e4782-670a-4a23-aadf-ae80c6f3b270 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-d266aeea-8f31-46fb-8006-4e97165d270b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.227 183087 DEBUG nova.network.neutron [req-96b7f0fb-1ac6-46e7-97bc-174933604db5 req-7f0e4782-670a-4a23-aadf-ae80c6f3b270 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Refreshing network info cache for port 6cb9afe8-7993-4016-b9e9-e55b779a6645 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.229 183087 INFO nova.compute.manager [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Terminating instance
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.231 183087 DEBUG nova.compute.manager [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.238 183087 DEBUG nova.virt.libvirt.driver [-] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.238 183087 INFO nova.virt.libvirt.driver [-] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Instance destroyed successfully.
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.239 183087 DEBUG nova.virt.libvirt.vif [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:58:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-test_dvr_vip_failover_basic-1648893917',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-dvr-vip-failover-basic-1648893917',id=43,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG3K6KGS+oNACDlWNN6LMII/yuxOYP4K4EBbblXWxKPzjc0p4ZbLiDkJLLrOsnoY8oJuEn2whLl2uxichhPxygIq1QJPQ//tdgXH4RuLSfWQKS3itdjEj2Y+NjIBI5Kg4Q==',key_name='tempest-keypair-test-1585635935',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='957ef341e91c400e8ab292571f04481e',ramdisk_id='',reservation_id='r-pxxwcs4y',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-OvnDvrAdvancedTest-144790097',owner_user_name='tempest-OvnDvrAdvancedTest-144790097-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:58:08Z,user_data=None,user_id='6a6aecbc792d4a2c89d40ccdc353a3a8',uuid=d266aeea-8f31-46fb-8006-4e97165d270b,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6cb9afe8-7993-4016-b9e9-e55b779a6645", "address": "fa:16:3e:3a:d7:dd", "network": {"id": "f3a30777-6ba2-469d-81b9-dd6d28260557", "bridge": "br-int", "label": "tempest-test-network--1085282533", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.42", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "957ef341e91c400e8ab292571f04481e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cb9afe8-79", "ovs_interfaceid": "6cb9afe8-7993-4016-b9e9-e55b779a6645", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.240 183087 DEBUG nova.network.os_vif_util [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Converting VIF {"id": "6cb9afe8-7993-4016-b9e9-e55b779a6645", "address": "fa:16:3e:3a:d7:dd", "network": {"id": "f3a30777-6ba2-469d-81b9-dd6d28260557", "bridge": "br-int", "label": "tempest-test-network--1085282533", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.42", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "957ef341e91c400e8ab292571f04481e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cb9afe8-79", "ovs_interfaceid": "6cb9afe8-7993-4016-b9e9-e55b779a6645", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.241 183087 DEBUG nova.network.os_vif_util [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3a:d7:dd,bridge_name='br-int',has_traffic_filtering=True,id=6cb9afe8-7993-4016-b9e9-e55b779a6645,network=Network(f3a30777-6ba2-469d-81b9-dd6d28260557),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6cb9afe8-79') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.242 183087 DEBUG os_vif [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:d7:dd,bridge_name='br-int',has_traffic_filtering=True,id=6cb9afe8-7993-4016-b9e9-e55b779a6645,network=Network(f3a30777-6ba2-469d-81b9-dd6d28260557),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6cb9afe8-79') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.244 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.244 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6cb9afe8-79, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.245 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.253 183087 INFO os_vif [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:d7:dd,bridge_name='br-int',has_traffic_filtering=True,id=6cb9afe8-7993-4016-b9e9-e55b779a6645,network=Network(f3a30777-6ba2-469d-81b9-dd6d28260557),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6cb9afe8-79')
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.255 183087 INFO nova.virt.libvirt.driver [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Deleting instance files /var/lib/nova/instances/d266aeea-8f31-46fb-8006-4e97165d270b_del
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.255 183087 INFO nova.virt.libvirt.driver [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Deletion of /var/lib/nova/instances/d266aeea-8f31-46fb-8006-4e97165d270b_del complete
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.308 183087 INFO nova.compute.manager [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Took 0.08 seconds to destroy the instance on the hypervisor.
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.310 183087 DEBUG nova.compute.claims [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Aborting claim: <nova.compute.claims.Claim object at 0x7f6c9855a730> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.310 183087 DEBUG oslo_concurrency.lockutils [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.311 183087 DEBUG oslo_concurrency.lockutils [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.413 183087 DEBUG nova.compute.provider_tree [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.431 183087 DEBUG nova.scheduler.client.report [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.458 183087 DEBUG oslo_concurrency.lockutils [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.459 183087 DEBUG nova.compute.utils [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.461 183087 ERROR nova.compute.manager [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Build of instance d266aeea-8f31-46fb-8006-4e97165d270b aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format: nova.exception.BuildAbortException: Build of instance d266aeea-8f31-46fb-8006-4e97165d270b aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.462 183087 DEBUG nova.compute.manager [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.463 183087 DEBUG nova.virt.libvirt.vif [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-26T08:58:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-test_dvr_vip_failover_basic-1648893917',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host=None,hostname='tempest-test-dvr-vip-failover-basic-1648893917',id=43,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG3K6KGS+oNACDlWNN6LMII/yuxOYP4K4EBbblXWxKPzjc0p4ZbLiDkJLLrOsnoY8oJuEn2whLl2uxichhPxygIq1QJPQ//tdgXH4RuLSfWQKS3itdjEj2Y+NjIBI5Kg4Q==',key_name='tempest-keypair-test-1585635935',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node=None,numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='957ef341e91c400e8ab292571f04481e',ramdisk_id='',reservation_id='r-pxxwcs4y',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-OvnDvrAdvancedTest-144790097',owner_user_name='tempest-OvnDvrAdvancedTest-144790097-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:58:11Z,user_data=None,user_id='6a6aecbc792d4a2c89d40ccdc353a3a8',uuid=d266aeea-8f31-46fb-8006-4e97165d270b,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6cb9afe8-7993-4016-b9e9-e55b779a6645", "address": "fa:16:3e:3a:d7:dd", "network": {"id": "f3a30777-6ba2-469d-81b9-dd6d28260557", "bridge": "br-int", "label": "tempest-test-network--1085282533", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.42", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "957ef341e91c400e8ab292571f04481e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cb9afe8-79", "ovs_interfaceid": "6cb9afe8-7993-4016-b9e9-e55b779a6645", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.464 183087 DEBUG nova.network.os_vif_util [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Converting VIF {"id": "6cb9afe8-7993-4016-b9e9-e55b779a6645", "address": "fa:16:3e:3a:d7:dd", "network": {"id": "f3a30777-6ba2-469d-81b9-dd6d28260557", "bridge": "br-int", "label": "tempest-test-network--1085282533", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.42", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "957ef341e91c400e8ab292571f04481e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cb9afe8-79", "ovs_interfaceid": "6cb9afe8-7993-4016-b9e9-e55b779a6645", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.466 183087 DEBUG nova.network.os_vif_util [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3a:d7:dd,bridge_name='br-int',has_traffic_filtering=True,id=6cb9afe8-7993-4016-b9e9-e55b779a6645,network=Network(f3a30777-6ba2-469d-81b9-dd6d28260557),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6cb9afe8-79') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.466 183087 DEBUG os_vif [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:d7:dd,bridge_name='br-int',has_traffic_filtering=True,id=6cb9afe8-7993-4016-b9e9-e55b779a6645,network=Network(f3a30777-6ba2-469d-81b9-dd6d28260557),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6cb9afe8-79') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.469 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.469 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6cb9afe8-79, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.470 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.472 183087 INFO os_vif [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:d7:dd,bridge_name='br-int',has_traffic_filtering=True,id=6cb9afe8-7993-4016-b9e9-e55b779a6645,network=Network(f3a30777-6ba2-469d-81b9-dd6d28260557),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6cb9afe8-79')
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.473 183087 DEBUG nova.compute.manager [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.474 183087 DEBUG nova.compute.manager [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.474 183087 DEBUG nova.network.neutron [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.485 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:58:11 compute-1 nova_compute[183083]: 2026-01-26 08:58:11.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:58:12 compute-1 nova_compute[183083]: 2026-01-26 08:58:12.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:58:12 compute-1 nova_compute[183083]: 2026-01-26 08:58:12.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:58:13 compute-1 podman[219048]: 2026-01-26 08:58:13.840636436 +0000 UTC m=+0.090133025 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 26 08:58:13 compute-1 podman[219049]: 2026-01-26 08:58:13.842472698 +0000 UTC m=+0.089690352 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 08:58:13 compute-1 podman[219047]: 2026-01-26 08:58:13.873010774 +0000 UTC m=+0.129577233 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 26 08:58:13 compute-1 nova_compute[183083]: 2026-01-26 08:58:13.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:58:13 compute-1 nova_compute[183083]: 2026-01-26 08:58:13.951 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 08:58:14 compute-1 nova_compute[183083]: 2026-01-26 08:58:14.040 183087 DEBUG nova.network.neutron [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:58:14 compute-1 nova_compute[183083]: 2026-01-26 08:58:14.063 183087 INFO nova.compute.manager [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Took 2.59 seconds to deallocate network for instance.
Jan 26 08:58:14 compute-1 nova_compute[183083]: 2026-01-26 08:58:14.182 183087 DEBUG nova.network.neutron [req-96b7f0fb-1ac6-46e7-97bc-174933604db5 req-7f0e4782-670a-4a23-aadf-ae80c6f3b270 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Updated VIF entry in instance network info cache for port 6cb9afe8-7993-4016-b9e9-e55b779a6645. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:58:14 compute-1 nova_compute[183083]: 2026-01-26 08:58:14.183 183087 DEBUG nova.network.neutron [req-96b7f0fb-1ac6-46e7-97bc-174933604db5 req-7f0e4782-670a-4a23-aadf-ae80c6f3b270 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d266aeea-8f31-46fb-8006-4e97165d270b] Updating instance_info_cache with network_info: [{"id": "6cb9afe8-7993-4016-b9e9-e55b779a6645", "address": "fa:16:3e:3a:d7:dd", "network": {"id": "f3a30777-6ba2-469d-81b9-dd6d28260557", "bridge": "br-int", "label": "tempest-test-network--1085282533", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.42", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "957ef341e91c400e8ab292571f04481e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cb9afe8-79", "ovs_interfaceid": "6cb9afe8-7993-4016-b9e9-e55b779a6645", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:58:14 compute-1 nova_compute[183083]: 2026-01-26 08:58:14.203 183087 DEBUG oslo_concurrency.lockutils [req-96b7f0fb-1ac6-46e7-97bc-174933604db5 req-7f0e4782-670a-4a23-aadf-ae80c6f3b270 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-d266aeea-8f31-46fb-8006-4e97165d270b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:58:14 compute-1 nova_compute[183083]: 2026-01-26 08:58:14.212 183087 INFO nova.scheduler.client.report [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Deleted allocations for instance d266aeea-8f31-46fb-8006-4e97165d270b
Jan 26 08:58:14 compute-1 nova_compute[183083]: 2026-01-26 08:58:14.213 183087 DEBUG oslo_concurrency.lockutils [None req-1ff17a55-d276-48c0-83eb-90ec90a9a959 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Lock "d266aeea-8f31-46fb-8006-4e97165d270b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.372s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:58:15 compute-1 nova_compute[183083]: 2026-01-26 08:58:15.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:58:15 compute-1 nova_compute[183083]: 2026-01-26 08:58:15.974 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:58:15 compute-1 nova_compute[183083]: 2026-01-26 08:58:15.975 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:58:15 compute-1 nova_compute[183083]: 2026-01-26 08:58:15.975 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:58:15 compute-1 nova_compute[183083]: 2026-01-26 08:58:15.975 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 08:58:16 compute-1 nova_compute[183083]: 2026-01-26 08:58:16.053 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:58:16 compute-1 nova_compute[183083]: 2026-01-26 08:58:16.233 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:58:16 compute-1 nova_compute[183083]: 2026-01-26 08:58:16.234 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13786MB free_disk=113.09381866455078GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 08:58:16 compute-1 nova_compute[183083]: 2026-01-26 08:58:16.234 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:58:16 compute-1 nova_compute[183083]: 2026-01-26 08:58:16.235 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:58:16 compute-1 nova_compute[183083]: 2026-01-26 08:58:16.319 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 08:58:16 compute-1 nova_compute[183083]: 2026-01-26 08:58:16.319 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 08:58:16 compute-1 nova_compute[183083]: 2026-01-26 08:58:16.340 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:58:16 compute-1 nova_compute[183083]: 2026-01-26 08:58:16.363 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:58:16 compute-1 nova_compute[183083]: 2026-01-26 08:58:16.484 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 08:58:16 compute-1 nova_compute[183083]: 2026-01-26 08:58:16.485 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.250s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:58:16 compute-1 nova_compute[183083]: 2026-01-26 08:58:16.487 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:58:19 compute-1 nova_compute[183083]: 2026-01-26 08:58:19.796 183087 DEBUG oslo_concurrency.lockutils [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Acquiring lock "736d6d45-59ee-4524-929c-b8cd1d2195eb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:58:19 compute-1 nova_compute[183083]: 2026-01-26 08:58:19.796 183087 DEBUG oslo_concurrency.lockutils [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Lock "736d6d45-59ee-4524-929c-b8cd1d2195eb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:58:19 compute-1 nova_compute[183083]: 2026-01-26 08:58:19.827 183087 DEBUG nova.compute.manager [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:58:19 compute-1 nova_compute[183083]: 2026-01-26 08:58:19.884 183087 DEBUG oslo_concurrency.lockutils [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:58:19 compute-1 nova_compute[183083]: 2026-01-26 08:58:19.884 183087 DEBUG oslo_concurrency.lockutils [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:58:19 compute-1 nova_compute[183083]: 2026-01-26 08:58:19.889 183087 DEBUG nova.virt.hardware [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:58:19 compute-1 nova_compute[183083]: 2026-01-26 08:58:19.889 183087 INFO nova.compute.claims [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:58:19 compute-1 nova_compute[183083]: 2026-01-26 08:58:19.987 183087 DEBUG nova.compute.provider_tree [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.002 183087 DEBUG nova.scheduler.client.report [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.022 183087 DEBUG oslo_concurrency.lockutils [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.022 183087 DEBUG nova.compute.manager [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.061 183087 DEBUG nova.compute.manager [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.062 183087 DEBUG nova.network.neutron [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.078 183087 INFO nova.virt.libvirt.driver [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.099 183087 DEBUG nova.compute.manager [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.181 183087 DEBUG nova.compute.manager [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.182 183087 DEBUG nova.virt.libvirt.driver [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.183 183087 INFO nova.virt.libvirt.driver [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Creating image(s)
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.183 183087 DEBUG oslo_concurrency.lockutils [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Acquiring lock "/var/lib/nova/instances/736d6d45-59ee-4524-929c-b8cd1d2195eb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.184 183087 DEBUG oslo_concurrency.lockutils [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Lock "/var/lib/nova/instances/736d6d45-59ee-4524-929c-b8cd1d2195eb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.184 183087 DEBUG oslo_concurrency.lockutils [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Lock "/var/lib/nova/instances/736d6d45-59ee-4524-929c-b8cd1d2195eb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.185 183087 DEBUG oslo_concurrency.lockutils [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.185 183087 DEBUG oslo_concurrency.lockutils [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.573 183087 DEBUG nova.policy [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6a6aecbc792d4a2c89d40ccdc353a3a8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '957ef341e91c400e8ab292571f04481e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 DEBUG oslo_concurrency.lockutils [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.783s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Instance failed to spawn: nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Traceback (most recent call last):
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 170, in do_image_deep_inspection
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb]     raise exception.ImageUnacceptable(
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image content does not match disk_format
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] 
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] During handling of the above exception, another exception occurred:
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] 
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Traceback (most recent call last):
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb]     yield resources
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb]     self.driver.spawn(context, instance, image_meta,
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4384, in spawn
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb]     created_instance_dir, created_disks = self._create_image(
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4786, in _create_image
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb]     created_disks = self._create_and_inject_local_root(
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4917, in _create_and_inject_local_root
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb]     self._try_fetch_image_cache(backend, fetch_func, context,
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 10963, in _try_fetch_image_cache
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb]     image.cache(fetch_func=fetch_func,
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 288, in cache
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb]     self.create_image(fetch_func_sync, base, size,
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 663, in create_image
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb]     prepare_template(target=base, *args, **kwargs)
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb]   File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb]     return f(*args, **kwargs)
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 285, in fetch_func_sync
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb]     fetch_func(target=target, *args, **kwargs)
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/utils.py", line 482, in fetch_image
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb]     images.fetch_to_raw(context, image_id, target, trusted_certs)
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 199, in fetch_to_raw
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb]     force_format = do_image_deep_inspection(img, image_href, path_tmp)
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 180, in do_image_deep_inspection
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb]     raise exception.ImageUnacceptable(
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:58:20 compute-1 nova_compute[183083]: 2026-01-26 08:58:20.969 183087 ERROR nova.compute.manager [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] 
Jan 26 08:58:21 compute-1 nova_compute[183083]: 2026-01-26 08:58:21.055 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:58:21 compute-1 nova_compute[183083]: 2026-01-26 08:58:21.490 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:58:21 compute-1 nova_compute[183083]: 2026-01-26 08:58:21.612 183087 DEBUG nova.network.neutron [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Successfully created port: db3aee07-dc82-48ec-880a-55bbfeb6371d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 08:58:22 compute-1 nova_compute[183083]: 2026-01-26 08:58:22.317 183087 DEBUG nova.network.neutron [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Successfully updated port: db3aee07-dc82-48ec-880a-55bbfeb6371d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:58:22 compute-1 nova_compute[183083]: 2026-01-26 08:58:22.335 183087 DEBUG oslo_concurrency.lockutils [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Acquiring lock "refresh_cache-736d6d45-59ee-4524-929c-b8cd1d2195eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:58:22 compute-1 nova_compute[183083]: 2026-01-26 08:58:22.336 183087 DEBUG oslo_concurrency.lockutils [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Acquired lock "refresh_cache-736d6d45-59ee-4524-929c-b8cd1d2195eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:58:22 compute-1 nova_compute[183083]: 2026-01-26 08:58:22.336 183087 DEBUG nova.network.neutron [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:58:22 compute-1 nova_compute[183083]: 2026-01-26 08:58:22.396 183087 DEBUG nova.compute.manager [req-3f640a5b-d9bf-4d22-8da9-e5b8e09705de req-1140c23e-52b9-45db-8f9f-f508a9839616 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Received event network-changed-db3aee07-dc82-48ec-880a-55bbfeb6371d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:58:22 compute-1 nova_compute[183083]: 2026-01-26 08:58:22.397 183087 DEBUG nova.compute.manager [req-3f640a5b-d9bf-4d22-8da9-e5b8e09705de req-1140c23e-52b9-45db-8f9f-f508a9839616 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Refreshing instance network info cache due to event network-changed-db3aee07-dc82-48ec-880a-55bbfeb6371d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:58:22 compute-1 nova_compute[183083]: 2026-01-26 08:58:22.397 183087 DEBUG oslo_concurrency.lockutils [req-3f640a5b-d9bf-4d22-8da9-e5b8e09705de req-1140c23e-52b9-45db-8f9f-f508a9839616 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-736d6d45-59ee-4524-929c-b8cd1d2195eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:58:22 compute-1 nova_compute[183083]: 2026-01-26 08:58:22.489 183087 DEBUG nova.network.neutron [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.189 183087 DEBUG nova.network.neutron [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Updating instance_info_cache with network_info: [{"id": "db3aee07-dc82-48ec-880a-55bbfeb6371d", "address": "fa:16:3e:f8:26:39", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.215", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb3aee07-dc", "ovs_interfaceid": "db3aee07-dc82-48ec-880a-55bbfeb6371d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.210 183087 DEBUG oslo_concurrency.lockutils [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Releasing lock "refresh_cache-736d6d45-59ee-4524-929c-b8cd1d2195eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.211 183087 DEBUG nova.compute.manager [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Instance network_info: |[{"id": "db3aee07-dc82-48ec-880a-55bbfeb6371d", "address": "fa:16:3e:f8:26:39", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.215", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb3aee07-dc", "ovs_interfaceid": "db3aee07-dc82-48ec-880a-55bbfeb6371d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.211 183087 DEBUG oslo_concurrency.lockutils [req-3f640a5b-d9bf-4d22-8da9-e5b8e09705de req-1140c23e-52b9-45db-8f9f-f508a9839616 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-736d6d45-59ee-4524-929c-b8cd1d2195eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.212 183087 DEBUG nova.network.neutron [req-3f640a5b-d9bf-4d22-8da9-e5b8e09705de req-1140c23e-52b9-45db-8f9f-f508a9839616 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Refreshing network info cache for port db3aee07-dc82-48ec-880a-55bbfeb6371d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.213 183087 INFO nova.compute.manager [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Terminating instance
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.214 183087 DEBUG nova.compute.manager [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.218 183087 DEBUG nova.virt.libvirt.driver [-] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.219 183087 INFO nova.virt.libvirt.driver [-] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Instance destroyed successfully.
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.219 183087 DEBUG nova.virt.libvirt.vif [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:58:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-test_dvr_vip_failover_external_network-1476139038',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-dvr-vip-failover-external-network-1476139038',id=44,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG3K6KGS+oNACDlWNN6LMII/yuxOYP4K4EBbblXWxKPzjc0p4ZbLiDkJLLrOsnoY8oJuEn2whLl2uxichhPxygIq1QJPQ//tdgXH4RuLSfWQKS3itdjEj2Y+NjIBI5Kg4Q==',key_name='tempest-keypair-test-1585635935',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='957ef341e91c400e8ab292571f04481e',ramdisk_id='',reservation_id='r-ayz77tvg',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-OvnDvrAdvancedTest-144790097',owner_user_name='tempest-OvnDvrAdvancedTest-144790097-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:58:20Z,user_data=None,user_id='6a6aecbc792d4a2c89d40ccdc353a3a8',uuid=736d6d45-59ee-4524-929c-b8cd1d2195eb,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "db3aee07-dc82-48ec-880a-55bbfeb6371d", "address": "fa:16:3e:f8:26:39", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.215", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb3aee07-dc", "ovs_interfaceid": "db3aee07-dc82-48ec-880a-55bbfeb6371d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.220 183087 DEBUG nova.network.os_vif_util [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Converting VIF {"id": "db3aee07-dc82-48ec-880a-55bbfeb6371d", "address": "fa:16:3e:f8:26:39", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.215", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb3aee07-dc", "ovs_interfaceid": "db3aee07-dc82-48ec-880a-55bbfeb6371d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.221 183087 DEBUG nova.network.os_vif_util [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f8:26:39,bridge_name='br-int',has_traffic_filtering=True,id=db3aee07-dc82-48ec-880a-55bbfeb6371d,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb3aee07-dc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.221 183087 DEBUG os_vif [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:26:39,bridge_name='br-int',has_traffic_filtering=True,id=db3aee07-dc82-48ec-880a-55bbfeb6371d,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb3aee07-dc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.223 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.223 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdb3aee07-dc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.224 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.227 183087 INFO os_vif [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:26:39,bridge_name='br-int',has_traffic_filtering=True,id=db3aee07-dc82-48ec-880a-55bbfeb6371d,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb3aee07-dc')
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.228 183087 INFO nova.virt.libvirt.driver [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Deleting instance files /var/lib/nova/instances/736d6d45-59ee-4524-929c-b8cd1d2195eb_del
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.229 183087 INFO nova.virt.libvirt.driver [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Deletion of /var/lib/nova/instances/736d6d45-59ee-4524-929c-b8cd1d2195eb_del complete
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.289 183087 INFO nova.compute.manager [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Took 0.07 seconds to destroy the instance on the hypervisor.
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.290 183087 DEBUG nova.compute.claims [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Aborting claim: <nova.compute.claims.Claim object at 0x7f6cc8ec2eb0> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.290 183087 DEBUG oslo_concurrency.lockutils [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.291 183087 DEBUG oslo_concurrency.lockutils [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.416 183087 DEBUG nova.compute.provider_tree [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.430 183087 DEBUG nova.scheduler.client.report [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.447 183087 DEBUG oslo_concurrency.lockutils [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.448 183087 DEBUG nova.compute.utils [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.449 183087 ERROR nova.compute.manager [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Build of instance 736d6d45-59ee-4524-929c-b8cd1d2195eb aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format: nova.exception.BuildAbortException: Build of instance 736d6d45-59ee-4524-929c-b8cd1d2195eb aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.449 183087 DEBUG nova.compute.manager [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.450 183087 DEBUG nova.virt.libvirt.vif [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-26T08:58:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-test_dvr_vip_failover_external_network-1476139038',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host=None,hostname='tempest-test-dvr-vip-failover-external-network-1476139038',id=44,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG3K6KGS+oNACDlWNN6LMII/yuxOYP4K4EBbblXWxKPzjc0p4ZbLiDkJLLrOsnoY8oJuEn2whLl2uxichhPxygIq1QJPQ//tdgXH4RuLSfWQKS3itdjEj2Y+NjIBI5Kg4Q==',key_name='tempest-keypair-test-1585635935',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node=None,numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='957ef341e91c400e8ab292571f04481e',ramdisk_id='',reservation_id='r-ayz77tvg',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_fail
ed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-OvnDvrAdvancedTest-144790097',owner_user_name='tempest-OvnDvrAdvancedTest-144790097-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:58:23Z,user_data=None,user_id='6a6aecbc792d4a2c89d40ccdc353a3a8',uuid=736d6d45-59ee-4524-929c-b8cd1d2195eb,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "db3aee07-dc82-48ec-880a-55bbfeb6371d", "address": "fa:16:3e:f8:26:39", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.215", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb3aee07-dc", "ovs_interfaceid": "db3aee07-dc82-48ec-880a-55bbfeb6371d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.450 183087 DEBUG nova.network.os_vif_util [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Converting VIF {"id": "db3aee07-dc82-48ec-880a-55bbfeb6371d", "address": "fa:16:3e:f8:26:39", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.215", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb3aee07-dc", "ovs_interfaceid": "db3aee07-dc82-48ec-880a-55bbfeb6371d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.451 183087 DEBUG nova.network.os_vif_util [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f8:26:39,bridge_name='br-int',has_traffic_filtering=True,id=db3aee07-dc82-48ec-880a-55bbfeb6371d,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb3aee07-dc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.451 183087 DEBUG os_vif [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:26:39,bridge_name='br-int',has_traffic_filtering=True,id=db3aee07-dc82-48ec-880a-55bbfeb6371d,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb3aee07-dc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.453 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.453 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdb3aee07-dc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.453 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.457 183087 INFO os_vif [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:26:39,bridge_name='br-int',has_traffic_filtering=True,id=db3aee07-dc82-48ec-880a-55bbfeb6371d,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb3aee07-dc')
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.457 183087 DEBUG nova.compute.manager [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.457 183087 DEBUG nova.compute.manager [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 08:58:23 compute-1 nova_compute[183083]: 2026-01-26 08:58:23.458 183087 DEBUG nova.network.neutron [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 08:58:24 compute-1 nova_compute[183083]: 2026-01-26 08:58:24.333 183087 DEBUG nova.network.neutron [req-3f640a5b-d9bf-4d22-8da9-e5b8e09705de req-1140c23e-52b9-45db-8f9f-f508a9839616 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Updated VIF entry in instance network info cache for port db3aee07-dc82-48ec-880a-55bbfeb6371d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:58:24 compute-1 nova_compute[183083]: 2026-01-26 08:58:24.334 183087 DEBUG nova.network.neutron [req-3f640a5b-d9bf-4d22-8da9-e5b8e09705de req-1140c23e-52b9-45db-8f9f-f508a9839616 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Updating instance_info_cache with network_info: [{"id": "db3aee07-dc82-48ec-880a-55bbfeb6371d", "address": "fa:16:3e:f8:26:39", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.215", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb3aee07-dc", "ovs_interfaceid": "db3aee07-dc82-48ec-880a-55bbfeb6371d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:58:24 compute-1 nova_compute[183083]: 2026-01-26 08:58:24.339 183087 DEBUG nova.network.neutron [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:58:24 compute-1 nova_compute[183083]: 2026-01-26 08:58:24.421 183087 DEBUG oslo_concurrency.lockutils [req-3f640a5b-d9bf-4d22-8da9-e5b8e09705de req-1140c23e-52b9-45db-8f9f-f508a9839616 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-736d6d45-59ee-4524-929c-b8cd1d2195eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:58:24 compute-1 nova_compute[183083]: 2026-01-26 08:58:24.470 183087 INFO nova.compute.manager [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] [instance: 736d6d45-59ee-4524-929c-b8cd1d2195eb] Took 1.01 seconds to deallocate network for instance.
Jan 26 08:58:24 compute-1 nova_compute[183083]: 2026-01-26 08:58:24.690 183087 INFO nova.scheduler.client.report [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Deleted allocations for instance 736d6d45-59ee-4524-929c-b8cd1d2195eb
Jan 26 08:58:24 compute-1 nova_compute[183083]: 2026-01-26 08:58:24.690 183087 DEBUG oslo_concurrency.lockutils [None req-0e274de4-adef-44ab-84d2-92897d010983 6a6aecbc792d4a2c89d40ccdc353a3a8 957ef341e91c400e8ab292571f04481e - - default default] Lock "736d6d45-59ee-4524-929c-b8cd1d2195eb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.894s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:58:25 compute-1 podman[219115]: 2026-01-26 08:58:25.806352767 +0000 UTC m=+0.064345185 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 26 08:58:26 compute-1 nova_compute[183083]: 2026-01-26 08:58:26.102 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:58:26 compute-1 nova_compute[183083]: 2026-01-26 08:58:26.492 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:58:31 compute-1 nova_compute[183083]: 2026-01-26 08:58:31.105 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:58:31 compute-1 nova_compute[183083]: 2026-01-26 08:58:31.492 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:58:36 compute-1 nova_compute[183083]: 2026-01-26 08:58:36.107 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:58:36 compute-1 nova_compute[183083]: 2026-01-26 08:58:36.494 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:58:36 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:58:36.703 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:58:36 compute-1 nova_compute[183083]: 2026-01-26 08:58:36.704 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:58:36 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:58:36.705 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 08:58:38 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:58:38.707 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:58:40 compute-1 podman[219140]: 2026-01-26 08:58:40.791942058 +0000 UTC m=+0.061138913 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 08:58:40 compute-1 podman[219139]: 2026-01-26 08:58:40.791899467 +0000 UTC m=+0.061952137 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 26 08:58:41 compute-1 nova_compute[183083]: 2026-01-26 08:58:41.110 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:58:41 compute-1 nova_compute[183083]: 2026-01-26 08:58:41.496 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:58:44 compute-1 podman[219182]: 2026-01-26 08:58:44.799992697 +0000 UTC m=+0.053929179 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 26 08:58:44 compute-1 podman[219181]: 2026-01-26 08:58:44.829211255 +0000 UTC m=+0.084212537 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 26 08:58:44 compute-1 podman[219183]: 2026-01-26 08:58:44.835189654 +0000 UTC m=+0.081144190 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 08:58:46 compute-1 nova_compute[183083]: 2026-01-26 08:58:46.112 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:58:46 compute-1 nova_compute[183083]: 2026-01-26 08:58:46.612 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:58:51 compute-1 nova_compute[183083]: 2026-01-26 08:58:51.115 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:58:51 compute-1 ovn_controller[95352]: 2026-01-26T08:58:51Z|00236|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Jan 26 08:58:51 compute-1 nova_compute[183083]: 2026-01-26 08:58:51.613 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:58:52 compute-1 sshd-session[219247]: Accepted publickey for zuul from 38.102.83.66 port 43332 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 08:58:52 compute-1 systemd-logind[788]: New session 41 of user zuul.
Jan 26 08:58:52 compute-1 systemd[1]: Started Session 41 of User zuul.
Jan 26 08:58:52 compute-1 sshd-session[219247]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 08:58:52 compute-1 sshd-session[219250]: Connection closed by 38.102.83.66 port 43332
Jan 26 08:58:52 compute-1 sshd-session[219247]: pam_unix(sshd:session): session closed for user zuul
Jan 26 08:58:52 compute-1 systemd[1]: session-41.scope: Deactivated successfully.
Jan 26 08:58:52 compute-1 systemd-logind[788]: Session 41 logged out. Waiting for processes to exit.
Jan 26 08:58:52 compute-1 systemd-logind[788]: Removed session 41.
Jan 26 08:58:56 compute-1 nova_compute[183083]: 2026-01-26 08:58:56.119 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:58:56 compute-1 nova_compute[183083]: 2026-01-26 08:58:56.615 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:58:56 compute-1 podman[219275]: 2026-01-26 08:58:56.794091212 +0000 UTC m=+0.059174558 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 26 08:59:01 compute-1 nova_compute[183083]: 2026-01-26 08:59:01.122 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:01 compute-1 nova_compute[183083]: 2026-01-26 08:59:01.731 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:04 compute-1 ovn_controller[95352]: 2026-01-26T08:59:04Z|00237|pinctrl|WARN|Dropped 237 log messages in last 61 seconds (most recently, 2 seconds ago) due to excessive rate
Jan 26 08:59:04 compute-1 ovn_controller[95352]: 2026-01-26T08:59:04Z|00238|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 08:59:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:59:05.312 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:59:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:59:05.312 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:59:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:59:05.312 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:59:06 compute-1 nova_compute[183083]: 2026-01-26 08:59:06.125 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:06 compute-1 nova_compute[183083]: 2026-01-26 08:59:06.784 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:08 compute-1 nova_compute[183083]: 2026-01-26 08:59:08.480 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:59:08 compute-1 nova_compute[183083]: 2026-01-26 08:59:08.581 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:59:08 compute-1 nova_compute[183083]: 2026-01-26 08:59:08.581 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 08:59:08 compute-1 nova_compute[183083]: 2026-01-26 08:59:08.581 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 08:59:08 compute-1 nova_compute[183083]: 2026-01-26 08:59:08.651 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 08:59:10 compute-1 nova_compute[183083]: 2026-01-26 08:59:10.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:59:11 compute-1 nova_compute[183083]: 2026-01-26 08:59:11.129 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:11 compute-1 nova_compute[183083]: 2026-01-26 08:59:11.840 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:11 compute-1 podman[219302]: 2026-01-26 08:59:11.867951574 +0000 UTC m=+0.119958231 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, distribution-scope=public, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-type=git, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.openshift.expose-services=, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible)
Jan 26 08:59:11 compute-1 podman[219301]: 2026-01-26 08:59:11.872280296 +0000 UTC m=+0.129879741 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute)
Jan 26 08:59:11 compute-1 nova_compute[183083]: 2026-01-26 08:59:11.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:59:11 compute-1 nova_compute[183083]: 2026-01-26 08:59:11.953 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:59:12 compute-1 nova_compute[183083]: 2026-01-26 08:59:12.947 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:59:12 compute-1 nova_compute[183083]: 2026-01-26 08:59:12.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:59:13 compute-1 nova_compute[183083]: 2026-01-26 08:59:13.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:59:13 compute-1 nova_compute[183083]: 2026-01-26 08:59:13.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:59:13 compute-1 nova_compute[183083]: 2026-01-26 08:59:13.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 08:59:15 compute-1 podman[219345]: 2026-01-26 08:59:15.852171238 +0000 UTC m=+0.098295917 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 08:59:15 compute-1 podman[219344]: 2026-01-26 08:59:15.869369985 +0000 UTC m=+0.112686714 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 26 08:59:15 compute-1 podman[219343]: 2026-01-26 08:59:15.904977944 +0000 UTC m=+0.155685772 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 26 08:59:15 compute-1 nova_compute[183083]: 2026-01-26 08:59:15.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 08:59:16 compute-1 nova_compute[183083]: 2026-01-26 08:59:16.010 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:59:16 compute-1 nova_compute[183083]: 2026-01-26 08:59:16.010 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:59:16 compute-1 nova_compute[183083]: 2026-01-26 08:59:16.011 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:59:16 compute-1 nova_compute[183083]: 2026-01-26 08:59:16.011 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 08:59:16 compute-1 nova_compute[183083]: 2026-01-26 08:59:16.130 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:16 compute-1 nova_compute[183083]: 2026-01-26 08:59:16.281 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:59:16 compute-1 nova_compute[183083]: 2026-01-26 08:59:16.282 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13773MB free_disk=113.09379959106445GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 08:59:16 compute-1 nova_compute[183083]: 2026-01-26 08:59:16.282 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:59:16 compute-1 nova_compute[183083]: 2026-01-26 08:59:16.282 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:59:16 compute-1 nova_compute[183083]: 2026-01-26 08:59:16.338 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 08:59:16 compute-1 nova_compute[183083]: 2026-01-26 08:59:16.338 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 08:59:16 compute-1 nova_compute[183083]: 2026-01-26 08:59:16.355 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:59:16 compute-1 nova_compute[183083]: 2026-01-26 08:59:16.366 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:59:16 compute-1 nova_compute[183083]: 2026-01-26 08:59:16.490 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 08:59:16 compute-1 nova_compute[183083]: 2026-01-26 08:59:16.491 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:59:16 compute-1 nova_compute[183083]: 2026-01-26 08:59:16.873 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:21 compute-1 nova_compute[183083]: 2026-01-26 08:59:21.134 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:21 compute-1 nova_compute[183083]: 2026-01-26 08:59:21.877 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:26 compute-1 nova_compute[183083]: 2026-01-26 08:59:26.179 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:26 compute-1 nova_compute[183083]: 2026-01-26 08:59:26.879 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:27 compute-1 podman[219410]: 2026-01-26 08:59:27.779641186 +0000 UTC m=+0.051150781 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 08:59:31 compute-1 nova_compute[183083]: 2026-01-26 08:59:31.183 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:31 compute-1 nova_compute[183083]: 2026-01-26 08:59:31.914 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:33 compute-1 nova_compute[183083]: 2026-01-26 08:59:33.838 183087 DEBUG oslo_concurrency.lockutils [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "a89a5221-3253-49f6-b902-67f973b0690e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:59:33 compute-1 nova_compute[183083]: 2026-01-26 08:59:33.839 183087 DEBUG oslo_concurrency.lockutils [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "a89a5221-3253-49f6-b902-67f973b0690e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:59:33 compute-1 sshd-session[219434]: Invalid user solv from 2.57.122.238 port 34858
Jan 26 08:59:33 compute-1 nova_compute[183083]: 2026-01-26 08:59:33.899 183087 DEBUG nova.compute.manager [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 08:59:33 compute-1 sshd-session[219434]: Connection closed by invalid user solv 2.57.122.238 port 34858 [preauth]
Jan 26 08:59:34 compute-1 nova_compute[183083]: 2026-01-26 08:59:34.093 183087 DEBUG oslo_concurrency.lockutils [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:59:34 compute-1 nova_compute[183083]: 2026-01-26 08:59:34.093 183087 DEBUG oslo_concurrency.lockutils [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:59:34 compute-1 nova_compute[183083]: 2026-01-26 08:59:34.106 183087 DEBUG nova.virt.hardware [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 08:59:34 compute-1 nova_compute[183083]: 2026-01-26 08:59:34.106 183087 INFO nova.compute.claims [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Claim successful on node compute-1.ctlplane.example.com
Jan 26 08:59:34 compute-1 nova_compute[183083]: 2026-01-26 08:59:34.318 183087 DEBUG nova.compute.provider_tree [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 08:59:34 compute-1 nova_compute[183083]: 2026-01-26 08:59:34.339 183087 DEBUG nova.scheduler.client.report [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 08:59:34 compute-1 nova_compute[183083]: 2026-01-26 08:59:34.361 183087 DEBUG oslo_concurrency.lockutils [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.268s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:59:34 compute-1 nova_compute[183083]: 2026-01-26 08:59:34.363 183087 DEBUG nova.compute.manager [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 08:59:34 compute-1 nova_compute[183083]: 2026-01-26 08:59:34.414 183087 DEBUG nova.compute.manager [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 08:59:34 compute-1 nova_compute[183083]: 2026-01-26 08:59:34.415 183087 DEBUG nova.network.neutron [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 08:59:34 compute-1 nova_compute[183083]: 2026-01-26 08:59:34.433 183087 INFO nova.virt.libvirt.driver [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 08:59:34 compute-1 nova_compute[183083]: 2026-01-26 08:59:34.465 183087 DEBUG nova.compute.manager [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 08:59:34 compute-1 nova_compute[183083]: 2026-01-26 08:59:34.565 183087 DEBUG nova.compute.manager [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 08:59:34 compute-1 nova_compute[183083]: 2026-01-26 08:59:34.567 183087 DEBUG nova.virt.libvirt.driver [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 08:59:34 compute-1 nova_compute[183083]: 2026-01-26 08:59:34.568 183087 INFO nova.virt.libvirt.driver [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Creating image(s)
Jan 26 08:59:34 compute-1 nova_compute[183083]: 2026-01-26 08:59:34.569 183087 DEBUG oslo_concurrency.lockutils [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "/var/lib/nova/instances/a89a5221-3253-49f6-b902-67f973b0690e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:59:34 compute-1 nova_compute[183083]: 2026-01-26 08:59:34.569 183087 DEBUG oslo_concurrency.lockutils [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "/var/lib/nova/instances/a89a5221-3253-49f6-b902-67f973b0690e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:59:34 compute-1 nova_compute[183083]: 2026-01-26 08:59:34.570 183087 DEBUG oslo_concurrency.lockutils [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "/var/lib/nova/instances/a89a5221-3253-49f6-b902-67f973b0690e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:59:34 compute-1 nova_compute[183083]: 2026-01-26 08:59:34.596 183087 DEBUG oslo_concurrency.processutils [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:59:34 compute-1 nova_compute[183083]: 2026-01-26 08:59:34.665 183087 DEBUG oslo_concurrency.processutils [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:59:34 compute-1 nova_compute[183083]: 2026-01-26 08:59:34.667 183087 DEBUG oslo_concurrency.lockutils [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:59:34 compute-1 nova_compute[183083]: 2026-01-26 08:59:34.668 183087 DEBUG oslo_concurrency.lockutils [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:59:34 compute-1 nova_compute[183083]: 2026-01-26 08:59:34.685 183087 DEBUG oslo_concurrency.processutils [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:59:34 compute-1 nova_compute[183083]: 2026-01-26 08:59:34.768 183087 DEBUG oslo_concurrency.processutils [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:59:34 compute-1 nova_compute[183083]: 2026-01-26 08:59:34.770 183087 DEBUG oslo_concurrency.processutils [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/a89a5221-3253-49f6-b902-67f973b0690e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:59:34 compute-1 nova_compute[183083]: 2026-01-26 08:59:34.813 183087 DEBUG oslo_concurrency.processutils [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/a89a5221-3253-49f6-b902-67f973b0690e/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:59:34 compute-1 nova_compute[183083]: 2026-01-26 08:59:34.816 183087 DEBUG oslo_concurrency.lockutils [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:59:34 compute-1 nova_compute[183083]: 2026-01-26 08:59:34.817 183087 DEBUG oslo_concurrency.processutils [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:59:34 compute-1 nova_compute[183083]: 2026-01-26 08:59:34.905 183087 DEBUG oslo_concurrency.processutils [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:59:34 compute-1 nova_compute[183083]: 2026-01-26 08:59:34.908 183087 DEBUG nova.virt.disk.api [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Checking if we can resize image /var/lib/nova/instances/a89a5221-3253-49f6-b902-67f973b0690e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 26 08:59:34 compute-1 nova_compute[183083]: 2026-01-26 08:59:34.909 183087 DEBUG oslo_concurrency.processutils [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a89a5221-3253-49f6-b902-67f973b0690e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:59:34 compute-1 nova_compute[183083]: 2026-01-26 08:59:34.994 183087 DEBUG oslo_concurrency.processutils [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a89a5221-3253-49f6-b902-67f973b0690e/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:59:34 compute-1 nova_compute[183083]: 2026-01-26 08:59:34.996 183087 DEBUG nova.virt.disk.api [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Cannot resize image /var/lib/nova/instances/a89a5221-3253-49f6-b902-67f973b0690e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 26 08:59:34 compute-1 nova_compute[183083]: 2026-01-26 08:59:34.996 183087 DEBUG nova.objects.instance [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lazy-loading 'migration_context' on Instance uuid a89a5221-3253-49f6-b902-67f973b0690e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:59:35 compute-1 nova_compute[183083]: 2026-01-26 08:59:35.014 183087 DEBUG nova.virt.libvirt.driver [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 26 08:59:35 compute-1 nova_compute[183083]: 2026-01-26 08:59:35.015 183087 DEBUG nova.virt.libvirt.driver [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Ensure instance console log exists: /var/lib/nova/instances/a89a5221-3253-49f6-b902-67f973b0690e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 26 08:59:35 compute-1 nova_compute[183083]: 2026-01-26 08:59:35.016 183087 DEBUG oslo_concurrency.lockutils [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:59:35 compute-1 nova_compute[183083]: 2026-01-26 08:59:35.016 183087 DEBUG oslo_concurrency.lockutils [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:59:35 compute-1 nova_compute[183083]: 2026-01-26 08:59:35.017 183087 DEBUG oslo_concurrency.lockutils [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:59:36 compute-1 nova_compute[183083]: 2026-01-26 08:59:36.187 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:36 compute-1 nova_compute[183083]: 2026-01-26 08:59:36.919 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:38 compute-1 nova_compute[183083]: 2026-01-26 08:59:38.676 183087 DEBUG nova.network.neutron [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Successfully created port: e235b615-3ab0-49d4-9c0d-a4d905192bd6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 08:59:39 compute-1 nova_compute[183083]: 2026-01-26 08:59:39.800 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:39 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:59:39.801 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:59:39 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:59:39.802 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 08:59:41 compute-1 nova_compute[183083]: 2026-01-26 08:59:41.189 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:41 compute-1 nova_compute[183083]: 2026-01-26 08:59:41.530 183087 DEBUG nova.network.neutron [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Successfully updated port: e235b615-3ab0-49d4-9c0d-a4d905192bd6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 08:59:41 compute-1 nova_compute[183083]: 2026-01-26 08:59:41.579 183087 DEBUG oslo_concurrency.lockutils [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "refresh_cache-a89a5221-3253-49f6-b902-67f973b0690e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:59:41 compute-1 nova_compute[183083]: 2026-01-26 08:59:41.579 183087 DEBUG oslo_concurrency.lockutils [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquired lock "refresh_cache-a89a5221-3253-49f6-b902-67f973b0690e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:59:41 compute-1 nova_compute[183083]: 2026-01-26 08:59:41.580 183087 DEBUG nova.network.neutron [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 08:59:41 compute-1 nova_compute[183083]: 2026-01-26 08:59:41.710 183087 DEBUG nova.compute.manager [req-9932fdb8-380c-4fc4-836c-489f2e22f831 req-16461247-c6f6-4802-8325-01f7d31a7cbe 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Received event network-changed-e235b615-3ab0-49d4-9c0d-a4d905192bd6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:59:41 compute-1 nova_compute[183083]: 2026-01-26 08:59:41.710 183087 DEBUG nova.compute.manager [req-9932fdb8-380c-4fc4-836c-489f2e22f831 req-16461247-c6f6-4802-8325-01f7d31a7cbe 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Refreshing instance network info cache due to event network-changed-e235b615-3ab0-49d4-9c0d-a4d905192bd6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 08:59:41 compute-1 nova_compute[183083]: 2026-01-26 08:59:41.711 183087 DEBUG oslo_concurrency.lockutils [req-9932fdb8-380c-4fc4-836c-489f2e22f831 req-16461247-c6f6-4802-8325-01f7d31a7cbe 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-a89a5221-3253-49f6-b902-67f973b0690e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 08:59:41 compute-1 nova_compute[183083]: 2026-01-26 08:59:41.820 183087 DEBUG nova.network.neutron [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 08:59:41 compute-1 nova_compute[183083]: 2026-01-26 08:59:41.922 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:42 compute-1 podman[219451]: 2026-01-26 08:59:42.815035198 +0000 UTC m=+0.069211434 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 26 08:59:42 compute-1 podman[219452]: 2026-01-26 08:59:42.823305926 +0000 UTC m=+0.076572806 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, managed_by=edpm_ansible, config_id=openstack_network_exporter)
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.109 183087 DEBUG nova.network.neutron [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Updating instance_info_cache with network_info: [{"id": "e235b615-3ab0-49d4-9c0d-a4d905192bd6", "address": "fa:16:3e:f0:4e:56", "network": {"id": "902c250a-6b5f-40de-85f8-6172556f9918", "bridge": "br-int", "label": "tempest-test-network--2054667150", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape235b615-3a", "ovs_interfaceid": "e235b615-3ab0-49d4-9c0d-a4d905192bd6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.136 183087 DEBUG oslo_concurrency.lockutils [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Releasing lock "refresh_cache-a89a5221-3253-49f6-b902-67f973b0690e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.137 183087 DEBUG nova.compute.manager [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Instance network_info: |[{"id": "e235b615-3ab0-49d4-9c0d-a4d905192bd6", "address": "fa:16:3e:f0:4e:56", "network": {"id": "902c250a-6b5f-40de-85f8-6172556f9918", "bridge": "br-int", "label": "tempest-test-network--2054667150", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape235b615-3a", "ovs_interfaceid": "e235b615-3ab0-49d4-9c0d-a4d905192bd6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.138 183087 DEBUG oslo_concurrency.lockutils [req-9932fdb8-380c-4fc4-836c-489f2e22f831 req-16461247-c6f6-4802-8325-01f7d31a7cbe 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-a89a5221-3253-49f6-b902-67f973b0690e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.139 183087 DEBUG nova.network.neutron [req-9932fdb8-380c-4fc4-836c-489f2e22f831 req-16461247-c6f6-4802-8325-01f7d31a7cbe 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Refreshing network info cache for port e235b615-3ab0-49d4-9c0d-a4d905192bd6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.143 183087 DEBUG nova.virt.libvirt.driver [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Start _get_guest_xml network_info=[{"id": "e235b615-3ab0-49d4-9c0d-a4d905192bd6", "address": "fa:16:3e:f0:4e:56", "network": {"id": "902c250a-6b5f-40de-85f8-6172556f9918", "bridge": "br-int", "label": "tempest-test-network--2054667150", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape235b615-3a", "ovs_interfaceid": "e235b615-3ab0-49d4-9c0d-a4d905192bd6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T08:43:05Z,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aa88264c487a43b2855a58eb0dd042c9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T08:43:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encryption_options': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.149 183087 WARNING nova.virt.libvirt.driver [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.153 183087 DEBUG nova.virt.libvirt.host [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.154 183087 DEBUG nova.virt.libvirt.host [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.156 183087 DEBUG nova.virt.libvirt.host [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.157 183087 DEBUG nova.virt.libvirt.host [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.158 183087 DEBUG nova.virt.libvirt.driver [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.158 183087 DEBUG nova.virt.hardware [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T08:43:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a7017323-cd35-470d-a718-faa6c6e97277',id=2,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T08:43:05Z,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aa88264c487a43b2855a58eb0dd042c9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T08:43:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.159 183087 DEBUG nova.virt.hardware [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.159 183087 DEBUG nova.virt.hardware [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.159 183087 DEBUG nova.virt.hardware [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.160 183087 DEBUG nova.virt.hardware [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.160 183087 DEBUG nova.virt.hardware [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.160 183087 DEBUG nova.virt.hardware [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.161 183087 DEBUG nova.virt.hardware [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.161 183087 DEBUG nova.virt.hardware [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.161 183087 DEBUG nova.virt.hardware [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.161 183087 DEBUG nova.virt.hardware [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.169 183087 DEBUG nova.virt.libvirt.vif [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:59:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-server-test-78344759',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-78344759',id=46,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBCdzwlLhB6FbGto6B+9xUIGqDD4Nkmb9VT0Y4hVPwUFt+Pnvt+qUg7O8mk7/7/EcjP1qy2gKBsKzD7Rm3HgaxefrKiDsUNb9XIKWFAVeuw+MyxY4GcVzoBujypjmrmUqA==',key_name='tempest-keypair-1601993815',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2580bb16c90849c4b5919eb271774a06',ramdisk_id='',reservation_id='r-008vt5ta',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',ima
ge_min_ram='0',network_allocated='True',owner_project_name='tempest-OvnDvrTest-691788706',owner_user_name='tempest-OvnDvrTest-691788706-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:59:34Z,user_data=None,user_id='90104736f4ab4d81b09d1ff11e40f454',uuid=a89a5221-3253-49f6-b902-67f973b0690e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e235b615-3ab0-49d4-9c0d-a4d905192bd6", "address": "fa:16:3e:f0:4e:56", "network": {"id": "902c250a-6b5f-40de-85f8-6172556f9918", "bridge": "br-int", "label": "tempest-test-network--2054667150", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape235b615-3a", "ovs_interfaceid": "e235b615-3ab0-49d4-9c0d-a4d905192bd6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.170 183087 DEBUG nova.network.os_vif_util [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Converting VIF {"id": "e235b615-3ab0-49d4-9c0d-a4d905192bd6", "address": "fa:16:3e:f0:4e:56", "network": {"id": "902c250a-6b5f-40de-85f8-6172556f9918", "bridge": "br-int", "label": "tempest-test-network--2054667150", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape235b615-3a", "ovs_interfaceid": "e235b615-3ab0-49d4-9c0d-a4d905192bd6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.171 183087 DEBUG nova.network.os_vif_util [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f0:4e:56,bridge_name='br-int',has_traffic_filtering=True,id=e235b615-3ab0-49d4-9c0d-a4d905192bd6,network=Network(902c250a-6b5f-40de-85f8-6172556f9918),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape235b615-3a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.173 183087 DEBUG nova.objects.instance [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lazy-loading 'pci_devices' on Instance uuid a89a5221-3253-49f6-b902-67f973b0690e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.194 183087 DEBUG nova.virt.libvirt.driver [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] End _get_guest_xml xml=<domain type="kvm">
Jan 26 08:59:43 compute-1 nova_compute[183083]:   <uuid>a89a5221-3253-49f6-b902-67f973b0690e</uuid>
Jan 26 08:59:43 compute-1 nova_compute[183083]:   <name>instance-0000002e</name>
Jan 26 08:59:43 compute-1 nova_compute[183083]:   <memory>131072</memory>
Jan 26 08:59:43 compute-1 nova_compute[183083]:   <vcpu>1</vcpu>
Jan 26 08:59:43 compute-1 nova_compute[183083]:   <metadata>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 08:59:43 compute-1 nova_compute[183083]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:       <nova:name>tempest-server-test-78344759</nova:name>
Jan 26 08:59:43 compute-1 nova_compute[183083]:       <nova:creationTime>2026-01-26 08:59:43</nova:creationTime>
Jan 26 08:59:43 compute-1 nova_compute[183083]:       <nova:flavor name="m1.nano">
Jan 26 08:59:43 compute-1 nova_compute[183083]:         <nova:memory>128</nova:memory>
Jan 26 08:59:43 compute-1 nova_compute[183083]:         <nova:disk>1</nova:disk>
Jan 26 08:59:43 compute-1 nova_compute[183083]:         <nova:swap>0</nova:swap>
Jan 26 08:59:43 compute-1 nova_compute[183083]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 08:59:43 compute-1 nova_compute[183083]:         <nova:vcpus>1</nova:vcpus>
Jan 26 08:59:43 compute-1 nova_compute[183083]:       </nova:flavor>
Jan 26 08:59:43 compute-1 nova_compute[183083]:       <nova:owner>
Jan 26 08:59:43 compute-1 nova_compute[183083]:         <nova:user uuid="90104736f4ab4d81b09d1ff11e40f454">tempest-OvnDvrTest-691788706-project-admin</nova:user>
Jan 26 08:59:43 compute-1 nova_compute[183083]:         <nova:project uuid="2580bb16c90849c4b5919eb271774a06">tempest-OvnDvrTest-691788706</nova:project>
Jan 26 08:59:43 compute-1 nova_compute[183083]:       </nova:owner>
Jan 26 08:59:43 compute-1 nova_compute[183083]:       <nova:root type="image" uuid="13d1a20a-8003-4f19-aba7-ccbd9eff9b82"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:       <nova:ports>
Jan 26 08:59:43 compute-1 nova_compute[183083]:         <nova:port uuid="e235b615-3ab0-49d4-9c0d-a4d905192bd6">
Jan 26 08:59:43 compute-1 nova_compute[183083]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:         </nova:port>
Jan 26 08:59:43 compute-1 nova_compute[183083]:       </nova:ports>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     </nova:instance>
Jan 26 08:59:43 compute-1 nova_compute[183083]:   </metadata>
Jan 26 08:59:43 compute-1 nova_compute[183083]:   <sysinfo type="smbios">
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <system>
Jan 26 08:59:43 compute-1 nova_compute[183083]:       <entry name="manufacturer">RDO</entry>
Jan 26 08:59:43 compute-1 nova_compute[183083]:       <entry name="product">OpenStack Compute</entry>
Jan 26 08:59:43 compute-1 nova_compute[183083]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 08:59:43 compute-1 nova_compute[183083]:       <entry name="serial">a89a5221-3253-49f6-b902-67f973b0690e</entry>
Jan 26 08:59:43 compute-1 nova_compute[183083]:       <entry name="uuid">a89a5221-3253-49f6-b902-67f973b0690e</entry>
Jan 26 08:59:43 compute-1 nova_compute[183083]:       <entry name="family">Virtual Machine</entry>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     </system>
Jan 26 08:59:43 compute-1 nova_compute[183083]:   </sysinfo>
Jan 26 08:59:43 compute-1 nova_compute[183083]:   <os>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <boot dev="hd"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <smbios mode="sysinfo"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:   </os>
Jan 26 08:59:43 compute-1 nova_compute[183083]:   <features>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <acpi/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <apic/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <vmcoreinfo/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:   </features>
Jan 26 08:59:43 compute-1 nova_compute[183083]:   <clock offset="utc">
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <timer name="hpet" present="no"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:   </clock>
Jan 26 08:59:43 compute-1 nova_compute[183083]:   <cpu mode="host-model" match="exact">
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:   </cpu>
Jan 26 08:59:43 compute-1 nova_compute[183083]:   <devices>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <disk type="file" device="disk">
Jan 26 08:59:43 compute-1 nova_compute[183083]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/a89a5221-3253-49f6-b902-67f973b0690e/disk"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:       <target dev="vda" bus="virtio"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     </disk>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <disk type="file" device="cdrom">
Jan 26 08:59:43 compute-1 nova_compute[183083]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/a89a5221-3253-49f6-b902-67f973b0690e/disk.config"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:       <target dev="sda" bus="sata"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     </disk>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <interface type="ethernet">
Jan 26 08:59:43 compute-1 nova_compute[183083]:       <mac address="fa:16:3e:f0:4e:56"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:       <mtu size="1342"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:       <target dev="tape235b615-3a"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     </interface>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <serial type="pty">
Jan 26 08:59:43 compute-1 nova_compute[183083]:       <log file="/var/lib/nova/instances/a89a5221-3253-49f6-b902-67f973b0690e/console.log" append="off"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     </serial>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <video>
Jan 26 08:59:43 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     </video>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <input type="tablet" bus="usb"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <rng model="virtio">
Jan 26 08:59:43 compute-1 nova_compute[183083]:       <backend model="random">/dev/urandom</backend>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     </rng>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <controller type="usb" index="0"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     <memballoon model="virtio">
Jan 26 08:59:43 compute-1 nova_compute[183083]:       <stats period="10"/>
Jan 26 08:59:43 compute-1 nova_compute[183083]:     </memballoon>
Jan 26 08:59:43 compute-1 nova_compute[183083]:   </devices>
Jan 26 08:59:43 compute-1 nova_compute[183083]: </domain>
Jan 26 08:59:43 compute-1 nova_compute[183083]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.196 183087 DEBUG nova.compute.manager [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Preparing to wait for external event network-vif-plugged-e235b615-3ab0-49d4-9c0d-a4d905192bd6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.196 183087 DEBUG oslo_concurrency.lockutils [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "a89a5221-3253-49f6-b902-67f973b0690e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.196 183087 DEBUG oslo_concurrency.lockutils [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "a89a5221-3253-49f6-b902-67f973b0690e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.196 183087 DEBUG oslo_concurrency.lockutils [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "a89a5221-3253-49f6-b902-67f973b0690e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.197 183087 DEBUG nova.virt.libvirt.vif [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T08:59:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-server-test-78344759',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-78344759',id=46,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBCdzwlLhB6FbGto6B+9xUIGqDD4Nkmb9VT0Y4hVPwUFt+Pnvt+qUg7O8mk7/7/EcjP1qy2gKBsKzD7Rm3HgaxefrKiDsUNb9XIKWFAVeuw+MyxY4GcVzoBujypjmrmUqA==',key_name='tempest-keypair-1601993815',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2580bb16c90849c4b5919eb271774a06',ramdisk_id='',reservation_id='r-008vt5ta',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_di
sk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-OvnDvrTest-691788706',owner_user_name='tempest-OvnDvrTest-691788706-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T08:59:34Z,user_data=None,user_id='90104736f4ab4d81b09d1ff11e40f454',uuid=a89a5221-3253-49f6-b902-67f973b0690e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e235b615-3ab0-49d4-9c0d-a4d905192bd6", "address": "fa:16:3e:f0:4e:56", "network": {"id": "902c250a-6b5f-40de-85f8-6172556f9918", "bridge": "br-int", "label": "tempest-test-network--2054667150", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape235b615-3a", "ovs_interfaceid": "e235b615-3ab0-49d4-9c0d-a4d905192bd6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.198 183087 DEBUG nova.network.os_vif_util [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Converting VIF {"id": "e235b615-3ab0-49d4-9c0d-a4d905192bd6", "address": "fa:16:3e:f0:4e:56", "network": {"id": "902c250a-6b5f-40de-85f8-6172556f9918", "bridge": "br-int", "label": "tempest-test-network--2054667150", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape235b615-3a", "ovs_interfaceid": "e235b615-3ab0-49d4-9c0d-a4d905192bd6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.199 183087 DEBUG nova.network.os_vif_util [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f0:4e:56,bridge_name='br-int',has_traffic_filtering=True,id=e235b615-3ab0-49d4-9c0d-a4d905192bd6,network=Network(902c250a-6b5f-40de-85f8-6172556f9918),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape235b615-3a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.199 183087 DEBUG os_vif [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f0:4e:56,bridge_name='br-int',has_traffic_filtering=True,id=e235b615-3ab0-49d4-9c0d-a4d905192bd6,network=Network(902c250a-6b5f-40de-85f8-6172556f9918),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape235b615-3a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.200 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.200 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.201 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.204 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.204 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape235b615-3a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.205 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape235b615-3a, col_values=(('external_ids', {'iface-id': 'e235b615-3ab0-49d4-9c0d-a4d905192bd6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f0:4e:56', 'vm-uuid': 'a89a5221-3253-49f6-b902-67f973b0690e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.206 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:43 compute-1 NetworkManager[55451]: <info>  [1769417983.2077] manager: (tape235b615-3a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.209 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.214 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.215 183087 INFO os_vif [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f0:4e:56,bridge_name='br-int',has_traffic_filtering=True,id=e235b615-3ab0-49d4-9c0d-a4d905192bd6,network=Network(902c250a-6b5f-40de-85f8-6172556f9918),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape235b615-3a')
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.271 183087 DEBUG nova.virt.libvirt.driver [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.271 183087 DEBUG nova.virt.libvirt.driver [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.271 183087 DEBUG nova.virt.libvirt.driver [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] No VIF found with MAC fa:16:3e:f0:4e:56, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.272 183087 INFO nova.virt.libvirt.driver [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Using config drive
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.750 183087 INFO nova.virt.libvirt.driver [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Creating config drive at /var/lib/nova/instances/a89a5221-3253-49f6-b902-67f973b0690e/disk.config
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.759 183087 DEBUG oslo_concurrency.processutils [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a89a5221-3253-49f6-b902-67f973b0690e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwifbkyoh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 08:59:43 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:59:43.805 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.897 183087 DEBUG oslo_concurrency.processutils [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a89a5221-3253-49f6-b902-67f973b0690e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwifbkyoh" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 08:59:43 compute-1 kernel: tape235b615-3a: entered promiscuous mode
Jan 26 08:59:43 compute-1 ovn_controller[95352]: 2026-01-26T08:59:43Z|00239|binding|INFO|Claiming lport e235b615-3ab0-49d4-9c0d-a4d905192bd6 for this chassis.
Jan 26 08:59:43 compute-1 ovn_controller[95352]: 2026-01-26T08:59:43Z|00240|binding|INFO|e235b615-3ab0-49d4-9c0d-a4d905192bd6: Claiming fa:16:3e:f0:4e:56 10.100.0.5
Jan 26 08:59:43 compute-1 NetworkManager[55451]: <info>  [1769417983.9606] manager: (tape235b615-3a): new Tun device (/org/freedesktop/NetworkManager/Devices/85)
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.959 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.963 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:43 compute-1 nova_compute[183083]: 2026-01-26 08:59:43.969 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:43 compute-1 NetworkManager[55451]: <info>  [1769417983.9702] manager: (patch-br-int-to-provnet-149e76db-406a-40c9-b6a7-879b1da420de): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Jan 26 08:59:43 compute-1 NetworkManager[55451]: <info>  [1769417983.9710] manager: (patch-provnet-149e76db-406a-40c9-b6a7-879b1da420de-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Jan 26 08:59:43 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:59:43.974 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:4e:56 10.100.0.5'], port_security=['fa:16:3e:f0:4e:56 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-902c250a-6b5f-40de-85f8-6172556f9918', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2580bb16c90849c4b5919eb271774a06', 'neutron:revision_number': '2', 'neutron:security_group_ids': '00734c5e-2a15-43b1-a106-6b4708879098', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9161e928-a360-45a0-86d0-eb6f299d1fc7, chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=e235b615-3ab0-49d4-9c0d-a4d905192bd6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 08:59:43 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:59:43.976 104632 INFO neutron.agent.ovn.metadata.agent [-] Port e235b615-3ab0-49d4-9c0d-a4d905192bd6 in datapath 902c250a-6b5f-40de-85f8-6172556f9918 bound to our chassis
Jan 26 08:59:43 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:59:43.978 104632 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 902c250a-6b5f-40de-85f8-6172556f9918
Jan 26 08:59:43 compute-1 systemd-udevd[219508]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 08:59:43 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:59:43.992 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[2b4479a5-ec0e-4317-b248-fc378874f44e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:59:43 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:59:43.993 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap902c250a-61 in ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 26 08:59:43 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:59:43.995 212483 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap902c250a-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 26 08:59:43 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:59:43.995 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[0997938c-3f39-489c-bea0-aade26863296]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:59:43 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:59:43.996 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[bef4ae10-cad4-46ae-af9d-aeac01ad94b4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:59:44 compute-1 NetworkManager[55451]: <info>  [1769417984.0056] device (tape235b615-3a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 08:59:44 compute-1 NetworkManager[55451]: <info>  [1769417984.0062] device (tape235b615-3a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:59:44.009 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[baccc136-dce2-44d6-bdfb-d2213c5f5567]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:59:44 compute-1 systemd-machined[154360]: New machine qemu-14-instance-0000002e.
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:59:44.039 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[75d97539-8526-4745-9995-f52c16bf20eb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:59:44 compute-1 systemd[1]: Started Virtual Machine qemu-14-instance-0000002e.
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.048 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.058 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:44 compute-1 ovn_controller[95352]: 2026-01-26T08:59:44Z|00241|binding|INFO|Setting lport e235b615-3ab0-49d4-9c0d-a4d905192bd6 ovn-installed in OVS
Jan 26 08:59:44 compute-1 ovn_controller[95352]: 2026-01-26T08:59:44Z|00242|binding|INFO|Setting lport e235b615-3ab0-49d4-9c0d-a4d905192bd6 up in Southbound
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.068 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:59:44.074 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[0a74e5fe-bfc4-422e-8a82-c3cd5544a8d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:59:44 compute-1 NetworkManager[55451]: <info>  [1769417984.0812] manager: (tap902c250a-60): new Veth device (/org/freedesktop/NetworkManager/Devices/88)
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:59:44.081 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[26048c37-105c-452f-a7bd-dbc5b30d89fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:59:44.116 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[06a42934-f6d4-4682-93e5-9ee5fc84a82d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:59:44.120 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[442a6f2c-95f0-4252-964d-a9e9c50b58ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:59:44 compute-1 NetworkManager[55451]: <info>  [1769417984.1449] device (tap902c250a-60): carrier: link connected
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:59:44.152 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[d9391d90-0f56-4b63-9b7a-d80819ebd150]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:59:44.170 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[248f0b5a-9333-45ed-9551-2ea51daf22e5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap902c250a-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:31:c2:fc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 432475, 'reachable_time': 21903, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219544, 'error': None, 'target': 'ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:59:44.191 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[b24334a3-714f-460e-acd9-9bb6fdaec787]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe31:c2fc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 432475, 'tstamp': 432475}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219545, 'error': None, 'target': 'ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:59:44.206 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[5ec197d8-92ee-40a7-8276-66f2b3cc63be]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap902c250a-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:31:c2:fc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 432475, 'reachable_time': 21903, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219546, 'error': None, 'target': 'ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:59:44.239 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[85769b6f-4769-4a71-9f55-3b35211f097f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.291 183087 DEBUG nova.compute.manager [req-b21cc417-2a2b-44c0-b06e-5263b8e82148 req-3e063746-7955-45a7-8b2a-6fd4925fe62a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Received event network-vif-plugged-e235b615-3ab0-49d4-9c0d-a4d905192bd6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.293 183087 DEBUG oslo_concurrency.lockutils [req-b21cc417-2a2b-44c0-b06e-5263b8e82148 req-3e063746-7955-45a7-8b2a-6fd4925fe62a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "a89a5221-3253-49f6-b902-67f973b0690e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.293 183087 DEBUG oslo_concurrency.lockutils [req-b21cc417-2a2b-44c0-b06e-5263b8e82148 req-3e063746-7955-45a7-8b2a-6fd4925fe62a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "a89a5221-3253-49f6-b902-67f973b0690e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.294 183087 DEBUG oslo_concurrency.lockutils [req-b21cc417-2a2b-44c0-b06e-5263b8e82148 req-3e063746-7955-45a7-8b2a-6fd4925fe62a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "a89a5221-3253-49f6-b902-67f973b0690e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.294 183087 DEBUG nova.compute.manager [req-b21cc417-2a2b-44c0-b06e-5263b8e82148 req-3e063746-7955-45a7-8b2a-6fd4925fe62a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Processing event network-vif-plugged-e235b615-3ab0-49d4-9c0d-a4d905192bd6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:59:44.306 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[fc190231-5ab7-4f32-a6ea-4af71a293183]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:59:44.308 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap902c250a-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:59:44.308 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:59:44.308 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap902c250a-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:59:44 compute-1 NetworkManager[55451]: <info>  [1769417984.3107] manager: (tap902c250a-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Jan 26 08:59:44 compute-1 kernel: tap902c250a-60: entered promiscuous mode
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.311 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.313 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:59:44.313 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap902c250a-60, col_values=(('external_ids', {'iface-id': 'c9c76eb7-dd25-4862-b654-cdfd8369f343'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 08:59:44 compute-1 ovn_controller[95352]: 2026-01-26T08:59:44Z|00243|binding|INFO|Releasing lport c9c76eb7-dd25-4862-b654-cdfd8369f343 from this chassis (sb_readonly=0)
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.314 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.330 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:59:44.331 104632 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/902c250a-6b5f-40de-85f8-6172556f9918.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/902c250a-6b5f-40de-85f8-6172556f9918.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:59:44.332 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[a9127f7f-347f-4a1a-9ed0-784bfddb39a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:59:44.333 104632 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]: global
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]:     log         /dev/log local0 debug
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]:     log-tag     haproxy-metadata-proxy-902c250a-6b5f-40de-85f8-6172556f9918
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]:     user        root
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]:     group       root
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]:     maxconn     1024
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]:     pidfile     /var/lib/neutron/external/pids/902c250a-6b5f-40de-85f8-6172556f9918.pid.haproxy
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]:     daemon
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]: defaults
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]:     log global
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]:     mode http
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]:     option httplog
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]:     option dontlognull
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]:     option http-server-close
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]:     option forwardfor
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]:     retries                 3
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]:     timeout http-request    30s
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]:     timeout connect         30s
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]:     timeout client          32s
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]:     timeout server          32s
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]:     timeout http-keep-alive 30s
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]: 
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]: listen listener
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]:     bind 169.254.169.254:80
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]:     http-request add-header X-OVN-Network-ID 902c250a-6b5f-40de-85f8-6172556f9918
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 26 08:59:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 08:59:44.334 104632 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918', 'env', 'PROCESS_TAG=haproxy-902c250a-6b5f-40de-85f8-6172556f9918', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/902c250a-6b5f-40de-85f8-6172556f9918.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.343 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417984.3425741, a89a5221-3253-49f6-b902-67f973b0690e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.343 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: a89a5221-3253-49f6-b902-67f973b0690e] VM Started (Lifecycle Event)
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.345 183087 DEBUG nova.compute.manager [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.349 183087 DEBUG nova.virt.libvirt.driver [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.352 183087 INFO nova.virt.libvirt.driver [-] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Instance spawned successfully.
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.352 183087 DEBUG nova.virt.libvirt.driver [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.369 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.373 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.380 183087 DEBUG nova.virt.libvirt.driver [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.381 183087 DEBUG nova.virt.libvirt.driver [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.381 183087 DEBUG nova.virt.libvirt.driver [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.382 183087 DEBUG nova.virt.libvirt.driver [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.382 183087 DEBUG nova.virt.libvirt.driver [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.383 183087 DEBUG nova.virt.libvirt.driver [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.393 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: a89a5221-3253-49f6-b902-67f973b0690e] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.394 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417984.3427618, a89a5221-3253-49f6-b902-67f973b0690e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.394 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: a89a5221-3253-49f6-b902-67f973b0690e] VM Paused (Lifecycle Event)
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.421 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.424 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769417984.348989, a89a5221-3253-49f6-b902-67f973b0690e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.425 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: a89a5221-3253-49f6-b902-67f973b0690e] VM Resumed (Lifecycle Event)
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.453 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.457 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.461 183087 INFO nova.compute.manager [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Took 9.90 seconds to spawn the instance on the hypervisor.
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.461 183087 DEBUG nova.compute.manager [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.491 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: a89a5221-3253-49f6-b902-67f973b0690e] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.525 183087 INFO nova.compute.manager [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Took 10.47 seconds to build instance.
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.564 183087 DEBUG oslo_concurrency.lockutils [None req-651d4f7b-6f72-4c89-bd73-3a857cdac44d 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "a89a5221-3253-49f6-b902-67f973b0690e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.726s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:59:44 compute-1 podman[219585]: 2026-01-26 08:59:44.687489606 +0000 UTC m=+0.054841696 container create 8c852493079d11537f357c41059924a79a196dc85708484d10d03ca04a7e3dc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.710 183087 DEBUG nova.network.neutron [req-9932fdb8-380c-4fc4-836c-489f2e22f831 req-16461247-c6f6-4802-8325-01f7d31a7cbe 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Updated VIF entry in instance network info cache for port e235b615-3ab0-49d4-9c0d-a4d905192bd6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.711 183087 DEBUG nova.network.neutron [req-9932fdb8-380c-4fc4-836c-489f2e22f831 req-16461247-c6f6-4802-8325-01f7d31a7cbe 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Updating instance_info_cache with network_info: [{"id": "e235b615-3ab0-49d4-9c0d-a4d905192bd6", "address": "fa:16:3e:f0:4e:56", "network": {"id": "902c250a-6b5f-40de-85f8-6172556f9918", "bridge": "br-int", "label": "tempest-test-network--2054667150", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape235b615-3a", "ovs_interfaceid": "e235b615-3ab0-49d4-9c0d-a4d905192bd6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 08:59:44 compute-1 systemd[1]: Started libpod-conmon-8c852493079d11537f357c41059924a79a196dc85708484d10d03ca04a7e3dc2.scope.
Jan 26 08:59:44 compute-1 nova_compute[183083]: 2026-01-26 08:59:44.726 183087 DEBUG oslo_concurrency.lockutils [req-9932fdb8-380c-4fc4-836c-489f2e22f831 req-16461247-c6f6-4802-8325-01f7d31a7cbe 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-a89a5221-3253-49f6-b902-67f973b0690e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 08:59:44 compute-1 systemd[1]: Started libcrun container.
Jan 26 08:59:44 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/537d7e84c802a95c5dcb83dc10b46eb6604f5a4a5d2cbd29571c00a498d7a7fc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 08:59:44 compute-1 podman[219585]: 2026-01-26 08:59:44.656349655 +0000 UTC m=+0.023701765 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 08:59:44 compute-1 podman[219585]: 2026-01-26 08:59:44.771049312 +0000 UTC m=+0.138401422 container init 8c852493079d11537f357c41059924a79a196dc85708484d10d03ca04a7e3dc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 26 08:59:44 compute-1 podman[219585]: 2026-01-26 08:59:44.777006132 +0000 UTC m=+0.144358212 container start 8c852493079d11537f357c41059924a79a196dc85708484d10d03ca04a7e3dc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 26 08:59:44 compute-1 neutron-haproxy-ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918[219600]: [NOTICE]   (219604) : New worker (219606) forked
Jan 26 08:59:44 compute-1 neutron-haproxy-ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918[219600]: [NOTICE]   (219604) : Loading success.
Jan 26 08:59:46 compute-1 nova_compute[183083]: 2026-01-26 08:59:46.397 183087 DEBUG nova.compute.manager [req-20a942a7-26de-4c83-9a3b-23c986365508 req-40897f8a-6366-4b5b-a5ab-5fa1c4c83bb9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Received event network-vif-plugged-e235b615-3ab0-49d4-9c0d-a4d905192bd6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 08:59:46 compute-1 nova_compute[183083]: 2026-01-26 08:59:46.398 183087 DEBUG oslo_concurrency.lockutils [req-20a942a7-26de-4c83-9a3b-23c986365508 req-40897f8a-6366-4b5b-a5ab-5fa1c4c83bb9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "a89a5221-3253-49f6-b902-67f973b0690e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 08:59:46 compute-1 nova_compute[183083]: 2026-01-26 08:59:46.399 183087 DEBUG oslo_concurrency.lockutils [req-20a942a7-26de-4c83-9a3b-23c986365508 req-40897f8a-6366-4b5b-a5ab-5fa1c4c83bb9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "a89a5221-3253-49f6-b902-67f973b0690e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 08:59:46 compute-1 nova_compute[183083]: 2026-01-26 08:59:46.400 183087 DEBUG oslo_concurrency.lockutils [req-20a942a7-26de-4c83-9a3b-23c986365508 req-40897f8a-6366-4b5b-a5ab-5fa1c4c83bb9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "a89a5221-3253-49f6-b902-67f973b0690e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 08:59:46 compute-1 nova_compute[183083]: 2026-01-26 08:59:46.400 183087 DEBUG nova.compute.manager [req-20a942a7-26de-4c83-9a3b-23c986365508 req-40897f8a-6366-4b5b-a5ab-5fa1c4c83bb9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] No waiting events found dispatching network-vif-plugged-e235b615-3ab0-49d4-9c0d-a4d905192bd6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 08:59:46 compute-1 nova_compute[183083]: 2026-01-26 08:59:46.401 183087 WARNING nova.compute.manager [req-20a942a7-26de-4c83-9a3b-23c986365508 req-40897f8a-6366-4b5b-a5ab-5fa1c4c83bb9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Received unexpected event network-vif-plugged-e235b615-3ab0-49d4-9c0d-a4d905192bd6 for instance with vm_state active and task_state None.
Jan 26 08:59:46 compute-1 podman[219618]: 2026-01-26 08:59:46.811319081 +0000 UTC m=+0.066589412 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 08:59:46 compute-1 podman[219617]: 2026-01-26 08:59:46.821692721 +0000 UTC m=+0.082803898 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 26 08:59:46 compute-1 podman[219616]: 2026-01-26 08:59:46.839066117 +0000 UTC m=+0.101740713 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 26 08:59:46 compute-1 nova_compute[183083]: 2026-01-26 08:59:46.968 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:48 compute-1 nova_compute[183083]: 2026-01-26 08:59:48.253 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:48 compute-1 sshd-session[219683]: Accepted publickey for zuul from 38.102.83.66 port 59518 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 08:59:48 compute-1 systemd-logind[788]: New session 42 of user zuul.
Jan 26 08:59:48 compute-1 systemd[1]: Started Session 42 of User zuul.
Jan 26 08:59:48 compute-1 sshd-session[219683]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 08:59:48 compute-1 sshd-session[219687]: Accepted publickey for zuul from 38.102.83.66 port 59522 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 08:59:48 compute-1 systemd-logind[788]: New session 43 of user zuul.
Jan 26 08:59:48 compute-1 systemd[1]: Started Session 43 of User zuul.
Jan 26 08:59:48 compute-1 sshd-session[219687]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 08:59:49 compute-1 sudo[219691]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/mktemp
Jan 26 08:59:49 compute-1 sudo[219691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:59:49 compute-1 sudo[219691]: pam_unix(sudo:session): session closed for user root
Jan 26 08:59:49 compute-1 sshd-session[219690]: Connection closed by 38.102.83.66 port 59522
Jan 26 08:59:49 compute-1 sudo[219716]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 tcpdump -s0 -Uni eth1 icmp and ether host fa:16:3e:51:7c:13 -w /tmp/tmp.CpSDd7oLQ5
Jan 26 08:59:49 compute-1 sshd-session[219687]: pam_unix(sshd:session): session closed for user zuul
Jan 26 08:59:49 compute-1 sudo[219716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 08:59:49 compute-1 systemd[1]: session-43.scope: Deactivated successfully.
Jan 26 08:59:49 compute-1 systemd-logind[788]: Session 43 logged out. Waiting for processes to exit.
Jan 26 08:59:49 compute-1 systemd-logind[788]: Removed session 43.
Jan 26 08:59:52 compute-1 nova_compute[183083]: 2026-01-26 08:59:52.043 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:53 compute-1 nova_compute[183083]: 2026-01-26 08:59:53.257 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:56 compute-1 ovn_controller[95352]: 2026-01-26T08:59:56Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f0:4e:56 10.100.0.5
Jan 26 08:59:56 compute-1 ovn_controller[95352]: 2026-01-26T08:59:56Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f0:4e:56 10.100.0.5
Jan 26 08:59:57 compute-1 nova_compute[183083]: 2026-01-26 08:59:57.046 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:58 compute-1 nova_compute[183083]: 2026-01-26 08:59:58.261 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 08:59:58 compute-1 podman[219755]: 2026-01-26 08:59:58.818480376 +0000 UTC m=+0.072572181 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 26 09:00:02 compute-1 nova_compute[183083]: 2026-01-26 09:00:02.049 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:00:03 compute-1 nova_compute[183083]: 2026-01-26 09:00:03.264 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.747 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'name': 'tempest-server-test-78344759', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000002e', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '2580bb16c90849c4b5919eb271774a06', 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'hostId': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.748 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.781 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/cpu volume: 10440000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '69f2c6f0-ab60-4e8f-a6a1-b46fc3eacf58', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10440000000, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'timestamp': '2026-01-26T09:00:03.749242', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '67406354-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4344.443329154, 'message_signature': '182466cbb5b00ea3e36fcf9346ce773c7b04311b23968d01968e82d8fa6be1f0'}]}, 'timestamp': '2026-01-26 09:00:03.783646', '_unique_id': '3ed0b1292fbb4cd68e68b5704a9889ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.789 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.791 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.794 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for a89a5221-3253-49f6-b902-67f973b0690e / tape235b615-3a inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.794 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '284c2f31-8194-46c4-a4fa-9ccd200a9337', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-0000002e-a89a5221-3253-49f6-b902-67f973b0690e-tape235b615-3a', 'timestamp': '2026-01-26T09:00:03.791245', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'tape235b615-3a', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:4e:56', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape235b615-3a'}, 'message_id': '67423436-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4344.453290394, 'message_signature': 'b3b1f9fe16cd3013b74394fc35398130788a80a846f7891e8a86960d903187f6'}]}, 'timestamp': '2026-01-26 09:00:03.795125', '_unique_id': '575d15a8fecc41db9631f096830b8394'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.796 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.797 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.837 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.read.requests volume: 1105 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.837 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '173d23a9-25fa-41a8-9d44-1191d05bcca4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1105, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'a89a5221-3253-49f6-b902-67f973b0690e-vda', 'timestamp': '2026-01-26T09:00:03.797551', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6748bdd8-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4344.459649334, 'message_signature': 'cf97746ac75ade94f8a804cfafb6ce33c982feed01c5f99645240a44d590e5fc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'a89a5221-3253-49f6-b902-67f973b0690e-sda', 'timestamp': '2026-01-26T09:00:03.797551', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6748cc1a-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4344.459649334, 'message_signature': '5f0caabd2b11c8e72d0a8f1843a918e0df5f36be3c9a7d895b852ca861292579'}]}, 'timestamp': '2026-01-26 09:00:03.838232', '_unique_id': 'fd3482a85354447b80fa4fd6ceb688e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.839 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.840 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.853 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.853 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a18719a7-53c4-4272-a041-80f272d7c0d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'a89a5221-3253-49f6-b902-67f973b0690e-vda', 'timestamp': '2026-01-26T09:00:03.840398', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '674b279e-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4344.502413236, 'message_signature': 'e0933683ae545d620f3f8e505c176b35d3edf5332a00b1b31365092f6cfcd9b7'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'a89a5221-3253-49f6-b902-67f973b0690e-sda', 'timestamp': '2026-01-26T09:00:03.840398', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '674b34fa-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4344.502413236, 'message_signature': 'c004363f429239b1c308341a7c897d07850eb4756becd1faea5537d52fad1c84'}]}, 'timestamp': '2026-01-26 09:00:03.853992', '_unique_id': '03e8617b39064d74a64e4cab5191337b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.855 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.857 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.857 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.write.requests volume: 303 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.858 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ff40107c-aa3a-41ab-b2cd-fc7bd9a4a366', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 303, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'a89a5221-3253-49f6-b902-67f973b0690e-vda', 'timestamp': '2026-01-26T09:00:03.857601', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '674bd450-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4344.459649334, 'message_signature': 'd4b3e689f636f81601967e197645a92e5e22058424f94e0c02de3f7e9ae74b2e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'a89a5221-3253-49f6-b902-67f973b0690e-sda', 'timestamp': '2026-01-26T09:00:03.857601', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '674be54e-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4344.459649334, 'message_signature': '3da8655515661bb89d1fd18368a4e8a4c42cbed4cd7d0abdb0f6290b5528b3ae'}]}, 'timestamp': '2026-01-26 09:00:03.858587', '_unique_id': '2571a03c3bd94c47b9b3afbcc2fc9a69'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.860 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/network.outgoing.bytes volume: 1480 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8da1bfd8-6975-40fe-bcf5-d579736a1725', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1480, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-0000002e-a89a5221-3253-49f6-b902-67f973b0690e-tape235b615-3a', 'timestamp': '2026-01-26T09:00:03.860858', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'tape235b615-3a', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:4e:56', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape235b615-3a'}, 'message_id': '674c4d86-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4344.453290394, 'message_signature': 'c6bd69b66695470a7656202a801006e3a6d750bdb78169a7d70b32470689bb85'}]}, 'timestamp': '2026-01-26 09:00:03.861182', '_unique_id': 'e132f1bb90f3425686908a4803a1f377'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.861 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.862 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.862 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d02e509-bc61-44dc-80f0-d3752119e345', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-0000002e-a89a5221-3253-49f6-b902-67f973b0690e-tape235b615-3a', 'timestamp': '2026-01-26T09:00:03.862481', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'tape235b615-3a', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:4e:56', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape235b615-3a'}, 'message_id': '674c8c9c-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4344.453290394, 'message_signature': '887181078ddada1984f7da8cbfc6615423e2e4b3b93255dbd57340762ac7be81'}]}, 'timestamp': '2026-01-26 09:00:03.862771', '_unique_id': '0f582dd2839b419a801830c0a24ad283'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.863 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/network.outgoing.packets volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'be709614-feda-4327-8010-053912fcad65', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-0000002e-a89a5221-3253-49f6-b902-67f973b0690e-tape235b615-3a', 'timestamp': '2026-01-26T09:00:03.863954', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'tape235b615-3a', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:4e:56', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape235b615-3a'}, 'message_id': '674cc540-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4344.453290394, 'message_signature': '0c9d91d8ffe0be0a37a694e259d2100d5dd419f3b594b71ce1c83c38e85604a4'}]}, 'timestamp': '2026-01-26 09:00:03.864251', '_unique_id': 'ca31c7d522a145bb86b7e187876c92df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.864 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.865 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.865 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.write.bytes volume: 72904704 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.865 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '22a86ce7-7745-49bc-81c5-a374655a2134', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72904704, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'a89a5221-3253-49f6-b902-67f973b0690e-vda', 'timestamp': '2026-01-26T09:00:03.865653', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '674d08a2-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4344.459649334, 'message_signature': '7a912417f78793a76305bf7b73a5f040bc0a1b7d644bb8e51babfd801867c3f8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'a89a5221-3253-49f6-b902-67f973b0690e-sda', 'timestamp': '2026-01-26T09:00:03.865653', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '674d1464-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4344.459649334, 'message_signature': '216e0cfc7eb66ed87e74153cab4a89c41f0cb85de3a2755fe5ab0b42ea8a6667'}]}, 'timestamp': '2026-01-26 09:00:03.866221', '_unique_id': '38e47db448a842b8a2fe5a7ddbe57126'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.866 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.867 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.867 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ff89ae55-39a8-4a90-ba21-05b6c653210f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-0000002e-a89a5221-3253-49f6-b902-67f973b0690e-tape235b615-3a', 'timestamp': '2026-01-26T09:00:03.867502', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'tape235b615-3a', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:4e:56', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape235b615-3a'}, 'message_id': '674d4f24-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4344.453290394, 'message_signature': 'b951cdb57203b4a430ae99bf00847a664f056811a7f15b0a8ae1a1d48f830f25'}]}, 'timestamp': '2026-01-26 09:00:03.867748', '_unique_id': 'cc782b4c325d497f99acf35504b706fd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.868 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.869 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.write.latency volume: 2651582562 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.869 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f782b951-0571-4103-9bd2-b9b9e04db5ad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2651582562, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'a89a5221-3253-49f6-b902-67f973b0690e-vda', 'timestamp': '2026-01-26T09:00:03.869041', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '674d8ce6-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4344.459649334, 'message_signature': '4405d30d9a4c0673c54c3bd499807610d818efaadb2a5a632d7b86c1fcc0ecc9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 
'a89a5221-3253-49f6-b902-67f973b0690e-sda', 'timestamp': '2026-01-26T09:00:03.869041', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '674d9524-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4344.459649334, 'message_signature': '52df4c7f9f8fa48d0f87eb9b7de59de1235fca5e975bf9a98651f826de63e565'}]}, 'timestamp': '2026-01-26 09:00:03.869516', '_unique_id': 'f7c2a097af6e4adf8461605ff83acce1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.read.latency volume: 223797246 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.870 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.read.latency volume: 28287898 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e11d8fd-8308-4756-8cf9-fab790a5fa13', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 223797246, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'a89a5221-3253-49f6-b902-67f973b0690e-vda', 'timestamp': '2026-01-26T09:00:03.870767', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '674dce72-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4344.459649334, 'message_signature': '2221cd474bd3b2908799d0aa57bb1e87cad037a1f98e70cf53217c1081ea872d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28287898, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 
'a89a5221-3253-49f6-b902-67f973b0690e-sda', 'timestamp': '2026-01-26T09:00:03.870767', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '674dd73c-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4344.459649334, 'message_signature': '2461446d40e5dc3e56621c9f3a5db4e66f6af89db0abc8f80aa7b19adc8eef90'}]}, 'timestamp': '2026-01-26 09:00:03.871205', '_unique_id': 'ec5a6f76eda84008851c147960de7447'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.871 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.872 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.872 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.read.bytes volume: 30525952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.872 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd810c34b-eed2-470c-a8d1-2110f9e6faae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30525952, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'a89a5221-3253-49f6-b902-67f973b0690e-vda', 'timestamp': '2026-01-26T09:00:03.872333', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '674e0b80-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4344.459649334, 'message_signature': 'b3ab7ff962a3b434a3bb2e0133b2aff4d91c80cdc6acd6757b5f5ea9264e416a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'a89a5221-3253-49f6-b902-67f973b0690e-sda', 'timestamp': '2026-01-26T09:00:03.872333', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '674e1332-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4344.459649334, 'message_signature': '2985c341f096b8fda9258382468a2e775ad086ed596c9db7a0ee0b82697bfccc'}]}, 'timestamp': '2026-01-26 09:00:03.872749', '_unique_id': 'bc58b40a1de840acbf7bc7d7ebdd9d88'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.873 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/network.incoming.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e8eebeb-f10a-4855-a81d-7ba1982740d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-0000002e-a89a5221-3253-49f6-b902-67f973b0690e-tape235b615-3a', 'timestamp': '2026-01-26T09:00:03.873847', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'tape235b615-3a', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:4e:56', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape235b615-3a'}, 'message_id': '674e46c2-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4344.453290394, 'message_signature': 'e9e23d35d748289f8862d004182adb186c4007312c2b3d5165a495d804228a02'}]}, 'timestamp': '2026-01-26 09:00:03.874092', '_unique_id': '7f60159e37fa43a4ad3dfb253c6f3d7f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.874 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.875 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.875 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.875 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-78344759>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-78344759>]
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.875 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.875 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.875 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7bee196-2778-486a-87f8-3f523ad77ee9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'a89a5221-3253-49f6-b902-67f973b0690e-vda', 'timestamp': '2026-01-26T09:00:03.875608', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '674e8bdc-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4344.502413236, 'message_signature': '70d9aec51c0040f28337017cae6e3c5b1a6645bd939746953a14ad4bb08602dc'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 
'a89a5221-3253-49f6-b902-67f973b0690e-sda', 'timestamp': '2026-01-26T09:00:03.875608', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '674e9406-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4344.502413236, 'message_signature': 'b748bef8b31cf48e0bc165c0df6fc58edaa0f6d8c3768be6a8835f8a5db1ca8a'}]}, 'timestamp': '2026-01-26 09:00:03.876038', '_unique_id': '43207fdbe8c245a39b0d18e88e0f7288'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.876 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.877 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.877 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.877 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-78344759>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-78344759>]
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.877 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.877 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.877 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '56dff72e-d5fd-4824-ab3f-6c2c1edccb3f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'a89a5221-3253-49f6-b902-67f973b0690e-vda', 'timestamp': '2026-01-26T09:00:03.877471', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '674ed40c-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4344.502413236, 'message_signature': '4be9370eb42867544980f760d54863fa30992ae847869577dd62068d7d40494c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 
'a89a5221-3253-49f6-b902-67f973b0690e-sda', 'timestamp': '2026-01-26T09:00:03.877471', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '674edc54-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4344.502413236, 'message_signature': 'b8f18fab8f06fc813c8dbed340d062a30ad33e824230d324d1319f3e48c72b16'}]}, 'timestamp': '2026-01-26 09:00:03.877888', '_unique_id': '50b241413d494723b0a4a3e41cb4669b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.878 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '522a94df-605d-449e-877e-d91d31045bba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-0000002e-a89a5221-3253-49f6-b902-67f973b0690e-tape235b615-3a', 'timestamp': '2026-01-26T09:00:03.879006', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'tape235b615-3a', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:4e:56', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape235b615-3a'}, 'message_id': '674f111a-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4344.453290394, 'message_signature': '354a29816a82901f78b42374238b6c7d61c801e4a668924aae66aeb1ae02d4ee'}]}, 'timestamp': '2026-01-26 09:00:03.879254', '_unique_id': '74e3fc7e4cf04885bbd0e99cc93e0700'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.879 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.880 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.880 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.880 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-78344759>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-78344759>]
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.880 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.880 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/memory.usage volume: 46.5703125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14f06419-3807-4056-ab91-24c617b0f2fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 46.5703125, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'timestamp': '2026-01-26T09:00:03.880619', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '674f4fd6-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4344.443329154, 'message_signature': '2ad4645479f651d20cfa7b0a63b539559f9d5699bff22a1700667695e19bad69'}]}, 'timestamp': '2026-01-26 09:00:03.880854', '_unique_id': '621bf5b8d5f049fd9fd12868117115fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.881 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3087b318-e160-43d4-a6ae-0af36691e42f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-0000002e-a89a5221-3253-49f6-b902-67f973b0690e-tape235b615-3a', 'timestamp': '2026-01-26T09:00:03.881955', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'tape235b615-3a', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:4e:56', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape235b615-3a'}, 'message_id': '674f84ce-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4344.453290394, 'message_signature': 'dee69c10f7b7d9fd3e7fbff0f57a7285df91f982ebd189714fb46cd595e96a13'}]}, 'timestamp': '2026-01-26 09:00:03.882216', '_unique_id': '5eb1fa3b2d8c4a578d119c1bfbb7c511'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.882 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.883 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.883 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.883 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-78344759>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-78344759>]
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.883 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.883 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/network.incoming.bytes volume: 1558 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aab524b6-5574-447e-aac4-fe7762756282', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1558, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-0000002e-a89a5221-3253-49f6-b902-67f973b0690e-tape235b615-3a', 'timestamp': '2026-01-26T09:00:03.883561', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'tape235b615-3a', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:4e:56', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape235b615-3a'}, 'message_id': '674fc1e6-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4344.453290394, 'message_signature': 'a61d1b8f30cb7b1f999c43063312045e96ea8d226f3a3b0c7e980d9a8f01c662'}]}, 'timestamp': '2026-01-26 09:00:03.883779', '_unique_id': '5b02a32ce22947ac87a1a39f8390b070'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.884 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b3c8946b-0e32-4d23-acad-635703e0e253', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-0000002e-a89a5221-3253-49f6-b902-67f973b0690e-tape235b615-3a', 'timestamp': '2026-01-26T09:00:03.884820', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'tape235b615-3a', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:4e:56', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape235b615-3a'}, 'message_id': '674ff33c-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4344.453290394, 'message_signature': 'c2f4e04b285e3d6980324952519a4411082e9181ed2fe911707a840c64e1a143'}]}, 'timestamp': '2026-01-26 09:00:03.885041', '_unique_id': '510026481a664eacafbc14dcefbaa19b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:00:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:00:03.885 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:00:04 compute-1 ovn_controller[95352]: 2026-01-26T09:00:04Z|00244|pinctrl|WARN|Dropped 337 log messages in last 60 seconds (most recently, 6 seconds ago) due to excessive rate
Jan 26 09:00:04 compute-1 ovn_controller[95352]: 2026-01-26T09:00:04Z|00245|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:00:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:00:05.314 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:00:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:00:05.315 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:00:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:00:05.316 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:00:07 compute-1 nova_compute[183083]: 2026-01-26 09:00:07.052 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:00:08 compute-1 nova_compute[183083]: 2026-01-26 09:00:08.267 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:00:09 compute-1 nova_compute[183083]: 2026-01-26 09:00:09.492 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:00:09 compute-1 nova_compute[183083]: 2026-01-26 09:00:09.493 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:00:09 compute-1 nova_compute[183083]: 2026-01-26 09:00:09.494 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:00:09 compute-1 nova_compute[183083]: 2026-01-26 09:00:09.701 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "refresh_cache-a89a5221-3253-49f6-b902-67f973b0690e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:00:09 compute-1 nova_compute[183083]: 2026-01-26 09:00:09.702 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquired lock "refresh_cache-a89a5221-3253-49f6-b902-67f973b0690e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:00:09 compute-1 nova_compute[183083]: 2026-01-26 09:00:09.702 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 26 09:00:09 compute-1 nova_compute[183083]: 2026-01-26 09:00:09.702 183087 DEBUG nova.objects.instance [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lazy-loading 'info_cache' on Instance uuid a89a5221-3253-49f6-b902-67f973b0690e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 09:00:10 compute-1 nova_compute[183083]: 2026-01-26 09:00:10.972 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Updating instance_info_cache with network_info: [{"id": "e235b615-3ab0-49d4-9c0d-a4d905192bd6", "address": "fa:16:3e:f0:4e:56", "network": {"id": "902c250a-6b5f-40de-85f8-6172556f9918", "bridge": "br-int", "label": "tempest-test-network--2054667150", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape235b615-3a", "ovs_interfaceid": "e235b615-3ab0-49d4-9c0d-a4d905192bd6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:00:10 compute-1 nova_compute[183083]: 2026-01-26 09:00:10.998 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Releasing lock "refresh_cache-a89a5221-3253-49f6-b902-67f973b0690e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:00:10 compute-1 nova_compute[183083]: 2026-01-26 09:00:10.998 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 26 09:00:11 compute-1 nova_compute[183083]: 2026-01-26 09:00:11.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:00:11 compute-1 nova_compute[183083]: 2026-01-26 09:00:11.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:00:12 compute-1 nova_compute[183083]: 2026-01-26 09:00:12.071 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:00:13 compute-1 sshd-session[219780]: Accepted publickey for zuul from 38.102.83.66 port 37816 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:00:13 compute-1 systemd-logind[788]: New session 44 of user zuul.
Jan 26 09:00:13 compute-1 systemd[1]: Started Session 44 of User zuul.
Jan 26 09:00:13 compute-1 sshd-session[219780]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:00:13 compute-1 podman[219782]: 2026-01-26 09:00:13.137637884 +0000 UTC m=+0.053469342 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 09:00:13 compute-1 podman[219784]: 2026-01-26 09:00:13.151118962 +0000 UTC m=+0.060435017 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., version=9.6, managed_by=edpm_ansible, release=1755695350, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 26 09:00:13 compute-1 sudo[219823]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 cat /tmp/tmp.CpSDd7oLQ5
Jan 26 09:00:13 compute-1 sudo[219823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:00:13 compute-1 sudo[219823]: pam_unix(sudo:session): session closed for user root
Jan 26 09:00:13 compute-1 nova_compute[183083]: 2026-01-26 09:00:13.273 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:00:13 compute-1 nova_compute[183083]: 2026-01-26 09:00:13.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:00:13 compute-1 nova_compute[183083]: 2026-01-26 09:00:13.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:00:14 compute-1 ovn_controller[95352]: 2026-01-26T09:00:14Z|00246|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Jan 26 09:00:14 compute-1 nova_compute[183083]: 2026-01-26 09:00:14.947 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:00:14 compute-1 nova_compute[183083]: 2026-01-26 09:00:14.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:00:15 compute-1 nova_compute[183083]: 2026-01-26 09:00:15.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:00:15 compute-1 nova_compute[183083]: 2026-01-26 09:00:15.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:00:16 compute-1 nova_compute[183083]: 2026-01-26 09:00:16.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:00:16 compute-1 nova_compute[183083]: 2026-01-26 09:00:16.978 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:00:16 compute-1 nova_compute[183083]: 2026-01-26 09:00:16.979 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:00:16 compute-1 nova_compute[183083]: 2026-01-26 09:00:16.979 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:00:16 compute-1 nova_compute[183083]: 2026-01-26 09:00:16.980 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:00:17 compute-1 nova_compute[183083]: 2026-01-26 09:00:17.058 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a89a5221-3253-49f6-b902-67f973b0690e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:00:17 compute-1 nova_compute[183083]: 2026-01-26 09:00:17.076 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:00:17 compute-1 podman[219854]: 2026-01-26 09:00:17.101961865 +0000 UTC m=+0.060938139 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 26 09:00:17 compute-1 nova_compute[183083]: 2026-01-26 09:00:17.118 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a89a5221-3253-49f6-b902-67f973b0690e/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:00:17 compute-1 nova_compute[183083]: 2026-01-26 09:00:17.119 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a89a5221-3253-49f6-b902-67f973b0690e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:00:17 compute-1 podman[219853]: 2026-01-26 09:00:17.131573998 +0000 UTC m=+0.090188993 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:00:17 compute-1 podman[219855]: 2026-01-26 09:00:17.155095288 +0000 UTC m=+0.100068631 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 09:00:17 compute-1 nova_compute[183083]: 2026-01-26 09:00:17.173 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a89a5221-3253-49f6-b902-67f973b0690e/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:00:17 compute-1 nova_compute[183083]: 2026-01-26 09:00:17.323 183087 DEBUG nova.compute.manager [req-c665d66d-0e8e-4140-9cdf-81fb463ad14c req-158353f3-ed6d-4df3-8b37-e26a4f709593 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Received event network-changed-e235b615-3ab0-49d4-9c0d-a4d905192bd6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:00:17 compute-1 nova_compute[183083]: 2026-01-26 09:00:17.324 183087 DEBUG nova.compute.manager [req-c665d66d-0e8e-4140-9cdf-81fb463ad14c req-158353f3-ed6d-4df3-8b37-e26a4f709593 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Refreshing instance network info cache due to event network-changed-e235b615-3ab0-49d4-9c0d-a4d905192bd6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 09:00:17 compute-1 nova_compute[183083]: 2026-01-26 09:00:17.324 183087 DEBUG oslo_concurrency.lockutils [req-c665d66d-0e8e-4140-9cdf-81fb463ad14c req-158353f3-ed6d-4df3-8b37-e26a4f709593 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-a89a5221-3253-49f6-b902-67f973b0690e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:00:17 compute-1 nova_compute[183083]: 2026-01-26 09:00:17.325 183087 DEBUG oslo_concurrency.lockutils [req-c665d66d-0e8e-4140-9cdf-81fb463ad14c req-158353f3-ed6d-4df3-8b37-e26a4f709593 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-a89a5221-3253-49f6-b902-67f973b0690e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:00:17 compute-1 nova_compute[183083]: 2026-01-26 09:00:17.326 183087 DEBUG nova.network.neutron [req-c665d66d-0e8e-4140-9cdf-81fb463ad14c req-158353f3-ed6d-4df3-8b37-e26a4f709593 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Refreshing network info cache for port e235b615-3ab0-49d4-9c0d-a4d905192bd6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 09:00:17 compute-1 nova_compute[183083]: 2026-01-26 09:00:17.358 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:00:17 compute-1 nova_compute[183083]: 2026-01-26 09:00:17.359 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13596MB free_disk=113.06503295898438GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:00:17 compute-1 nova_compute[183083]: 2026-01-26 09:00:17.360 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:00:17 compute-1 nova_compute[183083]: 2026-01-26 09:00:17.360 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:00:17 compute-1 nova_compute[183083]: 2026-01-26 09:00:17.439 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Instance a89a5221-3253-49f6-b902-67f973b0690e actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 09:00:17 compute-1 nova_compute[183083]: 2026-01-26 09:00:17.439 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:00:17 compute-1 nova_compute[183083]: 2026-01-26 09:00:17.440 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=640MB phys_disk=119GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:00:17 compute-1 nova_compute[183083]: 2026-01-26 09:00:17.459 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Refreshing inventories for resource provider 5203935e-446c-4e03-93fa-4c60d651e045 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 26 09:00:17 compute-1 nova_compute[183083]: 2026-01-26 09:00:17.487 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Updating ProviderTree inventory for provider 5203935e-446c-4e03-93fa-4c60d651e045 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 26 09:00:17 compute-1 nova_compute[183083]: 2026-01-26 09:00:17.488 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Updating inventory in ProviderTree for provider 5203935e-446c-4e03-93fa-4c60d651e045 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 26 09:00:17 compute-1 nova_compute[183083]: 2026-01-26 09:00:17.507 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Refreshing aggregate associations for resource provider 5203935e-446c-4e03-93fa-4c60d651e045, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 26 09:00:17 compute-1 nova_compute[183083]: 2026-01-26 09:00:17.526 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Refreshing trait associations for resource provider 5203935e-446c-4e03-93fa-4c60d651e045, traits: COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE4A,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SVM,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AVX,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX2,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_FMA3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,HW_CPU_X86_SSE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE41,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_BMI,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AESNI,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 26 09:00:17 compute-1 nova_compute[183083]: 2026-01-26 09:00:17.569 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:00:17 compute-1 nova_compute[183083]: 2026-01-26 09:00:17.599 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:00:17 compute-1 nova_compute[183083]: 2026-01-26 09:00:17.627 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:00:17 compute-1 nova_compute[183083]: 2026-01-26 09:00:17.628 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.268s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:00:18 compute-1 nova_compute[183083]: 2026-01-26 09:00:18.277 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:00:18 compute-1 nova_compute[183083]: 2026-01-26 09:00:18.920 183087 DEBUG nova.network.neutron [req-c665d66d-0e8e-4140-9cdf-81fb463ad14c req-158353f3-ed6d-4df3-8b37-e26a4f709593 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Updated VIF entry in instance network info cache for port e235b615-3ab0-49d4-9c0d-a4d905192bd6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 09:00:18 compute-1 nova_compute[183083]: 2026-01-26 09:00:18.921 183087 DEBUG nova.network.neutron [req-c665d66d-0e8e-4140-9cdf-81fb463ad14c req-158353f3-ed6d-4df3-8b37-e26a4f709593 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Updating instance_info_cache with network_info: [{"id": "e235b615-3ab0-49d4-9c0d-a4d905192bd6", "address": "fa:16:3e:f0:4e:56", "network": {"id": "902c250a-6b5f-40de-85f8-6172556f9918", "bridge": "br-int", "label": "tempest-test-network--2054667150", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape235b615-3a", "ovs_interfaceid": "e235b615-3ab0-49d4-9c0d-a4d905192bd6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:00:18 compute-1 nova_compute[183083]: 2026-01-26 09:00:18.942 183087 DEBUG oslo_concurrency.lockutils [req-c665d66d-0e8e-4140-9cdf-81fb463ad14c req-158353f3-ed6d-4df3-8b37-e26a4f709593 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-a89a5221-3253-49f6-b902-67f973b0690e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:00:22 compute-1 nova_compute[183083]: 2026-01-26 09:00:22.077 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:00:23 compute-1 sshd-session[219925]: Accepted publickey for zuul from 38.102.83.66 port 42392 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:00:23 compute-1 systemd-logind[788]: New session 45 of user zuul.
Jan 26 09:00:23 compute-1 nova_compute[183083]: 2026-01-26 09:00:23.280 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:00:23 compute-1 systemd[1]: Started Session 45 of User zuul.
Jan 26 09:00:23 compute-1 sshd-session[219925]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:00:23 compute-1 sshd-session[219929]: Accepted publickey for zuul from 38.102.83.66 port 42408 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:00:23 compute-1 systemd-logind[788]: New session 46 of user zuul.
Jan 26 09:00:23 compute-1 systemd[1]: Started Session 46 of User zuul.
Jan 26 09:00:23 compute-1 sshd-session[219929]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:00:23 compute-1 sudo[219933]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/mktemp
Jan 26 09:00:23 compute-1 sudo[219933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:00:23 compute-1 sudo[219933]: pam_unix(sudo:session): session closed for user root
Jan 26 09:00:23 compute-1 sudo[219958]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 tcpdump -s0 -Uni eth1 icmp and ether host fa:16:3e:dd:3b:46 -w /tmp/tmp.2f1GEFjLA7
Jan 26 09:00:23 compute-1 sudo[219958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:00:23 compute-1 sshd-session[219932]: Connection closed by 38.102.83.66 port 42408
Jan 26 09:00:23 compute-1 sshd-session[219929]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:00:23 compute-1 systemd[1]: session-46.scope: Deactivated successfully.
Jan 26 09:00:23 compute-1 systemd-logind[788]: Session 46 logged out. Waiting for processes to exit.
Jan 26 09:00:23 compute-1 systemd-logind[788]: Removed session 46.
Jan 26 09:00:27 compute-1 nova_compute[183083]: 2026-01-26 09:00:27.079 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:00:28 compute-1 nova_compute[183083]: 2026-01-26 09:00:28.283 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:00:29 compute-1 podman[219984]: 2026-01-26 09:00:29.830006403 +0000 UTC m=+0.090703546 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 26 09:00:32 compute-1 nova_compute[183083]: 2026-01-26 09:00:32.081 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:00:32 compute-1 sshd-session[220009]: Accepted publickey for zuul from 38.102.83.66 port 42482 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:00:32 compute-1 systemd-logind[788]: New session 47 of user zuul.
Jan 26 09:00:32 compute-1 systemd[1]: Started Session 47 of User zuul.
Jan 26 09:00:32 compute-1 sshd-session[220009]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:00:32 compute-1 sudo[220013]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 cat /tmp/tmp.2f1GEFjLA7
Jan 26 09:00:32 compute-1 sudo[220013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:00:32 compute-1 sudo[220013]: pam_unix(sudo:session): session closed for user root
Jan 26 09:00:33 compute-1 nova_compute[183083]: 2026-01-26 09:00:33.286 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:00:33 compute-1 nova_compute[183083]: 2026-01-26 09:00:33.997 183087 DEBUG nova.compute.manager [req-99ff5ba8-4e3b-4d97-bf90-94cf62d79d25 req-009ac2d0-ca94-4508-b105-0bb872bc9620 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Received event network-changed-e235b615-3ab0-49d4-9c0d-a4d905192bd6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:00:33 compute-1 nova_compute[183083]: 2026-01-26 09:00:33.998 183087 DEBUG nova.compute.manager [req-99ff5ba8-4e3b-4d97-bf90-94cf62d79d25 req-009ac2d0-ca94-4508-b105-0bb872bc9620 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Refreshing instance network info cache due to event network-changed-e235b615-3ab0-49d4-9c0d-a4d905192bd6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 09:00:33 compute-1 nova_compute[183083]: 2026-01-26 09:00:33.998 183087 DEBUG oslo_concurrency.lockutils [req-99ff5ba8-4e3b-4d97-bf90-94cf62d79d25 req-009ac2d0-ca94-4508-b105-0bb872bc9620 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-a89a5221-3253-49f6-b902-67f973b0690e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:00:33 compute-1 nova_compute[183083]: 2026-01-26 09:00:33.998 183087 DEBUG oslo_concurrency.lockutils [req-99ff5ba8-4e3b-4d97-bf90-94cf62d79d25 req-009ac2d0-ca94-4508-b105-0bb872bc9620 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-a89a5221-3253-49f6-b902-67f973b0690e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:00:33 compute-1 nova_compute[183083]: 2026-01-26 09:00:33.999 183087 DEBUG nova.network.neutron [req-99ff5ba8-4e3b-4d97-bf90-94cf62d79d25 req-009ac2d0-ca94-4508-b105-0bb872bc9620 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Refreshing network info cache for port e235b615-3ab0-49d4-9c0d-a4d905192bd6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 09:00:35 compute-1 nova_compute[183083]: 2026-01-26 09:00:35.357 183087 DEBUG nova.network.neutron [req-99ff5ba8-4e3b-4d97-bf90-94cf62d79d25 req-009ac2d0-ca94-4508-b105-0bb872bc9620 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Updated VIF entry in instance network info cache for port e235b615-3ab0-49d4-9c0d-a4d905192bd6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 09:00:35 compute-1 nova_compute[183083]: 2026-01-26 09:00:35.358 183087 DEBUG nova.network.neutron [req-99ff5ba8-4e3b-4d97-bf90-94cf62d79d25 req-009ac2d0-ca94-4508-b105-0bb872bc9620 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Updating instance_info_cache with network_info: [{"id": "e235b615-3ab0-49d4-9c0d-a4d905192bd6", "address": "fa:16:3e:f0:4e:56", "network": {"id": "902c250a-6b5f-40de-85f8-6172556f9918", "bridge": "br-int", "label": "tempest-test-network--2054667150", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape235b615-3a", "ovs_interfaceid": "e235b615-3ab0-49d4-9c0d-a4d905192bd6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:00:35 compute-1 nova_compute[183083]: 2026-01-26 09:00:35.386 183087 DEBUG oslo_concurrency.lockutils [req-99ff5ba8-4e3b-4d97-bf90-94cf62d79d25 req-009ac2d0-ca94-4508-b105-0bb872bc9620 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-a89a5221-3253-49f6-b902-67f973b0690e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:00:37 compute-1 nova_compute[183083]: 2026-01-26 09:00:37.084 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:00:38 compute-1 nova_compute[183083]: 2026-01-26 09:00:38.288 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:00:39 compute-1 sshd-session[220039]: Accepted publickey for zuul from 38.102.83.66 port 40004 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:00:39 compute-1 systemd-logind[788]: New session 48 of user zuul.
Jan 26 09:00:39 compute-1 systemd[1]: Started Session 48 of User zuul.
Jan 26 09:00:39 compute-1 sshd-session[220039]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:00:39 compute-1 sshd-session[220043]: Accepted publickey for zuul from 38.102.83.66 port 40020 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:00:39 compute-1 systemd-logind[788]: New session 49 of user zuul.
Jan 26 09:00:39 compute-1 systemd[1]: Started Session 49 of User zuul.
Jan 26 09:00:39 compute-1 sshd-session[220043]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:00:39 compute-1 sudo[220047]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/mktemp
Jan 26 09:00:39 compute-1 sudo[220047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:00:39 compute-1 sudo[220047]: pam_unix(sudo:session): session closed for user root
Jan 26 09:00:39 compute-1 sudo[220072]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 tcpdump -s0 -Uni eth1 icmp and ether host fa:16:3e:51:7c:13 -w /tmp/tmp.X9ASNz5jXN
Jan 26 09:00:39 compute-1 sudo[220072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:00:39 compute-1 sshd-session[220046]: Connection closed by 38.102.83.66 port 40020
Jan 26 09:00:39 compute-1 sshd-session[220043]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:00:39 compute-1 systemd[1]: session-49.scope: Deactivated successfully.
Jan 26 09:00:39 compute-1 systemd-logind[788]: Session 49 logged out. Waiting for processes to exit.
Jan 26 09:00:39 compute-1 systemd-logind[788]: Removed session 49.
Jan 26 09:00:42 compute-1 nova_compute[183083]: 2026-01-26 09:00:42.137 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:00:43 compute-1 nova_compute[183083]: 2026-01-26 09:00:43.292 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:00:43 compute-1 podman[220098]: 2026-01-26 09:00:43.807144946 +0000 UTC m=+0.070247633 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:00:43 compute-1 podman[220099]: 2026-01-26 09:00:43.812211973 +0000 UTC m=+0.072457188 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-type=git, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7)
Jan 26 09:00:47 compute-1 nova_compute[183083]: 2026-01-26 09:00:47.517 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:00:47 compute-1 podman[220145]: 2026-01-26 09:00:47.80072571 +0000 UTC m=+0.053946124 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 09:00:47 compute-1 podman[220139]: 2026-01-26 09:00:47.826391744 +0000 UTC m=+0.077994787 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 26 09:00:47 compute-1 podman[220138]: 2026-01-26 09:00:47.832942198 +0000 UTC m=+0.099242920 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:00:48 compute-1 nova_compute[183083]: 2026-01-26 09:00:48.295 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:00:48 compute-1 sshd-session[220205]: Accepted publickey for zuul from 38.102.83.66 port 51180 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:00:48 compute-1 systemd-logind[788]: New session 50 of user zuul.
Jan 26 09:00:48 compute-1 systemd[1]: Started Session 50 of User zuul.
Jan 26 09:00:48 compute-1 sshd-session[220205]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:00:49 compute-1 sudo[220209]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 cat /tmp/tmp.X9ASNz5jXN
Jan 26 09:00:49 compute-1 sudo[220209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:00:49 compute-1 sudo[220209]: pam_unix(sudo:session): session closed for user root
Jan 26 09:00:51 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:00:51.695 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:00:51 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:00:51.696 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:00:51 compute-1 nova_compute[183083]: 2026-01-26 09:00:51.696 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:00:52 compute-1 nova_compute[183083]: 2026-01-26 09:00:52.518 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:00:53 compute-1 nova_compute[183083]: 2026-01-26 09:00:53.242 183087 DEBUG nova.compute.manager [req-17c62a50-d2c5-4a63-a20a-b32ee9d9aaf5 req-7974333e-a68b-49db-bf8f-08e4246965a5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Received event network-changed-e235b615-3ab0-49d4-9c0d-a4d905192bd6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:00:53 compute-1 nova_compute[183083]: 2026-01-26 09:00:53.242 183087 DEBUG nova.compute.manager [req-17c62a50-d2c5-4a63-a20a-b32ee9d9aaf5 req-7974333e-a68b-49db-bf8f-08e4246965a5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Refreshing instance network info cache due to event network-changed-e235b615-3ab0-49d4-9c0d-a4d905192bd6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 09:00:53 compute-1 nova_compute[183083]: 2026-01-26 09:00:53.242 183087 DEBUG oslo_concurrency.lockutils [req-17c62a50-d2c5-4a63-a20a-b32ee9d9aaf5 req-7974333e-a68b-49db-bf8f-08e4246965a5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-a89a5221-3253-49f6-b902-67f973b0690e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:00:53 compute-1 nova_compute[183083]: 2026-01-26 09:00:53.243 183087 DEBUG oslo_concurrency.lockutils [req-17c62a50-d2c5-4a63-a20a-b32ee9d9aaf5 req-7974333e-a68b-49db-bf8f-08e4246965a5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-a89a5221-3253-49f6-b902-67f973b0690e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:00:53 compute-1 nova_compute[183083]: 2026-01-26 09:00:53.243 183087 DEBUG nova.network.neutron [req-17c62a50-d2c5-4a63-a20a-b32ee9d9aaf5 req-7974333e-a68b-49db-bf8f-08e4246965a5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Refreshing network info cache for port e235b615-3ab0-49d4-9c0d-a4d905192bd6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 09:00:53 compute-1 nova_compute[183083]: 2026-01-26 09:00:53.297 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:00:54 compute-1 nova_compute[183083]: 2026-01-26 09:00:54.760 183087 DEBUG nova.network.neutron [req-17c62a50-d2c5-4a63-a20a-b32ee9d9aaf5 req-7974333e-a68b-49db-bf8f-08e4246965a5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Updated VIF entry in instance network info cache for port e235b615-3ab0-49d4-9c0d-a4d905192bd6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 09:00:54 compute-1 nova_compute[183083]: 2026-01-26 09:00:54.761 183087 DEBUG nova.network.neutron [req-17c62a50-d2c5-4a63-a20a-b32ee9d9aaf5 req-7974333e-a68b-49db-bf8f-08e4246965a5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Updating instance_info_cache with network_info: [{"id": "e235b615-3ab0-49d4-9c0d-a4d905192bd6", "address": "fa:16:3e:f0:4e:56", "network": {"id": "902c250a-6b5f-40de-85f8-6172556f9918", "bridge": "br-int", "label": "tempest-test-network--2054667150", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape235b615-3a", "ovs_interfaceid": "e235b615-3ab0-49d4-9c0d-a4d905192bd6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:00:54 compute-1 nova_compute[183083]: 2026-01-26 09:00:54.788 183087 DEBUG oslo_concurrency.lockutils [req-17c62a50-d2c5-4a63-a20a-b32ee9d9aaf5 req-7974333e-a68b-49db-bf8f-08e4246965a5 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-a89a5221-3253-49f6-b902-67f973b0690e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:00:57 compute-1 nova_compute[183083]: 2026-01-26 09:00:57.522 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:00:58 compute-1 nova_compute[183083]: 2026-01-26 09:00:58.300 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:00:59 compute-1 sshd-session[220236]: Accepted publickey for zuul from 38.102.83.66 port 47044 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:00:59 compute-1 systemd-logind[788]: New session 51 of user zuul.
Jan 26 09:00:59 compute-1 systemd[1]: Started Session 51 of User zuul.
Jan 26 09:00:59 compute-1 sshd-session[220236]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:00:59 compute-1 sshd-session[220240]: Accepted publickey for zuul from 38.102.83.66 port 47048 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:00:59 compute-1 systemd-logind[788]: New session 52 of user zuul.
Jan 26 09:00:59 compute-1 systemd[1]: Started Session 52 of User zuul.
Jan 26 09:00:59 compute-1 sshd-session[220240]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:00:59 compute-1 sudo[220244]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/mktemp
Jan 26 09:00:59 compute-1 sudo[220244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:00:59 compute-1 sudo[220244]: pam_unix(sudo:session): session closed for user root
Jan 26 09:00:59 compute-1 sudo[220269]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 tcpdump -s0 -Uni eth1 icmp and ether host fa:16:3e:a8:07:55 -w /tmp/tmp.JD3Z5xdJwe
Jan 26 09:00:59 compute-1 sudo[220269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:00:59 compute-1 sshd-session[220243]: Connection closed by 38.102.83.66 port 47048
Jan 26 09:00:59 compute-1 sshd-session[220240]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:00:59 compute-1 systemd[1]: session-52.scope: Deactivated successfully.
Jan 26 09:00:59 compute-1 systemd-logind[788]: Session 52 logged out. Waiting for processes to exit.
Jan 26 09:00:59 compute-1 systemd-logind[788]: Removed session 52.
Jan 26 09:01:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:00.698 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:01:00 compute-1 podman[220295]: 2026-01-26 09:01:00.835776568 +0000 UTC m=+0.091299591 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 09:01:01 compute-1 CROND[220320]: (root) CMD (run-parts /etc/cron.hourly)
Jan 26 09:01:01 compute-1 run-parts[220323]: (/etc/cron.hourly) starting 0anacron
Jan 26 09:01:01 compute-1 run-parts[220329]: (/etc/cron.hourly) finished 0anacron
Jan 26 09:01:01 compute-1 CROND[220319]: (root) CMDEND (run-parts /etc/cron.hourly)
Jan 26 09:01:02 compute-1 nova_compute[183083]: 2026-01-26 09:01:02.523 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:01:03 compute-1 nova_compute[183083]: 2026-01-26 09:01:03.303 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:01:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:05.315 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:01:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:05.316 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:01:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:05.316 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:01:07 compute-1 nova_compute[183083]: 2026-01-26 09:01:07.526 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:01:08 compute-1 nova_compute[183083]: 2026-01-26 09:01:08.306 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:01:08 compute-1 sshd-session[220330]: Accepted publickey for zuul from 38.102.83.66 port 57988 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:01:08 compute-1 systemd-logind[788]: New session 53 of user zuul.
Jan 26 09:01:08 compute-1 systemd[1]: Started Session 53 of User zuul.
Jan 26 09:01:08 compute-1 sshd-session[220330]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:01:08 compute-1 sudo[220334]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 cat /tmp/tmp.JD3Z5xdJwe
Jan 26 09:01:08 compute-1 sudo[220334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:01:08 compute-1 sudo[220334]: pam_unix(sudo:session): session closed for user root
Jan 26 09:01:10 compute-1 nova_compute[183083]: 2026-01-26 09:01:10.054 183087 DEBUG oslo_concurrency.lockutils [None req-db76064d-3c47-4b09-b528-975b4b54cdb0 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "a89a5221-3253-49f6-b902-67f973b0690e" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:01:10 compute-1 nova_compute[183083]: 2026-01-26 09:01:10.057 183087 DEBUG oslo_concurrency.lockutils [None req-db76064d-3c47-4b09-b528-975b4b54cdb0 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "a89a5221-3253-49f6-b902-67f973b0690e" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:01:10 compute-1 nova_compute[183083]: 2026-01-26 09:01:10.058 183087 INFO nova.compute.manager [None req-db76064d-3c47-4b09-b528-975b4b54cdb0 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Rebooting instance
Jan 26 09:01:10 compute-1 nova_compute[183083]: 2026-01-26 09:01:10.071 183087 DEBUG oslo_concurrency.lockutils [None req-db76064d-3c47-4b09-b528-975b4b54cdb0 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "refresh_cache-a89a5221-3253-49f6-b902-67f973b0690e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:01:10 compute-1 nova_compute[183083]: 2026-01-26 09:01:10.071 183087 DEBUG oslo_concurrency.lockutils [None req-db76064d-3c47-4b09-b528-975b4b54cdb0 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquired lock "refresh_cache-a89a5221-3253-49f6-b902-67f973b0690e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:01:10 compute-1 nova_compute[183083]: 2026-01-26 09:01:10.072 183087 DEBUG nova.network.neutron [None req-db76064d-3c47-4b09-b528-975b4b54cdb0 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 09:01:10 compute-1 nova_compute[183083]: 2026-01-26 09:01:10.625 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:01:10 compute-1 nova_compute[183083]: 2026-01-26 09:01:10.646 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:01:10 compute-1 nova_compute[183083]: 2026-01-26 09:01:10.647 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:01:10 compute-1 nova_compute[183083]: 2026-01-26 09:01:10.647 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:01:10 compute-1 nova_compute[183083]: 2026-01-26 09:01:10.665 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "refresh_cache-a89a5221-3253-49f6-b902-67f973b0690e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:01:12 compute-1 nova_compute[183083]: 2026-01-26 09:01:12.528 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:01:12 compute-1 nova_compute[183083]: 2026-01-26 09:01:12.667 183087 DEBUG nova.network.neutron [None req-db76064d-3c47-4b09-b528-975b4b54cdb0 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Updating instance_info_cache with network_info: [{"id": "e235b615-3ab0-49d4-9c0d-a4d905192bd6", "address": "fa:16:3e:f0:4e:56", "network": {"id": "902c250a-6b5f-40de-85f8-6172556f9918", "bridge": "br-int", "label": "tempest-test-network--2054667150", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape235b615-3a", "ovs_interfaceid": "e235b615-3ab0-49d4-9c0d-a4d905192bd6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:01:12 compute-1 nova_compute[183083]: 2026-01-26 09:01:12.707 183087 DEBUG oslo_concurrency.lockutils [None req-db76064d-3c47-4b09-b528-975b4b54cdb0 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Releasing lock "refresh_cache-a89a5221-3253-49f6-b902-67f973b0690e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:01:12 compute-1 nova_compute[183083]: 2026-01-26 09:01:12.709 183087 DEBUG nova.compute.manager [None req-db76064d-3c47-4b09-b528-975b4b54cdb0 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:01:12 compute-1 nova_compute[183083]: 2026-01-26 09:01:12.710 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquired lock "refresh_cache-a89a5221-3253-49f6-b902-67f973b0690e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:01:12 compute-1 nova_compute[183083]: 2026-01-26 09:01:12.710 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 26 09:01:12 compute-1 nova_compute[183083]: 2026-01-26 09:01:12.710 183087 DEBUG nova.objects.instance [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lazy-loading 'info_cache' on Instance uuid a89a5221-3253-49f6-b902-67f973b0690e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 09:01:13 compute-1 nova_compute[183083]: 2026-01-26 09:01:13.381 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:01:14 compute-1 nova_compute[183083]: 2026-01-26 09:01:14.660 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Updating instance_info_cache with network_info: [{"id": "e235b615-3ab0-49d4-9c0d-a4d905192bd6", "address": "fa:16:3e:f0:4e:56", "network": {"id": "902c250a-6b5f-40de-85f8-6172556f9918", "bridge": "br-int", "label": "tempest-test-network--2054667150", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape235b615-3a", "ovs_interfaceid": "e235b615-3ab0-49d4-9c0d-a4d905192bd6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:01:14 compute-1 nova_compute[183083]: 2026-01-26 09:01:14.724 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Releasing lock "refresh_cache-a89a5221-3253-49f6-b902-67f973b0690e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:01:14 compute-1 nova_compute[183083]: 2026-01-26 09:01:14.725 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 26 09:01:14 compute-1 nova_compute[183083]: 2026-01-26 09:01:14.725 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:01:14 compute-1 nova_compute[183083]: 2026-01-26 09:01:14.725 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:01:14 compute-1 podman[220364]: 2026-01-26 09:01:14.779090852 +0000 UTC m=+0.082789661 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, build-date=2025-08-20T13:12:41, version=9.6, release=1755695350, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-type=git)
Jan 26 09:01:14 compute-1 podman[220363]: 2026-01-26 09:01:14.796364411 +0000 UTC m=+0.103983731 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 26 09:01:14 compute-1 nova_compute[183083]: 2026-01-26 09:01:14.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:01:14 compute-1 nova_compute[183083]: 2026-01-26 09:01:14.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:01:15 compute-1 kernel: tape235b615-3a (unregistering): left promiscuous mode
Jan 26 09:01:15 compute-1 NetworkManager[55451]: <info>  [1769418075.1095] device (tape235b615-3a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 09:01:15 compute-1 ovn_controller[95352]: 2026-01-26T09:01:15Z|00247|binding|INFO|Releasing lport e235b615-3ab0-49d4-9c0d-a4d905192bd6 from this chassis (sb_readonly=0)
Jan 26 09:01:15 compute-1 ovn_controller[95352]: 2026-01-26T09:01:15Z|00248|binding|INFO|Setting lport e235b615-3ab0-49d4-9c0d-a4d905192bd6 down in Southbound
Jan 26 09:01:15 compute-1 ovn_controller[95352]: 2026-01-26T09:01:15Z|00249|pinctrl|WARN|Dropped 173 log messages in last 70 seconds (most recently, 14 seconds ago) due to excessive rate
Jan 26 09:01:15 compute-1 nova_compute[183083]: 2026-01-26 09:01:15.119 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:01:15 compute-1 ovn_controller[95352]: 2026-01-26T09:01:15Z|00250|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:01:15 compute-1 ovn_controller[95352]: 2026-01-26T09:01:15Z|00251|binding|INFO|Removing iface tape235b615-3a ovn-installed in OVS
Jan 26 09:01:15 compute-1 nova_compute[183083]: 2026-01-26 09:01:15.121 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:01:15 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:15.126 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:4e:56 10.100.0.5'], port_security=['fa:16:3e:f0:4e:56 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-902c250a-6b5f-40de-85f8-6172556f9918', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2580bb16c90849c4b5919eb271774a06', 'neutron:revision_number': '4', 'neutron:security_group_ids': '00734c5e-2a15-43b1-a106-6b4708879098', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.184'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9161e928-a360-45a0-86d0-eb6f299d1fc7, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=e235b615-3ab0-49d4-9c0d-a4d905192bd6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:01:15 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:15.128 104632 INFO neutron.agent.ovn.metadata.agent [-] Port e235b615-3ab0-49d4-9c0d-a4d905192bd6 in datapath 902c250a-6b5f-40de-85f8-6172556f9918 unbound from our chassis
Jan 26 09:01:15 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:15.130 104632 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 902c250a-6b5f-40de-85f8-6172556f9918, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 26 09:01:15 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:15.131 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[93bb830f-63db-4ddd-9763-2c9fd688ca3e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:01:15 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:15.132 104632 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918 namespace which is not needed anymore
Jan 26 09:01:15 compute-1 nova_compute[183083]: 2026-01-26 09:01:15.142 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:01:15 compute-1 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d0000002e.scope: Deactivated successfully.
Jan 26 09:01:15 compute-1 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d0000002e.scope: Consumed 15.756s CPU time.
Jan 26 09:01:15 compute-1 systemd-machined[154360]: Machine qemu-14-instance-0000002e terminated.
Jan 26 09:01:15 compute-1 nova_compute[183083]: 2026-01-26 09:01:15.334 183087 DEBUG nova.compute.manager [req-c089de14-b104-48fd-b799-9546a0989958 req-b461a57c-e39b-4ee9-a588-bb03723e4510 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Received event network-vif-unplugged-e235b615-3ab0-49d4-9c0d-a4d905192bd6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:01:15 compute-1 nova_compute[183083]: 2026-01-26 09:01:15.334 183087 DEBUG oslo_concurrency.lockutils [req-c089de14-b104-48fd-b799-9546a0989958 req-b461a57c-e39b-4ee9-a588-bb03723e4510 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "a89a5221-3253-49f6-b902-67f973b0690e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:01:15 compute-1 nova_compute[183083]: 2026-01-26 09:01:15.335 183087 DEBUG oslo_concurrency.lockutils [req-c089de14-b104-48fd-b799-9546a0989958 req-b461a57c-e39b-4ee9-a588-bb03723e4510 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "a89a5221-3253-49f6-b902-67f973b0690e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:01:15 compute-1 nova_compute[183083]: 2026-01-26 09:01:15.335 183087 DEBUG oslo_concurrency.lockutils [req-c089de14-b104-48fd-b799-9546a0989958 req-b461a57c-e39b-4ee9-a588-bb03723e4510 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "a89a5221-3253-49f6-b902-67f973b0690e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:01:15 compute-1 nova_compute[183083]: 2026-01-26 09:01:15.336 183087 DEBUG nova.compute.manager [req-c089de14-b104-48fd-b799-9546a0989958 req-b461a57c-e39b-4ee9-a588-bb03723e4510 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] No waiting events found dispatching network-vif-unplugged-e235b615-3ab0-49d4-9c0d-a4d905192bd6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 09:01:15 compute-1 nova_compute[183083]: 2026-01-26 09:01:15.336 183087 WARNING nova.compute.manager [req-c089de14-b104-48fd-b799-9546a0989958 req-b461a57c-e39b-4ee9-a588-bb03723e4510 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Received unexpected event network-vif-unplugged-e235b615-3ab0-49d4-9c0d-a4d905192bd6 for instance with vm_state active and task_state reboot_started.
Jan 26 09:01:15 compute-1 neutron-haproxy-ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918[219600]: [NOTICE]   (219604) : haproxy version is 2.8.14-c23fe91
Jan 26 09:01:15 compute-1 neutron-haproxy-ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918[219600]: [NOTICE]   (219604) : path to executable is /usr/sbin/haproxy
Jan 26 09:01:15 compute-1 neutron-haproxy-ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918[219600]: [WARNING]  (219604) : Exiting Master process...
Jan 26 09:01:15 compute-1 neutron-haproxy-ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918[219600]: [WARNING]  (219604) : Exiting Master process...
Jan 26 09:01:15 compute-1 neutron-haproxy-ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918[219600]: [ALERT]    (219604) : Current worker (219606) exited with code 143 (Terminated)
Jan 26 09:01:15 compute-1 neutron-haproxy-ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918[219600]: [WARNING]  (219604) : All workers exited. Exiting... (0)
Jan 26 09:01:15 compute-1 systemd[1]: libpod-8c852493079d11537f357c41059924a79a196dc85708484d10d03ca04a7e3dc2.scope: Deactivated successfully.
Jan 26 09:01:15 compute-1 podman[220428]: 2026-01-26 09:01:15.430444723 +0000 UTC m=+0.179098254 container died 8c852493079d11537f357c41059924a79a196dc85708484d10d03ca04a7e3dc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 09:01:15 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8c852493079d11537f357c41059924a79a196dc85708484d10d03ca04a7e3dc2-userdata-shm.mount: Deactivated successfully.
Jan 26 09:01:15 compute-1 systemd[1]: var-lib-containers-storage-overlay-537d7e84c802a95c5dcb83dc10b46eb6604f5a4a5d2cbd29571c00a498d7a7fc-merged.mount: Deactivated successfully.
Jan 26 09:01:15 compute-1 podman[220428]: 2026-01-26 09:01:15.740563968 +0000 UTC m=+0.489217499 container cleanup 8c852493079d11537f357c41059924a79a196dc85708484d10d03ca04a7e3dc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 26 09:01:15 compute-1 systemd[1]: libpod-conmon-8c852493079d11537f357c41059924a79a196dc85708484d10d03ca04a7e3dc2.scope: Deactivated successfully.
Jan 26 09:01:15 compute-1 podman[220477]: 2026-01-26 09:01:15.837477447 +0000 UTC m=+0.063706661 container remove 8c852493079d11537f357c41059924a79a196dc85708484d10d03ca04a7e3dc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 26 09:01:15 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:15.843 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[89acbf01-def8-4e51-9a9d-150e009e0d0b]: (4, ('Mon Jan 26 09:01:15 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918 (8c852493079d11537f357c41059924a79a196dc85708484d10d03ca04a7e3dc2)\n8c852493079d11537f357c41059924a79a196dc85708484d10d03ca04a7e3dc2\nMon Jan 26 09:01:15 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918 (8c852493079d11537f357c41059924a79a196dc85708484d10d03ca04a7e3dc2)\n8c852493079d11537f357c41059924a79a196dc85708484d10d03ca04a7e3dc2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:01:15 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:15.846 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[a18189e1-2f1b-4652-b5e5-31396d453edf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:01:15 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:15.847 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap902c250a-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:01:15 compute-1 nova_compute[183083]: 2026-01-26 09:01:15.886 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:01:15 compute-1 kernel: tap902c250a-60: left promiscuous mode
Jan 26 09:01:15 compute-1 nova_compute[183083]: 2026-01-26 09:01:15.898 183087 INFO nova.virt.libvirt.driver [None req-db76064d-3c47-4b09-b528-975b4b54cdb0 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Instance shutdown successfully.
Jan 26 09:01:15 compute-1 nova_compute[183083]: 2026-01-26 09:01:15.916 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:01:15 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:15.922 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[f9fb96ef-d5bd-4797-b4cb-e0e11fa1528f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:01:15 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:15.936 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[62f97e62-85b7-433e-854c-c296dca7c674]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:01:15 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:15.939 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[6e8f23f2-8922-4a77-b1dd-8ba356cbeb98]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:01:15 compute-1 nova_compute[183083]: 2026-01-26 09:01:15.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:01:15 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:15.960 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[df9808d9-73cb-44d5-8860-7222253ad249]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 432467, 'reachable_time': 29698, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220501, 'error': None, 'target': 'ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:01:15 compute-1 systemd[1]: run-netns-ovnmeta\x2d902c250a\x2d6b5f\x2d40de\x2d85f8\x2d6172556f9918.mount: Deactivated successfully.
Jan 26 09:01:15 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:15.964 105024 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 26 09:01:15 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:15.964 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[71722bb9-c139-4904-9381-39fa9357a071]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:01:15 compute-1 kernel: tape235b615-3a: entered promiscuous mode
Jan 26 09:01:15 compute-1 systemd-udevd[220408]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 09:01:15 compute-1 ovn_controller[95352]: 2026-01-26T09:01:15Z|00252|binding|INFO|Claiming lport e235b615-3ab0-49d4-9c0d-a4d905192bd6 for this chassis.
Jan 26 09:01:15 compute-1 ovn_controller[95352]: 2026-01-26T09:01:15Z|00253|binding|INFO|e235b615-3ab0-49d4-9c0d-a4d905192bd6: Claiming fa:16:3e:f0:4e:56 10.100.0.5
Jan 26 09:01:15 compute-1 nova_compute[183083]: 2026-01-26 09:01:15.989 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:01:15 compute-1 NetworkManager[55451]: <info>  [1769418075.9905] manager: (tape235b615-3a): new Tun device (/org/freedesktop/NetworkManager/Devices/90)
Jan 26 09:01:16 compute-1 NetworkManager[55451]: <info>  [1769418076.0095] device (tape235b615-3a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 09:01:16 compute-1 NetworkManager[55451]: <info>  [1769418076.0110] device (tape235b615-3a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 09:01:16 compute-1 ovn_controller[95352]: 2026-01-26T09:01:16Z|00254|binding|INFO|Setting lport e235b615-3ab0-49d4-9c0d-a4d905192bd6 ovn-installed in OVS
Jan 26 09:01:16 compute-1 nova_compute[183083]: 2026-01-26 09:01:16.014 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:01:16 compute-1 nova_compute[183083]: 2026-01-26 09:01:16.017 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:01:16 compute-1 ovn_controller[95352]: 2026-01-26T09:01:16Z|00255|binding|INFO|Setting lport e235b615-3ab0-49d4-9c0d-a4d905192bd6 up in Southbound
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:16.025 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:4e:56 10.100.0.5'], port_security=['fa:16:3e:f0:4e:56 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-902c250a-6b5f-40de-85f8-6172556f9918', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2580bb16c90849c4b5919eb271774a06', 'neutron:revision_number': '5', 'neutron:security_group_ids': '00734c5e-2a15-43b1-a106-6b4708879098', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.184'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9161e928-a360-45a0-86d0-eb6f299d1fc7, chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=e235b615-3ab0-49d4-9c0d-a4d905192bd6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:16.026 104632 INFO neutron.agent.ovn.metadata.agent [-] Port e235b615-3ab0-49d4-9c0d-a4d905192bd6 in datapath 902c250a-6b5f-40de-85f8-6172556f9918 bound to our chassis
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:16.027 104632 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 902c250a-6b5f-40de-85f8-6172556f9918
Jan 26 09:01:16 compute-1 systemd-machined[154360]: New machine qemu-15-instance-0000002e.
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:16.049 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[343f5349-e718-4352-bdc9-1d6a9810a377]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:16.050 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap902c250a-61 in ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:16.053 212483 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap902c250a-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:16.053 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[2793bb7c-a656-4118-9e5e-d3575f4dd54f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:16.054 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[cda4c3dc-93cc-457b-917c-de4968a054f4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:01:16 compute-1 systemd[1]: Started Virtual Machine qemu-15-instance-0000002e.
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:16.069 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[32b2ba97-6fe1-49cc-89e8-04e1afb1b74e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:16.098 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[50232209-79da-4175-bac3-51f3c93e8fbe]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:16.136 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[28632090-c7f7-4544-9c22-7ce474cfaa2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:16.145 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[1e7c123c-5162-40f5-bae6-16a0fb56d16f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:01:16 compute-1 NetworkManager[55451]: <info>  [1769418076.1480] manager: (tap902c250a-60): new Veth device (/org/freedesktop/NetworkManager/Devices/91)
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:16.182 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[1b82f55c-290f-49c9-8901-9fc3511f8b3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:16.187 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[a0ef6528-a61c-4585-803b-1ada9b966baf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:01:16 compute-1 NetworkManager[55451]: <info>  [1769418076.2106] device (tap902c250a-60): carrier: link connected
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:16.214 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[209d0f73-0a8c-4755-b22a-2f6971d2ca2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:16.230 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[e2582074-dcc6-4d67-9f4e-ba516f381bae]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap902c250a-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:31:c2:fc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441681, 'reachable_time': 23761, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220544, 'error': None, 'target': 'ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:16.251 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[e549e031-f1d5-468b-8901-84424f8c81a6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe31:c2fc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 441681, 'tstamp': 441681}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220545, 'error': None, 'target': 'ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:16.273 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[a39e5b62-18d1-442b-b753-b7fb41befac5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap902c250a-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:31:c2:fc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441681, 'reachable_time': 23761, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220546, 'error': None, 'target': 'ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:16.308 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[41fc33d5-d42a-4199-9a59-9246a15b60ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:16.384 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[0fd5c300-7bcd-4841-ab1f-827f31340f55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:16.385 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap902c250a-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:16.386 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:16.386 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap902c250a-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:01:16 compute-1 nova_compute[183083]: 2026-01-26 09:01:16.389 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:01:16 compute-1 NetworkManager[55451]: <info>  [1769418076.3907] manager: (tap902c250a-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Jan 26 09:01:16 compute-1 kernel: tap902c250a-60: entered promiscuous mode
Jan 26 09:01:16 compute-1 nova_compute[183083]: 2026-01-26 09:01:16.396 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:16.397 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap902c250a-60, col_values=(('external_ids', {'iface-id': 'c9c76eb7-dd25-4862-b654-cdfd8369f343'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:01:16 compute-1 nova_compute[183083]: 2026-01-26 09:01:16.398 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:01:16 compute-1 ovn_controller[95352]: 2026-01-26T09:01:16Z|00256|binding|INFO|Releasing lport c9c76eb7-dd25-4862-b654-cdfd8369f343 from this chassis (sb_readonly=0)
Jan 26 09:01:16 compute-1 nova_compute[183083]: 2026-01-26 09:01:16.409 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:16.410 104632 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/902c250a-6b5f-40de-85f8-6172556f9918.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/902c250a-6b5f-40de-85f8-6172556f9918.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:16.411 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[285e2769-799d-4be3-a1b4-68cc372fa1da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:16.412 104632 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]: global
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]:     log         /dev/log local0 debug
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]:     log-tag     haproxy-metadata-proxy-902c250a-6b5f-40de-85f8-6172556f9918
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]:     user        root
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]:     group       root
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]:     maxconn     1024
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]:     pidfile     /var/lib/neutron/external/pids/902c250a-6b5f-40de-85f8-6172556f9918.pid.haproxy
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]:     daemon
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]: 
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]: defaults
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]:     log global
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]:     mode http
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]:     option httplog
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]:     option dontlognull
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]:     option http-server-close
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]:     option forwardfor
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]:     retries                 3
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]:     timeout http-request    30s
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]:     timeout connect         30s
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]:     timeout client          32s
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]:     timeout server          32s
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]:     timeout http-keep-alive 30s
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]: 
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]: 
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]: listen listener
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]:     bind 169.254.169.254:80
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]:     http-request add-header X-OVN-Network-ID 902c250a-6b5f-40de-85f8-6172556f9918
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 26 09:01:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:01:16.412 104632 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918', 'env', 'PROCESS_TAG=haproxy-902c250a-6b5f-40de-85f8-6172556f9918', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/902c250a-6b5f-40de-85f8-6172556f9918.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 26 09:01:16 compute-1 nova_compute[183083]: 2026-01-26 09:01:16.645 183087 DEBUG nova.virt.libvirt.host [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Removed pending event for a89a5221-3253-49f6-b902-67f973b0690e due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 26 09:01:16 compute-1 nova_compute[183083]: 2026-01-26 09:01:16.646 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769418076.645313, a89a5221-3253-49f6-b902-67f973b0690e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 09:01:16 compute-1 nova_compute[183083]: 2026-01-26 09:01:16.646 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: a89a5221-3253-49f6-b902-67f973b0690e] VM Resumed (Lifecycle Event)
Jan 26 09:01:16 compute-1 nova_compute[183083]: 2026-01-26 09:01:16.652 183087 INFO nova.virt.libvirt.driver [-] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Instance running successfully.
Jan 26 09:01:16 compute-1 nova_compute[183083]: 2026-01-26 09:01:16.653 183087 INFO nova.virt.libvirt.driver [None req-db76064d-3c47-4b09-b528-975b4b54cdb0 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Instance soft rebooted successfully.
Jan 26 09:01:16 compute-1 nova_compute[183083]: 2026-01-26 09:01:16.653 183087 DEBUG nova.compute.manager [None req-db76064d-3c47-4b09-b528-975b4b54cdb0 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:01:16 compute-1 nova_compute[183083]: 2026-01-26 09:01:16.667 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:01:16 compute-1 nova_compute[183083]: 2026-01-26 09:01:16.671 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 09:01:16 compute-1 nova_compute[183083]: 2026-01-26 09:01:16.731 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: a89a5221-3253-49f6-b902-67f973b0690e] During sync_power_state the instance has a pending task (reboot_started). Skip.
Jan 26 09:01:16 compute-1 nova_compute[183083]: 2026-01-26 09:01:16.732 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769418076.648323, a89a5221-3253-49f6-b902-67f973b0690e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 09:01:16 compute-1 nova_compute[183083]: 2026-01-26 09:01:16.732 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: a89a5221-3253-49f6-b902-67f973b0690e] VM Started (Lifecycle Event)
Jan 26 09:01:16 compute-1 nova_compute[183083]: 2026-01-26 09:01:16.769 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:01:16 compute-1 nova_compute[183083]: 2026-01-26 09:01:16.773 183087 DEBUG oslo_concurrency.lockutils [None req-db76064d-3c47-4b09-b528-975b4b54cdb0 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "a89a5221-3253-49f6-b902-67f973b0690e" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 6.716s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:01:16 compute-1 nova_compute[183083]: 2026-01-26 09:01:16.775 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: reboot_started, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 09:01:16 compute-1 podman[220585]: 2026-01-26 09:01:16.849217084 +0000 UTC m=+0.062958010 container create d56c30bb4fc67cd62a02730b60510b43e5aa6b9fe5143da594e0c245818736df (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Jan 26 09:01:16 compute-1 systemd[1]: Started libpod-conmon-d56c30bb4fc67cd62a02730b60510b43e5aa6b9fe5143da594e0c245818736df.scope.
Jan 26 09:01:16 compute-1 podman[220585]: 2026-01-26 09:01:16.809175882 +0000 UTC m=+0.022916828 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 09:01:16 compute-1 systemd[1]: Started libcrun container.
Jan 26 09:01:16 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ee6cca06c9bf9711ab914ab0a71d4124978afcdc358341a3a61f7ea1e2d475e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 09:01:16 compute-1 podman[220585]: 2026-01-26 09:01:16.93898247 +0000 UTC m=+0.152723416 container init d56c30bb4fc67cd62a02730b60510b43e5aa6b9fe5143da594e0c245818736df (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 26 09:01:16 compute-1 podman[220585]: 2026-01-26 09:01:16.946159243 +0000 UTC m=+0.159900169 container start d56c30bb4fc67cd62a02730b60510b43e5aa6b9fe5143da594e0c245818736df (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Jan 26 09:01:16 compute-1 nova_compute[183083]: 2026-01-26 09:01:16.946 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:01:16 compute-1 nova_compute[183083]: 2026-01-26 09:01:16.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:01:16 compute-1 nova_compute[183083]: 2026-01-26 09:01:16.951 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:01:16 compute-1 neutron-haproxy-ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918[220600]: [NOTICE]   (220604) : New worker (220606) forked
Jan 26 09:01:16 compute-1 neutron-haproxy-ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918[220600]: [NOTICE]   (220604) : Loading success.
Jan 26 09:01:17 compute-1 nova_compute[183083]: 2026-01-26 09:01:17.438 183087 DEBUG nova.compute.manager [req-b75545ea-08a6-481f-8d3d-26ec4ca6f808 req-ec261540-d1a0-4e3e-98b1-1eef0b57b3e6 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Received event network-vif-plugged-e235b615-3ab0-49d4-9c0d-a4d905192bd6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:01:17 compute-1 nova_compute[183083]: 2026-01-26 09:01:17.438 183087 DEBUG oslo_concurrency.lockutils [req-b75545ea-08a6-481f-8d3d-26ec4ca6f808 req-ec261540-d1a0-4e3e-98b1-1eef0b57b3e6 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "a89a5221-3253-49f6-b902-67f973b0690e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:01:17 compute-1 nova_compute[183083]: 2026-01-26 09:01:17.439 183087 DEBUG oslo_concurrency.lockutils [req-b75545ea-08a6-481f-8d3d-26ec4ca6f808 req-ec261540-d1a0-4e3e-98b1-1eef0b57b3e6 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "a89a5221-3253-49f6-b902-67f973b0690e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:01:17 compute-1 nova_compute[183083]: 2026-01-26 09:01:17.439 183087 DEBUG oslo_concurrency.lockutils [req-b75545ea-08a6-481f-8d3d-26ec4ca6f808 req-ec261540-d1a0-4e3e-98b1-1eef0b57b3e6 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "a89a5221-3253-49f6-b902-67f973b0690e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:01:17 compute-1 nova_compute[183083]: 2026-01-26 09:01:17.439 183087 DEBUG nova.compute.manager [req-b75545ea-08a6-481f-8d3d-26ec4ca6f808 req-ec261540-d1a0-4e3e-98b1-1eef0b57b3e6 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] No waiting events found dispatching network-vif-plugged-e235b615-3ab0-49d4-9c0d-a4d905192bd6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 09:01:17 compute-1 nova_compute[183083]: 2026-01-26 09:01:17.440 183087 WARNING nova.compute.manager [req-b75545ea-08a6-481f-8d3d-26ec4ca6f808 req-ec261540-d1a0-4e3e-98b1-1eef0b57b3e6 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Received unexpected event network-vif-plugged-e235b615-3ab0-49d4-9c0d-a4d905192bd6 for instance with vm_state active and task_state None.
Jan 26 09:01:17 compute-1 nova_compute[183083]: 2026-01-26 09:01:17.440 183087 DEBUG nova.compute.manager [req-b75545ea-08a6-481f-8d3d-26ec4ca6f808 req-ec261540-d1a0-4e3e-98b1-1eef0b57b3e6 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Received event network-vif-plugged-e235b615-3ab0-49d4-9c0d-a4d905192bd6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:01:17 compute-1 nova_compute[183083]: 2026-01-26 09:01:17.440 183087 DEBUG oslo_concurrency.lockutils [req-b75545ea-08a6-481f-8d3d-26ec4ca6f808 req-ec261540-d1a0-4e3e-98b1-1eef0b57b3e6 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "a89a5221-3253-49f6-b902-67f973b0690e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:01:17 compute-1 nova_compute[183083]: 2026-01-26 09:01:17.440 183087 DEBUG oslo_concurrency.lockutils [req-b75545ea-08a6-481f-8d3d-26ec4ca6f808 req-ec261540-d1a0-4e3e-98b1-1eef0b57b3e6 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "a89a5221-3253-49f6-b902-67f973b0690e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:01:17 compute-1 nova_compute[183083]: 2026-01-26 09:01:17.441 183087 DEBUG oslo_concurrency.lockutils [req-b75545ea-08a6-481f-8d3d-26ec4ca6f808 req-ec261540-d1a0-4e3e-98b1-1eef0b57b3e6 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "a89a5221-3253-49f6-b902-67f973b0690e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:01:17 compute-1 nova_compute[183083]: 2026-01-26 09:01:17.441 183087 DEBUG nova.compute.manager [req-b75545ea-08a6-481f-8d3d-26ec4ca6f808 req-ec261540-d1a0-4e3e-98b1-1eef0b57b3e6 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] No waiting events found dispatching network-vif-plugged-e235b615-3ab0-49d4-9c0d-a4d905192bd6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 09:01:17 compute-1 nova_compute[183083]: 2026-01-26 09:01:17.441 183087 WARNING nova.compute.manager [req-b75545ea-08a6-481f-8d3d-26ec4ca6f808 req-ec261540-d1a0-4e3e-98b1-1eef0b57b3e6 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Received unexpected event network-vif-plugged-e235b615-3ab0-49d4-9c0d-a4d905192bd6 for instance with vm_state active and task_state None.
Jan 26 09:01:17 compute-1 nova_compute[183083]: 2026-01-26 09:01:17.441 183087 DEBUG nova.compute.manager [req-b75545ea-08a6-481f-8d3d-26ec4ca6f808 req-ec261540-d1a0-4e3e-98b1-1eef0b57b3e6 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Received event network-vif-plugged-e235b615-3ab0-49d4-9c0d-a4d905192bd6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:01:17 compute-1 nova_compute[183083]: 2026-01-26 09:01:17.442 183087 DEBUG oslo_concurrency.lockutils [req-b75545ea-08a6-481f-8d3d-26ec4ca6f808 req-ec261540-d1a0-4e3e-98b1-1eef0b57b3e6 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "a89a5221-3253-49f6-b902-67f973b0690e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:01:17 compute-1 nova_compute[183083]: 2026-01-26 09:01:17.442 183087 DEBUG oslo_concurrency.lockutils [req-b75545ea-08a6-481f-8d3d-26ec4ca6f808 req-ec261540-d1a0-4e3e-98b1-1eef0b57b3e6 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "a89a5221-3253-49f6-b902-67f973b0690e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:01:17 compute-1 nova_compute[183083]: 2026-01-26 09:01:17.442 183087 DEBUG oslo_concurrency.lockutils [req-b75545ea-08a6-481f-8d3d-26ec4ca6f808 req-ec261540-d1a0-4e3e-98b1-1eef0b57b3e6 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "a89a5221-3253-49f6-b902-67f973b0690e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:01:17 compute-1 nova_compute[183083]: 2026-01-26 09:01:17.442 183087 DEBUG nova.compute.manager [req-b75545ea-08a6-481f-8d3d-26ec4ca6f808 req-ec261540-d1a0-4e3e-98b1-1eef0b57b3e6 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] No waiting events found dispatching network-vif-plugged-e235b615-3ab0-49d4-9c0d-a4d905192bd6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 09:01:17 compute-1 nova_compute[183083]: 2026-01-26 09:01:17.443 183087 WARNING nova.compute.manager [req-b75545ea-08a6-481f-8d3d-26ec4ca6f808 req-ec261540-d1a0-4e3e-98b1-1eef0b57b3e6 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Received unexpected event network-vif-plugged-e235b615-3ab0-49d4-9c0d-a4d905192bd6 for instance with vm_state active and task_state None.
Jan 26 09:01:17 compute-1 nova_compute[183083]: 2026-01-26 09:01:17.529 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:01:18 compute-1 sshd-session[220616]: Accepted publickey for zuul from 38.102.83.66 port 51690 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:01:18 compute-1 systemd-logind[788]: New session 54 of user zuul.
Jan 26 09:01:18 compute-1 systemd[1]: Started Session 54 of User zuul.
Jan 26 09:01:18 compute-1 sshd-session[220616]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:01:18 compute-1 podman[220621]: 2026-01-26 09:01:18.324876262 +0000 UTC m=+0.061943491 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 09:01:18 compute-1 podman[220619]: 2026-01-26 09:01:18.342818249 +0000 UTC m=+0.086424673 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 26 09:01:18 compute-1 podman[220618]: 2026-01-26 09:01:18.359887412 +0000 UTC m=+0.104277608 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, 
config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 26 09:01:18 compute-1 nova_compute[183083]: 2026-01-26 09:01:18.382 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:01:18 compute-1 sshd-session[220682]: Accepted publickey for zuul from 38.102.83.66 port 51700 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:01:18 compute-1 systemd-logind[788]: New session 55 of user zuul.
Jan 26 09:01:18 compute-1 systemd[1]: Started Session 55 of User zuul.
Jan 26 09:01:18 compute-1 sshd-session[220682]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:01:18 compute-1 sudo[220689]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/mktemp
Jan 26 09:01:18 compute-1 sudo[220689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:01:18 compute-1 sudo[220689]: pam_unix(sudo:session): session closed for user root
Jan 26 09:01:18 compute-1 sudo[220714]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 tcpdump -s0 -Uni eth1 icmp and ether host fa:16:3e:a8:07:55 -w /tmp/tmp.WXzXLVKhJr
Jan 26 09:01:18 compute-1 sudo[220714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:01:18 compute-1 sshd-session[220688]: Connection closed by 38.102.83.66 port 51700
Jan 26 09:01:18 compute-1 sshd-session[220682]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:01:18 compute-1 systemd[1]: session-55.scope: Deactivated successfully.
Jan 26 09:01:18 compute-1 systemd-logind[788]: Session 55 logged out. Waiting for processes to exit.
Jan 26 09:01:18 compute-1 systemd-logind[788]: Removed session 55.
Jan 26 09:01:18 compute-1 nova_compute[183083]: 2026-01-26 09:01:18.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:01:18 compute-1 nova_compute[183083]: 2026-01-26 09:01:18.973 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:01:18 compute-1 nova_compute[183083]: 2026-01-26 09:01:18.973 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:01:18 compute-1 nova_compute[183083]: 2026-01-26 09:01:18.973 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:01:18 compute-1 nova_compute[183083]: 2026-01-26 09:01:18.974 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:01:19 compute-1 nova_compute[183083]: 2026-01-26 09:01:19.033 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a89a5221-3253-49f6-b902-67f973b0690e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:01:19 compute-1 nova_compute[183083]: 2026-01-26 09:01:19.108 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a89a5221-3253-49f6-b902-67f973b0690e/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:01:19 compute-1 nova_compute[183083]: 2026-01-26 09:01:19.109 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a89a5221-3253-49f6-b902-67f973b0690e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:01:19 compute-1 nova_compute[183083]: 2026-01-26 09:01:19.164 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a89a5221-3253-49f6-b902-67f973b0690e/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:01:19 compute-1 nova_compute[183083]: 2026-01-26 09:01:19.311 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:01:19 compute-1 nova_compute[183083]: 2026-01-26 09:01:19.312 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13541MB free_disk=113.06402969360352GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:01:19 compute-1 nova_compute[183083]: 2026-01-26 09:01:19.313 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:01:19 compute-1 nova_compute[183083]: 2026-01-26 09:01:19.313 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:01:19 compute-1 nova_compute[183083]: 2026-01-26 09:01:19.374 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Instance a89a5221-3253-49f6-b902-67f973b0690e actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 09:01:19 compute-1 nova_compute[183083]: 2026-01-26 09:01:19.374 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:01:19 compute-1 nova_compute[183083]: 2026-01-26 09:01:19.375 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=640MB phys_disk=119GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:01:19 compute-1 nova_compute[183083]: 2026-01-26 09:01:19.410 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:01:19 compute-1 nova_compute[183083]: 2026-01-26 09:01:19.678 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:01:19 compute-1 nova_compute[183083]: 2026-01-26 09:01:19.681 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:01:19 compute-1 nova_compute[183083]: 2026-01-26 09:01:19.681 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.368s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:01:22 compute-1 nova_compute[183083]: 2026-01-26 09:01:22.533 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:01:23 compute-1 nova_compute[183083]: 2026-01-26 09:01:23.385 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:01:27 compute-1 nova_compute[183083]: 2026-01-26 09:01:27.536 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:01:28 compute-1 nova_compute[183083]: 2026-01-26 09:01:28.442 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:01:29 compute-1 ovn_controller[95352]: 2026-01-26T09:01:29Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f0:4e:56 10.100.0.5
Jan 26 09:01:31 compute-1 podman[220754]: 2026-01-26 09:01:31.781591309 +0000 UTC m=+0.050127298 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 26 09:01:32 compute-1 nova_compute[183083]: 2026-01-26 09:01:32.537 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:01:33 compute-1 nova_compute[183083]: 2026-01-26 09:01:33.446 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:01:37 compute-1 nova_compute[183083]: 2026-01-26 09:01:37.581 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:01:38 compute-1 nova_compute[183083]: 2026-01-26 09:01:38.449 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:01:41 compute-1 sshd-session[220779]: Accepted publickey for zuul from 38.102.83.66 port 40512 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:01:41 compute-1 systemd-logind[788]: New session 56 of user zuul.
Jan 26 09:01:41 compute-1 systemd[1]: Started Session 56 of User zuul.
Jan 26 09:01:41 compute-1 sshd-session[220779]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:01:41 compute-1 sudo[220783]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 cat /tmp/tmp.WXzXLVKhJr
Jan 26 09:01:41 compute-1 sudo[220783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:01:41 compute-1 sudo[220783]: pam_unix(sudo:session): session closed for user root
Jan 26 09:01:41 compute-1 sshd-session[220809]: Invalid user solv from 2.57.122.238 port 34864
Jan 26 09:01:41 compute-1 sshd-session[220809]: Connection closed by invalid user solv 2.57.122.238 port 34864 [preauth]
Jan 26 09:01:42 compute-1 nova_compute[183083]: 2026-01-26 09:01:42.584 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:01:43 compute-1 nova_compute[183083]: 2026-01-26 09:01:43.451 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:01:45 compute-1 ovn_controller[95352]: 2026-01-26T09:01:45Z|00257|memory_trim|INFO|Detected inactivity (last active 30045 ms ago): trimming memory
Jan 26 09:01:45 compute-1 podman[220811]: 2026-01-26 09:01:45.839580611 +0000 UTC m=+0.067559891 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 09:01:45 compute-1 podman[220812]: 2026-01-26 09:01:45.863044824 +0000 UTC m=+0.081228877 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, build-date=2025-08-20T13:12:41, vcs-type=git, config_id=openstack_network_exporter)
Jan 26 09:01:46 compute-1 sshd-session[220850]: Accepted publickey for zuul from 38.102.83.66 port 36824 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:01:46 compute-1 systemd-logind[788]: New session 57 of user zuul.
Jan 26 09:01:46 compute-1 systemd[1]: Started Session 57 of User zuul.
Jan 26 09:01:46 compute-1 sshd-session[220850]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:01:46 compute-1 sudo[220854]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 rm -f /tmp/tmp.WXzXLVKhJr
Jan 26 09:01:46 compute-1 sudo[220854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:01:46 compute-1 sudo[220854]: pam_unix(sudo:session): session closed for user root
Jan 26 09:01:47 compute-1 sshd-session[220853]: Connection closed by 38.102.83.66 port 36824
Jan 26 09:01:47 compute-1 sshd-session[220850]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:01:47 compute-1 systemd[1]: session-57.scope: Deactivated successfully.
Jan 26 09:01:47 compute-1 systemd-logind[788]: Session 57 logged out. Waiting for processes to exit.
Jan 26 09:01:47 compute-1 systemd-logind[788]: Removed session 57.
Jan 26 09:01:47 compute-1 nova_compute[183083]: 2026-01-26 09:01:47.589 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:01:48 compute-1 nova_compute[183083]: 2026-01-26 09:01:48.499 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:01:48 compute-1 podman[220882]: 2026-01-26 09:01:48.830528369 +0000 UTC m=+0.081050012 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 09:01:48 compute-1 podman[220881]: 2026-01-26 09:01:48.847522609 +0000 UTC m=+0.100857171 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 26 09:01:48 compute-1 podman[220880]: 2026-01-26 09:01:48.867659488 +0000 UTC m=+0.129647135 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 09:01:49 compute-1 sudo[219716]: pam_unix(sudo:session): session closed for user root
Jan 26 09:01:52 compute-1 nova_compute[183083]: 2026-01-26 09:01:52.592 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:01:53 compute-1 nova_compute[183083]: 2026-01-26 09:01:53.509 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:01:53 compute-1 sshd-session[220944]: Accepted publickey for zuul from 38.102.83.66 port 57206 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:01:53 compute-1 systemd-logind[788]: New session 58 of user zuul.
Jan 26 09:01:53 compute-1 systemd[1]: Started Session 58 of User zuul.
Jan 26 09:01:53 compute-1 sshd-session[220944]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:01:54 compute-1 sudo[220948]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 rm -f /tmp/tmp.JD3Z5xdJwe
Jan 26 09:01:54 compute-1 sudo[220948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:01:54 compute-1 sudo[220948]: pam_unix(sudo:session): session closed for user root
Jan 26 09:01:54 compute-1 sshd-session[220947]: Connection closed by 38.102.83.66 port 57206
Jan 26 09:01:54 compute-1 sshd-session[220944]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:01:54 compute-1 systemd-logind[788]: Session 58 logged out. Waiting for processes to exit.
Jan 26 09:01:54 compute-1 systemd[1]: session-58.scope: Deactivated successfully.
Jan 26 09:01:54 compute-1 systemd-logind[788]: Removed session 58.
Jan 26 09:01:57 compute-1 nova_compute[183083]: 2026-01-26 09:01:57.594 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:01:58 compute-1 nova_compute[183083]: 2026-01-26 09:01:58.555 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:00 compute-1 nova_compute[183083]: 2026-01-26 09:02:00.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:02:00 compute-1 nova_compute[183083]: 2026-01-26 09:02:00.953 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 26 09:02:01 compute-1 nova_compute[183083]: 2026-01-26 09:02:01.016 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 26 09:02:01 compute-1 sshd-session[220975]: Accepted publickey for zuul from 38.102.83.66 port 57208 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:02:01 compute-1 systemd-logind[788]: New session 59 of user zuul.
Jan 26 09:02:01 compute-1 systemd[1]: Started Session 59 of User zuul.
Jan 26 09:02:01 compute-1 sshd-session[220975]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:02:01 compute-1 sudo[220979]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 rm -f /tmp/tmp.X9ASNz5jXN
Jan 26 09:02:01 compute-1 sudo[220979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:02:01 compute-1 sudo[220979]: pam_unix(sudo:session): session closed for user root
Jan 26 09:02:02 compute-1 podman[221003]: 2026-01-26 09:02:02.003161454 +0000 UTC m=+0.095099095 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 09:02:02 compute-1 sshd-session[220978]: Connection closed by 38.102.83.66 port 57208
Jan 26 09:02:02 compute-1 sshd-session[220975]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:02:02 compute-1 systemd[1]: session-59.scope: Deactivated successfully.
Jan 26 09:02:02 compute-1 systemd-logind[788]: Session 59 logged out. Waiting for processes to exit.
Jan 26 09:02:02 compute-1 systemd-logind[788]: Removed session 59.
Jan 26 09:02:02 compute-1 nova_compute[183083]: 2026-01-26 09:02:02.596 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:03 compute-1 nova_compute[183083]: 2026-01-26 09:02:03.558 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.753 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'name': 'tempest-server-test-78344759', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000002e', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '2580bb16c90849c4b5919eb271774a06', 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'hostId': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.755 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.779 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.780 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f38ff21d-eb2f-4fdd-b3a8-fecf8bd50345', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'a89a5221-3253-49f6-b902-67f973b0690e-vda', 'timestamp': '2026-01-26T09:02:03.755416', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aec68564-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4464.41744, 'message_signature': '7eb290903b47ab5cfd10a542439e3ea1474d86815cb9b692cd096e220e779e79'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'a89a5221-3253-49f6-b902-67f973b0690e-sda', 'timestamp': '2026-01-26T09:02:03.755416', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aec69e00-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4464.41744, 'message_signature': '796a6e5ced4b907eb791077c8e806573a9777a7e43aea98879cb0428b6b0eb23'}]}, 'timestamp': '2026-01-26 09:02:03.781177', '_unique_id': '9c289a00665c4ed9bacc491f38ca59b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.783 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.785 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.790 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/network.incoming.bytes.delta volume: 3849 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '11fe5679-9887-456c-bea5-0fa4ee6921f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 3849, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-0000002e-a89a5221-3253-49f6-b902-67f973b0690e-tape235b615-3a', 'timestamp': '2026-01-26T09:02:03.785531', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'tape235b615-3a', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:4e:56', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape235b615-3a'}, 'message_id': 'aec83170-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4464.447591931, 'message_signature': 'da7c5cf136723dd4a284aa44501d7c419161a74398f2e744f383e07a7be552d9'}]}, 'timestamp': '2026-01-26 09:02:03.791440', '_unique_id': 'b3404fb2e96d4d8fb6973db25fbc9f81'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.792 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.793 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.825 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.read.latency volume: 281128723 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.826 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.read.latency volume: 21553119 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d4b6e2d-84a9-4e71-a193-72d1389663cd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 281128723, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'a89a5221-3253-49f6-b902-67f973b0690e-vda', 'timestamp': '2026-01-26T09:02:03.793674', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aecd7c0c-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4464.45569378, 'message_signature': '913024491ae6592650bc9a8e62b01f268d279eb248f8d4b664a230aeab80cffd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21553119, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'a89a5221-3253-49f6-b902-67f973b0690e-sda', 'timestamp': '2026-01-26T09:02:03.793674', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aecd8a12-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4464.45569378, 'message_signature': '78946bdfd694e12e294a6c035f0091a95181be5263bb14b4b9466e43fb8f502b'}]}, 'timestamp': '2026-01-26 09:02:03.826372', '_unique_id': '625d9ca735164e7f98f70c549bf8b0b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.827 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.828 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.828 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/network.incoming.packets volume: 45 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8ea9b19-ae5e-424e-a7e6-dfc29b028045', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 45, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-0000002e-a89a5221-3253-49f6-b902-67f973b0690e-tape235b615-3a', 'timestamp': '2026-01-26T09:02:03.828615', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'tape235b615-3a', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:4e:56', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape235b615-3a'}, 'message_id': 'aecdee62-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4464.447591931, 'message_signature': 'b98e0eb9de2e73588c540bc8be939079185e186d97d8a9b92239a84590fe5bc4'}]}, 'timestamp': '2026-01-26 09:02:03.828968', '_unique_id': 'ede5130a26454623ba599c8b888a637f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.829 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.830 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.849 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/cpu volume: 10790000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '829bca3b-b727-4c14-8482-879e4b22b83b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10790000000, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'timestamp': '2026-01-26T09:02:03.831013', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'aed13a18-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4464.511546997, 'message_signature': '34992a2f4cb48ada39292c0770ebfb42216946fddab9bc9888090360c9d3691e'}]}, 'timestamp': '2026-01-26 09:02:03.850679', '_unique_id': '47170ca207b4456b924624c353950ee3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.852 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/network.outgoing.bytes volume: 5408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9e399fad-f170-40d2-a6b3-aefbca5bd21f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5408, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-0000002e-a89a5221-3253-49f6-b902-67f973b0690e-tape235b615-3a', 'timestamp': '2026-01-26T09:02:03.852924', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'tape235b615-3a', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:4e:56', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape235b615-3a'}, 'message_id': 'aed1a264-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4464.447591931, 'message_signature': '56dbfa44b351f57f73c27cc225994ed2a643383424cf50fb31f06c7c1968dfc8'}]}, 'timestamp': '2026-01-26 09:02:03.853249', '_unique_id': '323774f8243344eba4c4622c313c049a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.853 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.854 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.854 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.read.bytes volume: 32266240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.854 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f13d74b6-903b-4ba5-b3b3-2e52e41307ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 32266240, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'a89a5221-3253-49f6-b902-67f973b0690e-vda', 'timestamp': '2026-01-26T09:02:03.854560', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aed1e314-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4464.45569378, 'message_signature': '7a546ddd2f2377adc7d4392301d38c58efb972ad967de8b6bdf6912e6001d558'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'a89a5221-3253-49f6-b902-67f973b0690e-sda', 'timestamp': '2026-01-26T09:02:03.854560', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aed1eed6-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4464.45569378, 'message_signature': '4ef54b3364c8d338924c860482294c4a08436a9071f15fbbdcc6010498e4c013'}]}, 'timestamp': '2026-01-26 09:02:03.855179', '_unique_id': 'bfb1a943fdbd4ba090adb1374f7a9780'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.855 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.856 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.856 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.856 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/network.outgoing.bytes.delta volume: 3928 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '07029623-c267-4ab7-97ed-7a32b3003940', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 3928, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-0000002e-a89a5221-3253-49f6-b902-67f973b0690e-tape235b615-3a', 'timestamp': '2026-01-26T09:02:03.856563', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'tape235b615-3a', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:4e:56', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape235b615-3a'}, 'message_id': 'aed23008-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4464.447591931, 'message_signature': '55cdb1f1b8e53879057c30b62c872dd929a98b011e48fdc9c914ae69a0699f03'}]}, 'timestamp': '2026-01-26 09:02:03.856809', '_unique_id': '2de58476a4ae451681da690ebbb387bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.857 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.858 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.858 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.write.requests volume: 38 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.858 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e47eb309-a50f-46df-95cf-8ff9d01c1905', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 38, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'a89a5221-3253-49f6-b902-67f973b0690e-vda', 'timestamp': '2026-01-26T09:02:03.858346', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aed276c6-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4464.45569378, 'message_signature': 'c1d6de01e02c785f4d4424b665f22628677d2f92a4d4c7845ebf51f74d517ce8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'a89a5221-3253-49f6-b902-67f973b0690e-sda', 'timestamp': '2026-01-26T09:02:03.858346', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aed28198-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4464.45569378, 'message_signature': '819340be306d386304e623670c07351e9e8febb146d7b4c7beb40e9a50a485fa'}]}, 'timestamp': '2026-01-26 09:02:03.858902', '_unique_id': '5d2e5b0938134842a4a09c927c8175d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.859 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cba778eb-131a-42aa-ad61-3c48bcba1241', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-0000002e-a89a5221-3253-49f6-b902-67f973b0690e-tape235b615-3a', 'timestamp': '2026-01-26T09:02:03.860119', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'tape235b615-3a', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:4e:56', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape235b615-3a'}, 'message_id': 'aed2bae6-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4464.447591931, 'message_signature': 'eca966c37abc55adc1dec3d62f607f3baa4672e0d6314f591b6094746545ca97'}]}, 'timestamp': '2026-01-26 09:02:03.860366', '_unique_id': '15da137e8b774835ba0213a75ac69f6c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.860 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.861 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.861 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/network.incoming.bytes volume: 5407 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6a957b51-a69d-49c4-b64b-27df6d9566eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5407, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-0000002e-a89a5221-3253-49f6-b902-67f973b0690e-tape235b615-3a', 'timestamp': '2026-01-26T09:02:03.861505', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'tape235b615-3a', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:4e:56', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape235b615-3a'}, 'message_id': 'aed2f0ce-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4464.447591931, 'message_signature': '691aacf97499233516499c392f05ca813a2da8b4703f7206f232928dddb29ae6'}]}, 'timestamp': '2026-01-26 09:02:03.861736', '_unique_id': '59397117f24c4588b7fce109e84d00b0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.862 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '817c86e6-2a81-4c5f-a836-6a534e0f57d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-0000002e-a89a5221-3253-49f6-b902-67f973b0690e-tape235b615-3a', 'timestamp': '2026-01-26T09:02:03.862855', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'tape235b615-3a', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:4e:56', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape235b615-3a'}, 'message_id': 'aed3263e-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4464.447591931, 'message_signature': '9b27421c8a382c93067131a97785cdf0d299b9c57a5b012cc55fd9d5b8eb1ff0'}]}, 'timestamp': '2026-01-26 09:02:03.863124', '_unique_id': '91d9d7294af240f4b06d94b242cc2ef8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.863 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.864 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.864 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eae5bb55-08ce-4d44-881a-42542bc55440', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-0000002e-a89a5221-3253-49f6-b902-67f973b0690e-tape235b615-3a', 'timestamp': '2026-01-26T09:02:03.864480', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'tape235b615-3a', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:4e:56', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape235b615-3a'}, 'message_id': 'aed365e0-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4464.447591931, 'message_signature': '322fa46cecafe98b0bbd74099c71fd9c8231f02b4b508416546a971c6512ee2d'}]}, 'timestamp': '2026-01-26 09:02:03.864771', '_unique_id': 'c069c713d92343e883b23955e3e7232b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.865 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ec9c9aa2-16c0-4678-99b2-b8dc93ae27c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-0000002e-a89a5221-3253-49f6-b902-67f973b0690e-tape235b615-3a', 'timestamp': '2026-01-26T09:02:03.865919', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'tape235b615-3a', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:4e:56', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape235b615-3a'}, 'message_id': 'aed39d12-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4464.447591931, 'message_signature': 'bbd69e89130ccbdc08b5c8cab31b1469c755a4f23d13d71031c2ef1c533f2c04'}]}, 'timestamp': '2026-01-26 09:02:03.866167', '_unique_id': '447d7fcd970544fd88beb81b12153a76'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.866 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.867 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.867 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.read.requests volume: 1254 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.867 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee70c32d-3329-40d0-b77a-3f2f6b67510d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1254, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'a89a5221-3253-49f6-b902-67f973b0690e-vda', 'timestamp': '2026-01-26T09:02:03.867252', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aed3d110-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4464.45569378, 'message_signature': 'c6c210b3642056811ccde0746073184cd3227414f72b3a6d807d533c6d24e1ca'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'a89a5221-3253-49f6-b902-67f973b0690e-sda', 'timestamp': '2026-01-26T09:02:03.867252', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aed3d8d6-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4464.45569378, 'message_signature': '7ec4e6da0d531bde688e7b1c2e1ef7cc61ecdb1344485eb9dd9e56b1795851eb'}]}, 'timestamp': '2026-01-26 09:02:03.867685', '_unique_id': 'fb87f14e45a247c98f0c2715de6e675d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.868 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.allocation volume: 31072256 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8df388a5-56a8-4f6b-9239-42562c30e4e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31072256, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'a89a5221-3253-49f6-b902-67f973b0690e-vda', 'timestamp': '2026-01-26T09:02:03.868927', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aed41292-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4464.41744, 'message_signature': '84a06eef55912f27f6014cdd1d4ffb1a21a7cb9b42f1d757137b67bfc9498d5a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'a89a5221-3253-49f6-b902-67f973b0690e-sda', 'timestamp': '2026-01-26T09:02:03.868927', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aed41ba2-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4464.41744, 'message_signature': '3f634c9ba2a2065e5d976f7634f3fe172a3f0a7bbbcd97910dafcab7760ec16f'}]}, 'timestamp': '2026-01-26 09:02:03.869407', '_unique_id': 'e190638000994f6da9b9d091ae848779'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.869 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.870 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.870 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/network.outgoing.packets volume: 42 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f3a4edf-f1d9-48f8-8648-8eec96048b9b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 42, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-0000002e-a89a5221-3253-49f6-b902-67f973b0690e-tape235b615-3a', 'timestamp': '2026-01-26T09:02:03.870646', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'tape235b615-3a', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:4e:56', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape235b615-3a'}, 'message_id': 'aed455f4-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4464.447591931, 'message_signature': '6518f9d3629b32550d4645669cd288612a04bcfc45bb4acffcf1ff75608fee2d'}]}, 'timestamp': '2026-01-26 09:02:03.870936', '_unique_id': '78045c21c53f41e496d8f9dfbd1affb4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.871 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.872 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.872 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.write.bytes volume: 352256 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.872 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad016c82-fc8e-4eb2-8fa2-b1f7c5a624ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 352256, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'a89a5221-3253-49f6-b902-67f973b0690e-vda', 'timestamp': '2026-01-26T09:02:03.872174', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aed49172-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4464.45569378, 'message_signature': '4243ead1cde46886d1d1cbc0d15fbaabb5edea4354dd319eb48c18a895872295'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'a89a5221-3253-49f6-b902-67f973b0690e-sda', 'timestamp': '2026-01-26T09:02:03.872174', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aed49bcc-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4464.45569378, 'message_signature': 'f8d00019edccc5bc842a92ceae905ff56f0587be75f02bd48b08180d58fcf03b'}]}, 'timestamp': '2026-01-26 09:02:03.872657', '_unique_id': 'be15130f702f4367b03eed5658a46ede'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.873 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/memory.usage volume: 42.265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bb31db72-25df-4565-bbaf-1258b1e58508', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.265625, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'timestamp': '2026-01-26T09:02:03.873926', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'aed4d5e2-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4464.511546997, 'message_signature': 'a8e791703b57525547f20aa0bc8a47bb8ebcf7ecd49bd133d8e2d32c89b9d19b'}]}, 'timestamp': '2026-01-26 09:02:03.874168', '_unique_id': '2dd258e333cb4c5ab38f790e9cb59e8f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.874 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.875 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.875 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.usage volume: 30212096 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.875 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5625a2fa-1d6f-4472-bcd1-bf2359a36433', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30212096, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'a89a5221-3253-49f6-b902-67f973b0690e-vda', 'timestamp': '2026-01-26T09:02:03.875549', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aed51642-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4464.41744, 'message_signature': '0a9a892d42fbca9c256395ad5981994a9b3333611b3095ffbaf0dc5d97f0f52e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 
'a89a5221-3253-49f6-b902-67f973b0690e-sda', 'timestamp': '2026-01-26T09:02:03.875549', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aed51e44-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4464.41744, 'message_signature': 'bc16591cb44a3285930d37c841a1206c83196614fbbccba097b9b7f46b2ba56a'}]}, 'timestamp': '2026-01-26 09:02:03.875992', '_unique_id': '1cd125c3697f405b975e46138d7e2729'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.876 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.877 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.877 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.877 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.write.latency volume: 28838629 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.877 12 DEBUG ceilometer.compute.pollsters [-] a89a5221-3253-49f6-b902-67f973b0690e/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eee9086d-75e9-4b9a-9610-b30ba58048ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28838629, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'a89a5221-3253-49f6-b902-67f973b0690e-vda', 'timestamp': '2026-01-26T09:02:03.877206', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aed55602-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4464.45569378, 'message_signature': '6bf08ba54641d4cf324da6074b1685d74f093c73a2ebfb1dbf38489a44cc1f28'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 
'a89a5221-3253-49f6-b902-67f973b0690e-sda', 'timestamp': '2026-01-26T09:02:03.877206', 'resource_metadata': {'display_name': 'tempest-server-test-78344759', 'name': 'instance-0000002e', 'instance_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aed55e2c-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4464.45569378, 'message_signature': '0921872cf4d8deacc74cab57e7216acd0957b932328e6e4ff383fd9781b83f87'}]}, 'timestamp': '2026-01-26 09:02:03.877629', '_unique_id': 'cbe84dc639dd454391a14ffff8a12de4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:02:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:02:03.878 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:02:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:02:05.316 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:02:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:02:05.318 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:02:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:02:05.319 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:02:07 compute-1 nova_compute[183083]: 2026-01-26 09:02:07.597 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:08 compute-1 nova_compute[183083]: 2026-01-26 09:02:08.561 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:08 compute-1 sshd-session[221035]: Accepted publickey for zuul from 38.102.83.66 port 56204 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:02:08 compute-1 systemd-logind[788]: New session 60 of user zuul.
Jan 26 09:02:08 compute-1 systemd[1]: Started Session 60 of User zuul.
Jan 26 09:02:08 compute-1 sshd-session[221035]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:02:09 compute-1 sudo[221039]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 rm -f /tmp/tmp.2f1GEFjLA7
Jan 26 09:02:09 compute-1 sudo[221039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:02:09 compute-1 sudo[221039]: pam_unix(sudo:session): session closed for user root
Jan 26 09:02:09 compute-1 sshd-session[221038]: Connection closed by 38.102.83.66 port 56204
Jan 26 09:02:09 compute-1 sshd-session[221035]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:02:09 compute-1 systemd[1]: session-60.scope: Deactivated successfully.
Jan 26 09:02:09 compute-1 systemd-logind[788]: Session 60 logged out. Waiting for processes to exit.
Jan 26 09:02:09 compute-1 systemd-logind[788]: Removed session 60.
Jan 26 09:02:10 compute-1 nova_compute[183083]: 2026-01-26 09:02:10.015 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:02:10 compute-1 nova_compute[183083]: 2026-01-26 09:02:10.016 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:02:10 compute-1 nova_compute[183083]: 2026-01-26 09:02:10.016 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:02:10 compute-1 nova_compute[183083]: 2026-01-26 09:02:10.711 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "refresh_cache-a89a5221-3253-49f6-b902-67f973b0690e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:02:10 compute-1 nova_compute[183083]: 2026-01-26 09:02:10.712 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquired lock "refresh_cache-a89a5221-3253-49f6-b902-67f973b0690e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:02:10 compute-1 nova_compute[183083]: 2026-01-26 09:02:10.712 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 26 09:02:10 compute-1 nova_compute[183083]: 2026-01-26 09:02:10.712 183087 DEBUG nova.objects.instance [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lazy-loading 'info_cache' on Instance uuid a89a5221-3253-49f6-b902-67f973b0690e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 09:02:12 compute-1 nova_compute[183083]: 2026-01-26 09:02:12.407 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Updating instance_info_cache with network_info: [{"id": "e235b615-3ab0-49d4-9c0d-a4d905192bd6", "address": "fa:16:3e:f0:4e:56", "network": {"id": "902c250a-6b5f-40de-85f8-6172556f9918", "bridge": "br-int", "label": "tempest-test-network--2054667150", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape235b615-3a", "ovs_interfaceid": "e235b615-3ab0-49d4-9c0d-a4d905192bd6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:02:12 compute-1 nova_compute[183083]: 2026-01-26 09:02:12.534 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Releasing lock "refresh_cache-a89a5221-3253-49f6-b902-67f973b0690e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:02:12 compute-1 nova_compute[183083]: 2026-01-26 09:02:12.535 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 26 09:02:12 compute-1 nova_compute[183083]: 2026-01-26 09:02:12.535 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:02:12 compute-1 nova_compute[183083]: 2026-01-26 09:02:12.599 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:13 compute-1 nova_compute[183083]: 2026-01-26 09:02:13.565 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:14 compute-1 nova_compute[183083]: 2026-01-26 09:02:14.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:02:14 compute-1 nova_compute[183083]: 2026-01-26 09:02:14.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:02:15 compute-1 nova_compute[183083]: 2026-01-26 09:02:15.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:02:16 compute-1 ovn_controller[95352]: 2026-01-26T09:02:16Z|00258|pinctrl|WARN|Dropped 155 log messages in last 61 seconds (most recently, 17 seconds ago) due to excessive rate
Jan 26 09:02:16 compute-1 ovn_controller[95352]: 2026-01-26T09:02:16Z|00259|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:02:16 compute-1 sshd-session[221065]: Accepted publickey for zuul from 38.102.83.66 port 39360 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:02:16 compute-1 systemd-logind[788]: New session 61 of user zuul.
Jan 26 09:02:16 compute-1 systemd[1]: Started Session 61 of User zuul.
Jan 26 09:02:16 compute-1 sshd-session[221065]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:02:16 compute-1 podman[221069]: 2026-01-26 09:02:16.473993921 +0000 UTC m=+0.075811951 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.expose-services=, vcs-type=git, maintainer=Red Hat, Inc., version=9.6, architecture=x86_64, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, release=1755695350, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 
'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41)
Jan 26 09:02:16 compute-1 podman[221067]: 2026-01-26 09:02:16.51222754 +0000 UTC m=+0.114039880 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, 
container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 09:02:16 compute-1 sudo[221109]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 rm -f /tmp/tmp.CpSDd7oLQ5
Jan 26 09:02:16 compute-1 sudo[221109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:02:16 compute-1 sudo[221109]: pam_unix(sudo:session): session closed for user root
Jan 26 09:02:16 compute-1 sshd-session[221090]: Connection closed by 38.102.83.66 port 39360
Jan 26 09:02:16 compute-1 sshd-session[221065]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:02:16 compute-1 systemd[1]: session-61.scope: Deactivated successfully.
Jan 26 09:02:16 compute-1 systemd-logind[788]: Session 61 logged out. Waiting for processes to exit.
Jan 26 09:02:16 compute-1 systemd-logind[788]: Removed session 61.
Jan 26 09:02:16 compute-1 nova_compute[183083]: 2026-01-26 09:02:16.946 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:02:16 compute-1 nova_compute[183083]: 2026-01-26 09:02:16.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:02:17 compute-1 nova_compute[183083]: 2026-01-26 09:02:17.600 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:17 compute-1 nova_compute[183083]: 2026-01-26 09:02:17.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:02:17 compute-1 nova_compute[183083]: 2026-01-26 09:02:17.951 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:02:18 compute-1 nova_compute[183083]: 2026-01-26 09:02:18.568 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:18 compute-1 nova_compute[183083]: 2026-01-26 09:02:18.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:02:18 compute-1 nova_compute[183083]: 2026-01-26 09:02:18.951 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.369 183087 DEBUG oslo_concurrency.lockutils [None req-969c960e-1faf-4a87-a4c0-db3a1de390c6 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "a89a5221-3253-49f6-b902-67f973b0690e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.369 183087 DEBUG oslo_concurrency.lockutils [None req-969c960e-1faf-4a87-a4c0-db3a1de390c6 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "a89a5221-3253-49f6-b902-67f973b0690e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.370 183087 DEBUG oslo_concurrency.lockutils [None req-969c960e-1faf-4a87-a4c0-db3a1de390c6 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "a89a5221-3253-49f6-b902-67f973b0690e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.371 183087 DEBUG oslo_concurrency.lockutils [None req-969c960e-1faf-4a87-a4c0-db3a1de390c6 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "a89a5221-3253-49f6-b902-67f973b0690e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.371 183087 DEBUG oslo_concurrency.lockutils [None req-969c960e-1faf-4a87-a4c0-db3a1de390c6 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "a89a5221-3253-49f6-b902-67f973b0690e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.373 183087 INFO nova.compute.manager [None req-969c960e-1faf-4a87-a4c0-db3a1de390c6 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Terminating instance
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.374 183087 DEBUG nova.compute.manager [None req-969c960e-1faf-4a87-a4c0-db3a1de390c6 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 09:02:19 compute-1 kernel: tape235b615-3a (unregistering): left promiscuous mode
Jan 26 09:02:19 compute-1 NetworkManager[55451]: <info>  [1769418139.4047] device (tape235b615-3a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 09:02:19 compute-1 ovn_controller[95352]: 2026-01-26T09:02:19Z|00260|binding|INFO|Releasing lport e235b615-3ab0-49d4-9c0d-a4d905192bd6 from this chassis (sb_readonly=0)
Jan 26 09:02:19 compute-1 ovn_controller[95352]: 2026-01-26T09:02:19Z|00261|binding|INFO|Setting lport e235b615-3ab0-49d4-9c0d-a4d905192bd6 down in Southbound
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.415 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:19 compute-1 ovn_controller[95352]: 2026-01-26T09:02:19Z|00262|binding|INFO|Removing iface tape235b615-3a ovn-installed in OVS
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.417 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:19 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:02:19.430 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:4e:56 10.100.0.5'], port_security=['fa:16:3e:f0:4e:56 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'a89a5221-3253-49f6-b902-67f973b0690e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-902c250a-6b5f-40de-85f8-6172556f9918', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2580bb16c90849c4b5919eb271774a06', 'neutron:revision_number': '6', 'neutron:security_group_ids': '00734c5e-2a15-43b1-a106-6b4708879098', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.184', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9161e928-a360-45a0-86d0-eb6f299d1fc7, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=e235b615-3ab0-49d4-9c0d-a4d905192bd6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:02:19 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:02:19.432 104632 INFO neutron.agent.ovn.metadata.agent [-] Port e235b615-3ab0-49d4-9c0d-a4d905192bd6 in datapath 902c250a-6b5f-40de-85f8-6172556f9918 unbound from our chassis
Jan 26 09:02:19 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:02:19.433 104632 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 902c250a-6b5f-40de-85f8-6172556f9918, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 26 09:02:19 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:02:19.434 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[eaab4a37-e819-4b2a-b6b7-e8473b034d7d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:02:19 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:02:19.435 104632 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918 namespace which is not needed anymore
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.449 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:19 compute-1 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000002e.scope: Deactivated successfully.
Jan 26 09:02:19 compute-1 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000002e.scope: Consumed 14.253s CPU time.
Jan 26 09:02:19 compute-1 systemd-machined[154360]: Machine qemu-15-instance-0000002e terminated.
Jan 26 09:02:19 compute-1 podman[221140]: 2026-01-26 09:02:19.511557368 +0000 UTC m=+0.065914111 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 09:02:19 compute-1 podman[221139]: 2026-01-26 09:02:19.523181257 +0000 UTC m=+0.081055770 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 26 09:02:19 compute-1 neutron-haproxy-ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918[220600]: [NOTICE]   (220604) : haproxy version is 2.8.14-c23fe91
Jan 26 09:02:19 compute-1 neutron-haproxy-ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918[220600]: [NOTICE]   (220604) : path to executable is /usr/sbin/haproxy
Jan 26 09:02:19 compute-1 neutron-haproxy-ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918[220600]: [WARNING]  (220604) : Exiting Master process...
Jan 26 09:02:19 compute-1 neutron-haproxy-ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918[220600]: [ALERT]    (220604) : Current worker (220606) exited with code 143 (Terminated)
Jan 26 09:02:19 compute-1 neutron-haproxy-ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918[220600]: [WARNING]  (220604) : All workers exited. Exiting... (0)
Jan 26 09:02:19 compute-1 systemd[1]: libpod-d56c30bb4fc67cd62a02730b60510b43e5aa6b9fe5143da594e0c245818736df.scope: Deactivated successfully.
Jan 26 09:02:19 compute-1 podman[221219]: 2026-01-26 09:02:19.594899861 +0000 UTC m=+0.050092955 container died d56c30bb4fc67cd62a02730b60510b43e5aa6b9fe5143da594e0c245818736df (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 09:02:19 compute-1 podman[221138]: 2026-01-26 09:02:19.626982527 +0000 UTC m=+0.192153695 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 26 09:02:19 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d56c30bb4fc67cd62a02730b60510b43e5aa6b9fe5143da594e0c245818736df-userdata-shm.mount: Deactivated successfully.
Jan 26 09:02:19 compute-1 systemd[1]: var-lib-containers-storage-overlay-4ee6cca06c9bf9711ab914ab0a71d4124978afcdc358341a3a61f7ea1e2d475e-merged.mount: Deactivated successfully.
Jan 26 09:02:19 compute-1 podman[221219]: 2026-01-26 09:02:19.638944855 +0000 UTC m=+0.094137979 container cleanup d56c30bb4fc67cd62a02730b60510b43e5aa6b9fe5143da594e0c245818736df (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.642 183087 INFO nova.virt.libvirt.driver [-] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Instance destroyed successfully.
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.643 183087 DEBUG nova.objects.instance [None req-969c960e-1faf-4a87-a4c0-db3a1de390c6 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lazy-loading 'resources' on Instance uuid a89a5221-3253-49f6-b902-67f973b0690e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 09:02:19 compute-1 systemd[1]: libpod-conmon-d56c30bb4fc67cd62a02730b60510b43e5aa6b9fe5143da594e0c245818736df.scope: Deactivated successfully.
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.659 183087 DEBUG nova.virt.libvirt.vif [None req-969c960e-1faf-4a87-a4c0-db3a1de390c6 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T08:59:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-server-test-78344759',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-78344759',id=46,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBCdzwlLhB6FbGto6B+9xUIGqDD4Nkmb9VT0Y4hVPwUFt+Pnvt+qUg7O8mk7/7/EcjP1qy2gKBsKzD7Rm3HgaxefrKiDsUNb9XIKWFAVeuw+MyxY4GcVzoBujypjmrmUqA==',key_name='tempest-keypair-1601993815',keypairs=<?>,launch_index=0,launched_at=2026-01-26T08:59:44Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2580bb16c90849c4b5919eb271774a06',ramdisk_id='',reservation_id='r-008vt5ta',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_
machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-OvnDvrTest-691788706',owner_user_name='tempest-OvnDvrTest-691788706-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T09:01:16Z,user_data=None,user_id='90104736f4ab4d81b09d1ff11e40f454',uuid=a89a5221-3253-49f6-b902-67f973b0690e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e235b615-3ab0-49d4-9c0d-a4d905192bd6", "address": "fa:16:3e:f0:4e:56", "network": {"id": "902c250a-6b5f-40de-85f8-6172556f9918", "bridge": "br-int", "label": "tempest-test-network--2054667150", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape235b615-3a", "ovs_interfaceid": "e235b615-3ab0-49d4-9c0d-a4d905192bd6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.659 183087 DEBUG nova.network.os_vif_util [None req-969c960e-1faf-4a87-a4c0-db3a1de390c6 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Converting VIF {"id": "e235b615-3ab0-49d4-9c0d-a4d905192bd6", "address": "fa:16:3e:f0:4e:56", "network": {"id": "902c250a-6b5f-40de-85f8-6172556f9918", "bridge": "br-int", "label": "tempest-test-network--2054667150", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape235b615-3a", "ovs_interfaceid": "e235b615-3ab0-49d4-9c0d-a4d905192bd6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.660 183087 DEBUG nova.network.os_vif_util [None req-969c960e-1faf-4a87-a4c0-db3a1de390c6 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f0:4e:56,bridge_name='br-int',has_traffic_filtering=True,id=e235b615-3ab0-49d4-9c0d-a4d905192bd6,network=Network(902c250a-6b5f-40de-85f8-6172556f9918),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape235b615-3a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.660 183087 DEBUG os_vif [None req-969c960e-1faf-4a87-a4c0-db3a1de390c6 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f0:4e:56,bridge_name='br-int',has_traffic_filtering=True,id=e235b615-3ab0-49d4-9c0d-a4d905192bd6,network=Network(902c250a-6b5f-40de-85f8-6172556f9918),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape235b615-3a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.662 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.663 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape235b615-3a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.667 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.669 183087 INFO os_vif [None req-969c960e-1faf-4a87-a4c0-db3a1de390c6 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f0:4e:56,bridge_name='br-int',has_traffic_filtering=True,id=e235b615-3ab0-49d4-9c0d-a4d905192bd6,network=Network(902c250a-6b5f-40de-85f8-6172556f9918),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape235b615-3a')
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.670 183087 INFO nova.virt.libvirt.driver [None req-969c960e-1faf-4a87-a4c0-db3a1de390c6 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Deleting instance files /var/lib/nova/instances/a89a5221-3253-49f6-b902-67f973b0690e_del
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.671 183087 INFO nova.virt.libvirt.driver [None req-969c960e-1faf-4a87-a4c0-db3a1de390c6 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Deletion of /var/lib/nova/instances/a89a5221-3253-49f6-b902-67f973b0690e_del complete
Jan 26 09:02:19 compute-1 podman[221265]: 2026-01-26 09:02:19.696736426 +0000 UTC m=+0.035457792 container remove d56c30bb4fc67cd62a02730b60510b43e5aa6b9fe5143da594e0c245818736df (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 26 09:02:19 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:02:19.701 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[239fd50f-39eb-429f-bd03-37ea7feff986]: (4, ('Mon Jan 26 09:02:19 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918 (d56c30bb4fc67cd62a02730b60510b43e5aa6b9fe5143da594e0c245818736df)\nd56c30bb4fc67cd62a02730b60510b43e5aa6b9fe5143da594e0c245818736df\nMon Jan 26 09:02:19 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918 (d56c30bb4fc67cd62a02730b60510b43e5aa6b9fe5143da594e0c245818736df)\nd56c30bb4fc67cd62a02730b60510b43e5aa6b9fe5143da594e0c245818736df\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:02:19 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:02:19.703 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[67138071-117c-4d60-b611-05c55c9c1699]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:02:19 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:02:19.704 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap902c250a-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.705 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:19 compute-1 kernel: tap902c250a-60: left promiscuous mode
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.707 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:19 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:02:19.709 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[25a41daf-e1fb-438f-b4b1-2bf93ddd4576]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.718 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:19 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:02:19.727 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[56f8c311-416c-4dae-8d09-d3d347d92a3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:02:19 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:02:19.728 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[c0111b91-f486-4786-841c-c253a2b885b5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.729 183087 INFO nova.compute.manager [None req-969c960e-1faf-4a87-a4c0-db3a1de390c6 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Took 0.35 seconds to destroy the instance on the hypervisor.
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.730 183087 DEBUG oslo.service.loopingcall [None req-969c960e-1faf-4a87-a4c0-db3a1de390c6 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.730 183087 DEBUG nova.compute.manager [-] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.731 183087 DEBUG nova.network.neutron [-] [instance: a89a5221-3253-49f6-b902-67f973b0690e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.739 183087 DEBUG nova.compute.manager [req-61bd9c44-9c76-4e0d-bf00-daf4ae6a3abc req-2f9eb330-74d1-4382-9b63-8fe4b7932cf6 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Received event network-vif-unplugged-e235b615-3ab0-49d4-9c0d-a4d905192bd6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.740 183087 DEBUG oslo_concurrency.lockutils [req-61bd9c44-9c76-4e0d-bf00-daf4ae6a3abc req-2f9eb330-74d1-4382-9b63-8fe4b7932cf6 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "a89a5221-3253-49f6-b902-67f973b0690e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.740 183087 DEBUG oslo_concurrency.lockutils [req-61bd9c44-9c76-4e0d-bf00-daf4ae6a3abc req-2f9eb330-74d1-4382-9b63-8fe4b7932cf6 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "a89a5221-3253-49f6-b902-67f973b0690e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.740 183087 DEBUG oslo_concurrency.lockutils [req-61bd9c44-9c76-4e0d-bf00-daf4ae6a3abc req-2f9eb330-74d1-4382-9b63-8fe4b7932cf6 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "a89a5221-3253-49f6-b902-67f973b0690e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.741 183087 DEBUG nova.compute.manager [req-61bd9c44-9c76-4e0d-bf00-daf4ae6a3abc req-2f9eb330-74d1-4382-9b63-8fe4b7932cf6 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] No waiting events found dispatching network-vif-unplugged-e235b615-3ab0-49d4-9c0d-a4d905192bd6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.741 183087 DEBUG nova.compute.manager [req-61bd9c44-9c76-4e0d-bf00-daf4ae6a3abc req-2f9eb330-74d1-4382-9b63-8fe4b7932cf6 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Received event network-vif-unplugged-e235b615-3ab0-49d4-9c0d-a4d905192bd6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 26 09:02:19 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:02:19.745 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[544370df-ccaf-4524-8f94-0b09fc138bbb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441673, 'reachable_time': 25089, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221282, 'error': None, 'target': 'ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:02:19 compute-1 systemd[1]: run-netns-ovnmeta\x2d902c250a\x2d6b5f\x2d40de\x2d85f8\x2d6172556f9918.mount: Deactivated successfully.
Jan 26 09:02:19 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:02:19.750 105024 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-902c250a-6b5f-40de-85f8-6172556f9918 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 26 09:02:19 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:02:19.750 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[8784ba1b-7877-4154-8df0-35a12b6b388a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:02:19 compute-1 nova_compute[183083]: 2026-01-26 09:02:19.868 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:19 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:02:19.869 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:02:19 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:02:19.871 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:02:20 compute-1 nova_compute[183083]: 2026-01-26 09:02:20.909 183087 DEBUG nova.network.neutron [-] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:02:20 compute-1 nova_compute[183083]: 2026-01-26 09:02:20.929 183087 INFO nova.compute.manager [-] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Took 1.20 seconds to deallocate network for instance.
Jan 26 09:02:20 compute-1 nova_compute[183083]: 2026-01-26 09:02:20.989 183087 DEBUG oslo_concurrency.lockutils [None req-969c960e-1faf-4a87-a4c0-db3a1de390c6 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:02:20 compute-1 nova_compute[183083]: 2026-01-26 09:02:20.990 183087 DEBUG oslo_concurrency.lockutils [None req-969c960e-1faf-4a87-a4c0-db3a1de390c6 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:02:20 compute-1 nova_compute[183083]: 2026-01-26 09:02:20.998 183087 DEBUG nova.compute.manager [req-b663fdeb-70dd-4911-b73e-4365ae3d1e71 req-b5a00027-5928-4b39-b77c-27fd54264355 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Received event network-vif-deleted-e235b615-3ab0-49d4-9c0d-a4d905192bd6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:02:21 compute-1 nova_compute[183083]: 2026-01-26 09:02:21.007 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:02:21 compute-1 nova_compute[183083]: 2026-01-26 09:02:21.031 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:02:21 compute-1 nova_compute[183083]: 2026-01-26 09:02:21.125 183087 DEBUG nova.compute.provider_tree [None req-969c960e-1faf-4a87-a4c0-db3a1de390c6 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:02:21 compute-1 nova_compute[183083]: 2026-01-26 09:02:21.348 183087 DEBUG nova.scheduler.client.report [None req-969c960e-1faf-4a87-a4c0-db3a1de390c6 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:02:21 compute-1 nova_compute[183083]: 2026-01-26 09:02:21.367 183087 DEBUG oslo_concurrency.lockutils [None req-969c960e-1faf-4a87-a4c0-db3a1de390c6 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.377s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:02:21 compute-1 nova_compute[183083]: 2026-01-26 09:02:21.370 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.338s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:02:21 compute-1 nova_compute[183083]: 2026-01-26 09:02:21.370 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:02:21 compute-1 nova_compute[183083]: 2026-01-26 09:02:21.370 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:02:21 compute-1 nova_compute[183083]: 2026-01-26 09:02:21.437 183087 INFO nova.scheduler.client.report [None req-969c960e-1faf-4a87-a4c0-db3a1de390c6 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Deleted allocations for instance a89a5221-3253-49f6-b902-67f973b0690e
Jan 26 09:02:21 compute-1 nova_compute[183083]: 2026-01-26 09:02:21.506 183087 DEBUG oslo_concurrency.lockutils [None req-969c960e-1faf-4a87-a4c0-db3a1de390c6 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "a89a5221-3253-49f6-b902-67f973b0690e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:02:21 compute-1 nova_compute[183083]: 2026-01-26 09:02:21.615 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:02:21 compute-1 nova_compute[183083]: 2026-01-26 09:02:21.617 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13703MB free_disk=113.09367370605469GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:02:21 compute-1 nova_compute[183083]: 2026-01-26 09:02:21.618 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:02:21 compute-1 nova_compute[183083]: 2026-01-26 09:02:21.618 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:02:21 compute-1 nova_compute[183083]: 2026-01-26 09:02:21.672 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:02:21 compute-1 nova_compute[183083]: 2026-01-26 09:02:21.673 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:02:21 compute-1 nova_compute[183083]: 2026-01-26 09:02:21.696 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:02:21 compute-1 nova_compute[183083]: 2026-01-26 09:02:21.709 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:02:21 compute-1 nova_compute[183083]: 2026-01-26 09:02:21.729 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:02:21 compute-1 nova_compute[183083]: 2026-01-26 09:02:21.730 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:02:21 compute-1 nova_compute[183083]: 2026-01-26 09:02:21.845 183087 DEBUG nova.compute.manager [req-ad38903a-a2e4-4aaf-a1cf-ea0cf02a6efc req-c0e71966-7955-416d-b0a6-00e9713133ac 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Received event network-vif-plugged-e235b615-3ab0-49d4-9c0d-a4d905192bd6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:02:21 compute-1 nova_compute[183083]: 2026-01-26 09:02:21.845 183087 DEBUG oslo_concurrency.lockutils [req-ad38903a-a2e4-4aaf-a1cf-ea0cf02a6efc req-c0e71966-7955-416d-b0a6-00e9713133ac 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "a89a5221-3253-49f6-b902-67f973b0690e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:02:21 compute-1 nova_compute[183083]: 2026-01-26 09:02:21.846 183087 DEBUG oslo_concurrency.lockutils [req-ad38903a-a2e4-4aaf-a1cf-ea0cf02a6efc req-c0e71966-7955-416d-b0a6-00e9713133ac 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "a89a5221-3253-49f6-b902-67f973b0690e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:02:21 compute-1 nova_compute[183083]: 2026-01-26 09:02:21.846 183087 DEBUG oslo_concurrency.lockutils [req-ad38903a-a2e4-4aaf-a1cf-ea0cf02a6efc req-c0e71966-7955-416d-b0a6-00e9713133ac 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "a89a5221-3253-49f6-b902-67f973b0690e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:02:21 compute-1 nova_compute[183083]: 2026-01-26 09:02:21.847 183087 DEBUG nova.compute.manager [req-ad38903a-a2e4-4aaf-a1cf-ea0cf02a6efc req-c0e71966-7955-416d-b0a6-00e9713133ac 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] No waiting events found dispatching network-vif-plugged-e235b615-3ab0-49d4-9c0d-a4d905192bd6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 09:02:21 compute-1 nova_compute[183083]: 2026-01-26 09:02:21.847 183087 WARNING nova.compute.manager [req-ad38903a-a2e4-4aaf-a1cf-ea0cf02a6efc req-c0e71966-7955-416d-b0a6-00e9713133ac 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Received unexpected event network-vif-plugged-e235b615-3ab0-49d4-9c0d-a4d905192bd6 for instance with vm_state deleted and task_state None.
Jan 26 09:02:22 compute-1 nova_compute[183083]: 2026-01-26 09:02:22.602 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:23 compute-1 sudo[219958]: pam_unix(sudo:session): session closed for user root
Jan 26 09:02:24 compute-1 nova_compute[183083]: 2026-01-26 09:02:24.667 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:26 compute-1 nova_compute[183083]: 2026-01-26 09:02:26.648 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:27 compute-1 nova_compute[183083]: 2026-01-26 09:02:27.604 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:27 compute-1 nova_compute[183083]: 2026-01-26 09:02:27.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:02:29 compute-1 nova_compute[183083]: 2026-01-26 09:02:29.721 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:29 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:02:29.873 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:02:32 compute-1 nova_compute[183083]: 2026-01-26 09:02:32.606 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:32 compute-1 podman[221284]: 2026-01-26 09:02:32.8128609 +0000 UTC m=+0.072680113 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 26 09:02:34 compute-1 nova_compute[183083]: 2026-01-26 09:02:34.641 183087 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769418139.6378703, a89a5221-3253-49f6-b902-67f973b0690e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 09:02:34 compute-1 nova_compute[183083]: 2026-01-26 09:02:34.641 183087 INFO nova.compute.manager [-] [instance: a89a5221-3253-49f6-b902-67f973b0690e] VM Stopped (Lifecycle Event)
Jan 26 09:02:34 compute-1 nova_compute[183083]: 2026-01-26 09:02:34.660 183087 DEBUG nova.compute.manager [None req-463accd8-d903-44d1-9623-bd8b2b58696d - - - - - -] [instance: a89a5221-3253-49f6-b902-67f973b0690e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:02:34 compute-1 nova_compute[183083]: 2026-01-26 09:02:34.724 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:36 compute-1 nova_compute[183083]: 2026-01-26 09:02:36.757 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:37 compute-1 nova_compute[183083]: 2026-01-26 09:02:37.609 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:39 compute-1 nova_compute[183083]: 2026-01-26 09:02:39.728 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:39 compute-1 sudo[220072]: pam_unix(sudo:session): session closed for user root
Jan 26 09:02:42 compute-1 nova_compute[183083]: 2026-01-26 09:02:42.610 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:44 compute-1 nova_compute[183083]: 2026-01-26 09:02:44.732 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:46 compute-1 podman[221308]: 2026-01-26 09:02:46.813969698 +0000 UTC m=+0.072324582 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 26 09:02:46 compute-1 podman[221309]: 2026-01-26 09:02:46.830817534 +0000 UTC m=+0.078614980 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, config_id=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, version=9.6, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, managed_by=edpm_ansible, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter)
Jan 26 09:02:47 compute-1 nova_compute[183083]: 2026-01-26 09:02:47.612 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:49 compute-1 nova_compute[183083]: 2026-01-26 09:02:49.734 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:49 compute-1 podman[221352]: 2026-01-26 09:02:49.802679828 +0000 UTC m=+0.063086072 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 09:02:49 compute-1 podman[221351]: 2026-01-26 09:02:49.802665277 +0000 UTC m=+0.067090905 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 26 09:02:49 compute-1 podman[221350]: 2026-01-26 09:02:49.883423607 +0000 UTC m=+0.144016757 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 26 09:02:52 compute-1 nova_compute[183083]: 2026-01-26 09:02:52.615 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:54 compute-1 nova_compute[183083]: 2026-01-26 09:02:54.738 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.034 183087 DEBUG oslo_concurrency.lockutils [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "d11cfdba-71df-4d49-a60f-29397352c308" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.035 183087 DEBUG oslo_concurrency.lockutils [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "d11cfdba-71df-4d49-a60f-29397352c308" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.059 183087 DEBUG nova.compute.manager [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.143 183087 DEBUG oslo_concurrency.lockutils [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.143 183087 DEBUG oslo_concurrency.lockutils [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.153 183087 DEBUG nova.virt.hardware [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.153 183087 INFO nova.compute.claims [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Claim successful on node compute-1.ctlplane.example.com
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.253 183087 DEBUG nova.compute.provider_tree [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.269 183087 DEBUG nova.scheduler.client.report [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.291 183087 DEBUG oslo_concurrency.lockutils [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.292 183087 DEBUG nova.compute.manager [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.328 183087 DEBUG nova.compute.manager [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.329 183087 DEBUG nova.network.neutron [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.366 183087 INFO nova.virt.libvirt.driver [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.453 183087 DEBUG nova.compute.manager [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.561 183087 DEBUG nova.compute.manager [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.564 183087 DEBUG nova.virt.libvirt.driver [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.564 183087 INFO nova.virt.libvirt.driver [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Creating image(s)
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.566 183087 DEBUG oslo_concurrency.lockutils [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "/var/lib/nova/instances/d11cfdba-71df-4d49-a60f-29397352c308/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.566 183087 DEBUG oslo_concurrency.lockutils [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "/var/lib/nova/instances/d11cfdba-71df-4d49-a60f-29397352c308/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.567 183087 DEBUG oslo_concurrency.lockutils [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "/var/lib/nova/instances/d11cfdba-71df-4d49-a60f-29397352c308/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.583 183087 DEBUG oslo_concurrency.processutils [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.618 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.657 183087 DEBUG oslo_concurrency.processutils [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.658 183087 DEBUG oslo_concurrency.lockutils [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.659 183087 DEBUG oslo_concurrency.lockutils [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.676 183087 DEBUG oslo_concurrency.processutils [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.747 183087 DEBUG oslo_concurrency.processutils [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.748 183087 DEBUG oslo_concurrency.processutils [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/d11cfdba-71df-4d49-a60f-29397352c308/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.801 183087 DEBUG oslo_concurrency.processutils [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/d11cfdba-71df-4d49-a60f-29397352c308/disk 1073741824" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.802 183087 DEBUG oslo_concurrency.lockutils [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.803 183087 DEBUG oslo_concurrency.processutils [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.889 183087 DEBUG oslo_concurrency.processutils [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.890 183087 DEBUG nova.virt.disk.api [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Checking if we can resize image /var/lib/nova/instances/d11cfdba-71df-4d49-a60f-29397352c308/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.890 183087 DEBUG oslo_concurrency.processutils [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d11cfdba-71df-4d49-a60f-29397352c308/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.977 183087 DEBUG oslo_concurrency.processutils [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d11cfdba-71df-4d49-a60f-29397352c308/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.978 183087 DEBUG nova.virt.disk.api [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Cannot resize image /var/lib/nova/instances/d11cfdba-71df-4d49-a60f-29397352c308/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.979 183087 DEBUG nova.objects.instance [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lazy-loading 'migration_context' on Instance uuid d11cfdba-71df-4d49-a60f-29397352c308 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.996 183087 DEBUG nova.virt.libvirt.driver [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.997 183087 DEBUG nova.virt.libvirt.driver [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Ensure instance console log exists: /var/lib/nova/instances/d11cfdba-71df-4d49-a60f-29397352c308/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.998 183087 DEBUG oslo_concurrency.lockutils [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.998 183087 DEBUG oslo_concurrency.lockutils [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:02:57 compute-1 nova_compute[183083]: 2026-01-26 09:02:57.999 183087 DEBUG oslo_concurrency.lockutils [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:02:58 compute-1 nova_compute[183083]: 2026-01-26 09:02:58.306 183087 DEBUG nova.network.neutron [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Successfully created port: 73d02031-13f7-436d-bec5-c981a5c6c99b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 09:02:59 compute-1 nova_compute[183083]: 2026-01-26 09:02:59.473 183087 DEBUG nova.network.neutron [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Successfully updated port: 73d02031-13f7-436d-bec5-c981a5c6c99b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 09:02:59 compute-1 nova_compute[183083]: 2026-01-26 09:02:59.566 183087 DEBUG nova.compute.manager [req-32bc79db-fe8f-4355-a702-b833f87347f4 req-30c9f03b-b6b2-498f-8156-71506abd7caa 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Received event network-changed-73d02031-13f7-436d-bec5-c981a5c6c99b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:02:59 compute-1 nova_compute[183083]: 2026-01-26 09:02:59.566 183087 DEBUG nova.compute.manager [req-32bc79db-fe8f-4355-a702-b833f87347f4 req-30c9f03b-b6b2-498f-8156-71506abd7caa 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Refreshing instance network info cache due to event network-changed-73d02031-13f7-436d-bec5-c981a5c6c99b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 09:02:59 compute-1 nova_compute[183083]: 2026-01-26 09:02:59.567 183087 DEBUG oslo_concurrency.lockutils [req-32bc79db-fe8f-4355-a702-b833f87347f4 req-30c9f03b-b6b2-498f-8156-71506abd7caa 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-d11cfdba-71df-4d49-a60f-29397352c308" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:02:59 compute-1 nova_compute[183083]: 2026-01-26 09:02:59.567 183087 DEBUG oslo_concurrency.lockutils [req-32bc79db-fe8f-4355-a702-b833f87347f4 req-30c9f03b-b6b2-498f-8156-71506abd7caa 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-d11cfdba-71df-4d49-a60f-29397352c308" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:02:59 compute-1 nova_compute[183083]: 2026-01-26 09:02:59.567 183087 DEBUG nova.network.neutron [req-32bc79db-fe8f-4355-a702-b833f87347f4 req-30c9f03b-b6b2-498f-8156-71506abd7caa 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Refreshing network info cache for port 73d02031-13f7-436d-bec5-c981a5c6c99b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 09:02:59 compute-1 nova_compute[183083]: 2026-01-26 09:02:59.570 183087 DEBUG oslo_concurrency.lockutils [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "refresh_cache-d11cfdba-71df-4d49-a60f-29397352c308" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:02:59 compute-1 sudo[220269]: pam_unix(sudo:session): session closed for user root
Jan 26 09:02:59 compute-1 nova_compute[183083]: 2026-01-26 09:02:59.722 183087 DEBUG nova.network.neutron [req-32bc79db-fe8f-4355-a702-b833f87347f4 req-30c9f03b-b6b2-498f-8156-71506abd7caa 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 09:02:59 compute-1 nova_compute[183083]: 2026-01-26 09:02:59.740 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:02:59 compute-1 nova_compute[183083]: 2026-01-26 09:02:59.940 183087 DEBUG nova.network.neutron [req-32bc79db-fe8f-4355-a702-b833f87347f4 req-30c9f03b-b6b2-498f-8156-71506abd7caa 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:02:59 compute-1 nova_compute[183083]: 2026-01-26 09:02:59.954 183087 DEBUG oslo_concurrency.lockutils [req-32bc79db-fe8f-4355-a702-b833f87347f4 req-30c9f03b-b6b2-498f-8156-71506abd7caa 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-d11cfdba-71df-4d49-a60f-29397352c308" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:02:59 compute-1 nova_compute[183083]: 2026-01-26 09:02:59.955 183087 DEBUG oslo_concurrency.lockutils [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquired lock "refresh_cache-d11cfdba-71df-4d49-a60f-29397352c308" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:02:59 compute-1 nova_compute[183083]: 2026-01-26 09:02:59.955 183087 DEBUG nova.network.neutron [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.090 183087 DEBUG nova.network.neutron [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.658 183087 DEBUG nova.network.neutron [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Updating instance_info_cache with network_info: [{"id": "73d02031-13f7-436d-bec5-c981a5c6c99b", "address": "fa:16:3e:b0:5c:01", "network": {"id": "a7990128-24d3-4373-9624-cea49e6db86a", "bridge": "br-int", "label": "tempest-test-network--1014603776", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73d02031-13", "ovs_interfaceid": "73d02031-13f7-436d-bec5-c981a5c6c99b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.678 183087 DEBUG oslo_concurrency.lockutils [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Releasing lock "refresh_cache-d11cfdba-71df-4d49-a60f-29397352c308" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.678 183087 DEBUG nova.compute.manager [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Instance network_info: |[{"id": "73d02031-13f7-436d-bec5-c981a5c6c99b", "address": "fa:16:3e:b0:5c:01", "network": {"id": "a7990128-24d3-4373-9624-cea49e6db86a", "bridge": "br-int", "label": "tempest-test-network--1014603776", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73d02031-13", "ovs_interfaceid": "73d02031-13f7-436d-bec5-c981a5c6c99b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.682 183087 DEBUG nova.virt.libvirt.driver [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Start _get_guest_xml network_info=[{"id": "73d02031-13f7-436d-bec5-c981a5c6c99b", "address": "fa:16:3e:b0:5c:01", "network": {"id": "a7990128-24d3-4373-9624-cea49e6db86a", "bridge": "br-int", "label": "tempest-test-network--1014603776", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73d02031-13", "ovs_interfaceid": "73d02031-13f7-436d-bec5-c981a5c6c99b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T08:43:05Z,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aa88264c487a43b2855a58eb0dd042c9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T08:43:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encryption_options': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.686 183087 WARNING nova.virt.libvirt.driver [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.691 183087 DEBUG nova.virt.libvirt.host [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.691 183087 DEBUG nova.virt.libvirt.host [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.694 183087 DEBUG nova.virt.libvirt.host [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.695 183087 DEBUG nova.virt.libvirt.host [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.695 183087 DEBUG nova.virt.libvirt.driver [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.695 183087 DEBUG nova.virt.hardware [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T08:43:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a7017323-cd35-470d-a718-faa6c6e97277',id=2,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T08:43:05Z,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aa88264c487a43b2855a58eb0dd042c9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T08:43:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.696 183087 DEBUG nova.virt.hardware [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.696 183087 DEBUG nova.virt.hardware [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.696 183087 DEBUG nova.virt.hardware [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.697 183087 DEBUG nova.virt.hardware [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.697 183087 DEBUG nova.virt.hardware [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.697 183087 DEBUG nova.virt.hardware [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.698 183087 DEBUG nova.virt.hardware [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.698 183087 DEBUG nova.virt.hardware [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.698 183087 DEBUG nova.virt.hardware [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.698 183087 DEBUG nova.virt.hardware [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.702 183087 DEBUG nova.virt.libvirt.vif [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T09:02:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-server-test-697225216',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-697225216',id=48,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCVUzFf6wSdyelY/UeY7cqTv8O0no1sfyU2c1QC2Iq0A42faUChCXJr9C1AIPerWETKRC47CWhTspqdq2Y/jF5MXoBNhFYSHEELzhDImF6CTLJqQZ+4txACe/VixshXUeg==',key_name='tempest-keypair-47432938',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2580bb16c90849c4b5919eb271774a06',ramdisk_id='',reservation_id='r-sbe72h54',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-OvnDvrTest-691788706',owner_user_name='tempest-OvnDvrTest-691788706-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T09:02:57Z,user_data=None,user_id='90104736f4ab4d81b09d1ff11e40f454',uuid=d11cfdba-71df-4d49-a60f-29397352c308,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "73d02031-13f7-436d-bec5-c981a5c6c99b", "address": "fa:16:3e:b0:5c:01", "network": {"id": "a7990128-24d3-4373-9624-cea49e6db86a", "bridge": "br-int", "label": "tempest-test-network--1014603776", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73d02031-13", "ovs_interfaceid": "73d02031-13f7-436d-bec5-c981a5c6c99b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.702 183087 DEBUG nova.network.os_vif_util [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Converting VIF {"id": "73d02031-13f7-436d-bec5-c981a5c6c99b", "address": "fa:16:3e:b0:5c:01", "network": {"id": "a7990128-24d3-4373-9624-cea49e6db86a", "bridge": "br-int", "label": "tempest-test-network--1014603776", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73d02031-13", "ovs_interfaceid": "73d02031-13f7-436d-bec5-c981a5c6c99b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.703 183087 DEBUG nova.network.os_vif_util [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b0:5c:01,bridge_name='br-int',has_traffic_filtering=True,id=73d02031-13f7-436d-bec5-c981a5c6c99b,network=Network(a7990128-24d3-4373-9624-cea49e6db86a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73d02031-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.704 183087 DEBUG nova.objects.instance [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lazy-loading 'pci_devices' on Instance uuid d11cfdba-71df-4d49-a60f-29397352c308 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.716 183087 DEBUG nova.virt.libvirt.driver [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] End _get_guest_xml xml=<domain type="kvm">
Jan 26 09:03:00 compute-1 nova_compute[183083]:   <uuid>d11cfdba-71df-4d49-a60f-29397352c308</uuid>
Jan 26 09:03:00 compute-1 nova_compute[183083]:   <name>instance-00000030</name>
Jan 26 09:03:00 compute-1 nova_compute[183083]:   <memory>131072</memory>
Jan 26 09:03:00 compute-1 nova_compute[183083]:   <vcpu>1</vcpu>
Jan 26 09:03:00 compute-1 nova_compute[183083]:   <metadata>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 09:03:00 compute-1 nova_compute[183083]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:       <nova:name>tempest-server-test-697225216</nova:name>
Jan 26 09:03:00 compute-1 nova_compute[183083]:       <nova:creationTime>2026-01-26 09:03:00</nova:creationTime>
Jan 26 09:03:00 compute-1 nova_compute[183083]:       <nova:flavor name="m1.nano">
Jan 26 09:03:00 compute-1 nova_compute[183083]:         <nova:memory>128</nova:memory>
Jan 26 09:03:00 compute-1 nova_compute[183083]:         <nova:disk>1</nova:disk>
Jan 26 09:03:00 compute-1 nova_compute[183083]:         <nova:swap>0</nova:swap>
Jan 26 09:03:00 compute-1 nova_compute[183083]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 09:03:00 compute-1 nova_compute[183083]:         <nova:vcpus>1</nova:vcpus>
Jan 26 09:03:00 compute-1 nova_compute[183083]:       </nova:flavor>
Jan 26 09:03:00 compute-1 nova_compute[183083]:       <nova:owner>
Jan 26 09:03:00 compute-1 nova_compute[183083]:         <nova:user uuid="90104736f4ab4d81b09d1ff11e40f454">tempest-OvnDvrTest-691788706-project-admin</nova:user>
Jan 26 09:03:00 compute-1 nova_compute[183083]:         <nova:project uuid="2580bb16c90849c4b5919eb271774a06">tempest-OvnDvrTest-691788706</nova:project>
Jan 26 09:03:00 compute-1 nova_compute[183083]:       </nova:owner>
Jan 26 09:03:00 compute-1 nova_compute[183083]:       <nova:root type="image" uuid="13d1a20a-8003-4f19-aba7-ccbd9eff9b82"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:       <nova:ports>
Jan 26 09:03:00 compute-1 nova_compute[183083]:         <nova:port uuid="73d02031-13f7-436d-bec5-c981a5c6c99b">
Jan 26 09:03:00 compute-1 nova_compute[183083]:           <nova:ip type="fixed" address="10.100.0.28" ipVersion="4"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:         </nova:port>
Jan 26 09:03:00 compute-1 nova_compute[183083]:       </nova:ports>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     </nova:instance>
Jan 26 09:03:00 compute-1 nova_compute[183083]:   </metadata>
Jan 26 09:03:00 compute-1 nova_compute[183083]:   <sysinfo type="smbios">
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <system>
Jan 26 09:03:00 compute-1 nova_compute[183083]:       <entry name="manufacturer">RDO</entry>
Jan 26 09:03:00 compute-1 nova_compute[183083]:       <entry name="product">OpenStack Compute</entry>
Jan 26 09:03:00 compute-1 nova_compute[183083]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 09:03:00 compute-1 nova_compute[183083]:       <entry name="serial">d11cfdba-71df-4d49-a60f-29397352c308</entry>
Jan 26 09:03:00 compute-1 nova_compute[183083]:       <entry name="uuid">d11cfdba-71df-4d49-a60f-29397352c308</entry>
Jan 26 09:03:00 compute-1 nova_compute[183083]:       <entry name="family">Virtual Machine</entry>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     </system>
Jan 26 09:03:00 compute-1 nova_compute[183083]:   </sysinfo>
Jan 26 09:03:00 compute-1 nova_compute[183083]:   <os>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <boot dev="hd"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <smbios mode="sysinfo"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:   </os>
Jan 26 09:03:00 compute-1 nova_compute[183083]:   <features>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <acpi/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <apic/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <vmcoreinfo/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:   </features>
Jan 26 09:03:00 compute-1 nova_compute[183083]:   <clock offset="utc">
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <timer name="hpet" present="no"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:   </clock>
Jan 26 09:03:00 compute-1 nova_compute[183083]:   <cpu mode="host-model" match="exact">
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:   </cpu>
Jan 26 09:03:00 compute-1 nova_compute[183083]:   <devices>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <disk type="file" device="disk">
Jan 26 09:03:00 compute-1 nova_compute[183083]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/d11cfdba-71df-4d49-a60f-29397352c308/disk"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:       <target dev="vda" bus="virtio"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     </disk>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <disk type="file" device="cdrom">
Jan 26 09:03:00 compute-1 nova_compute[183083]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/d11cfdba-71df-4d49-a60f-29397352c308/disk.config"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:       <target dev="sda" bus="sata"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     </disk>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <interface type="ethernet">
Jan 26 09:03:00 compute-1 nova_compute[183083]:       <mac address="fa:16:3e:b0:5c:01"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:       <mtu size="1342"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:       <target dev="tap73d02031-13"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     </interface>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <serial type="pty">
Jan 26 09:03:00 compute-1 nova_compute[183083]:       <log file="/var/lib/nova/instances/d11cfdba-71df-4d49-a60f-29397352c308/console.log" append="off"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     </serial>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <video>
Jan 26 09:03:00 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     </video>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <input type="tablet" bus="usb"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <rng model="virtio">
Jan 26 09:03:00 compute-1 nova_compute[183083]:       <backend model="random">/dev/urandom</backend>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     </rng>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <controller type="usb" index="0"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     <memballoon model="virtio">
Jan 26 09:03:00 compute-1 nova_compute[183083]:       <stats period="10"/>
Jan 26 09:03:00 compute-1 nova_compute[183083]:     </memballoon>
Jan 26 09:03:00 compute-1 nova_compute[183083]:   </devices>
Jan 26 09:03:00 compute-1 nova_compute[183083]: </domain>
Jan 26 09:03:00 compute-1 nova_compute[183083]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.717 183087 DEBUG nova.compute.manager [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Preparing to wait for external event network-vif-plugged-73d02031-13f7-436d-bec5-c981a5c6c99b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.718 183087 DEBUG oslo_concurrency.lockutils [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "d11cfdba-71df-4d49-a60f-29397352c308-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.718 183087 DEBUG oslo_concurrency.lockutils [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "d11cfdba-71df-4d49-a60f-29397352c308-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.718 183087 DEBUG oslo_concurrency.lockutils [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "d11cfdba-71df-4d49-a60f-29397352c308-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.719 183087 DEBUG nova.virt.libvirt.vif [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T09:02:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-server-test-697225216',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-697225216',id=48,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCVUzFf6wSdyelY/UeY7cqTv8O0no1sfyU2c1QC2Iq0A42faUChCXJr9C1AIPerWETKRC47CWhTspqdq2Y/jF5MXoBNhFYSHEELzhDImF6CTLJqQZ+4txACe/VixshXUeg==',key_name='tempest-keypair-47432938',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2580bb16c90849c4b5919eb271774a06',ramdisk_id='',reservation_id='r-sbe72h54',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-OvnDvrTest-691788706',owner_user_name='tempest-OvnDvrTest-691788706-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T09:02:57Z,user_data=None,user_id='90104736f4ab4d81b09d1ff11e40f454',uuid=d11cfdba-71df-4d49-a60f-29397352c308,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "73d02031-13f7-436d-bec5-c981a5c6c99b", "address": "fa:16:3e:b0:5c:01", "network": {"id": "a7990128-24d3-4373-9624-cea49e6db86a", "bridge": "br-int", "label": "tempest-test-network--1014603776", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73d02031-13", "ovs_interfaceid": "73d02031-13f7-436d-bec5-c981a5c6c99b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.719 183087 DEBUG nova.network.os_vif_util [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Converting VIF {"id": "73d02031-13f7-436d-bec5-c981a5c6c99b", "address": "fa:16:3e:b0:5c:01", "network": {"id": "a7990128-24d3-4373-9624-cea49e6db86a", "bridge": "br-int", "label": "tempest-test-network--1014603776", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73d02031-13", "ovs_interfaceid": "73d02031-13f7-436d-bec5-c981a5c6c99b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.720 183087 DEBUG nova.network.os_vif_util [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b0:5c:01,bridge_name='br-int',has_traffic_filtering=True,id=73d02031-13f7-436d-bec5-c981a5c6c99b,network=Network(a7990128-24d3-4373-9624-cea49e6db86a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73d02031-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.720 183087 DEBUG os_vif [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b0:5c:01,bridge_name='br-int',has_traffic_filtering=True,id=73d02031-13f7-436d-bec5-c981a5c6c99b,network=Network(a7990128-24d3-4373-9624-cea49e6db86a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73d02031-13') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.721 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.721 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.722 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.726 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.726 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap73d02031-13, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.727 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap73d02031-13, col_values=(('external_ids', {'iface-id': '73d02031-13f7-436d-bec5-c981a5c6c99b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b0:5c:01', 'vm-uuid': 'd11cfdba-71df-4d49-a60f-29397352c308'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.728 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:00 compute-1 NetworkManager[55451]: <info>  [1769418180.7307] manager: (tap73d02031-13): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.731 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.736 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.738 183087 INFO os_vif [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b0:5c:01,bridge_name='br-int',has_traffic_filtering=True,id=73d02031-13f7-436d-bec5-c981a5c6c99b,network=Network(a7990128-24d3-4373-9624-cea49e6db86a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73d02031-13')
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.779 183087 DEBUG nova.virt.libvirt.driver [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.780 183087 DEBUG nova.virt.libvirt.driver [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.780 183087 DEBUG nova.virt.libvirt.driver [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] No VIF found with MAC fa:16:3e:b0:5c:01, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 26 09:03:00 compute-1 nova_compute[183083]: 2026-01-26 09:03:00.781 183087 INFO nova.virt.libvirt.driver [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Using config drive
Jan 26 09:03:01 compute-1 anacron[29975]: Job `cron.weekly' started
Jan 26 09:03:01 compute-1 anacron[29975]: Job `cron.weekly' terminated
Jan 26 09:03:02 compute-1 nova_compute[183083]: 2026-01-26 09:03:02.621 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:02 compute-1 nova_compute[183083]: 2026-01-26 09:03:02.730 183087 INFO nova.virt.libvirt.driver [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Creating config drive at /var/lib/nova/instances/d11cfdba-71df-4d49-a60f-29397352c308/disk.config
Jan 26 09:03:02 compute-1 nova_compute[183083]: 2026-01-26 09:03:02.740 183087 DEBUG oslo_concurrency.processutils [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d11cfdba-71df-4d49-a60f-29397352c308/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplsg24wrr execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:03:02 compute-1 nova_compute[183083]: 2026-01-26 09:03:02.885 183087 DEBUG oslo_concurrency.processutils [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d11cfdba-71df-4d49-a60f-29397352c308/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplsg24wrr" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:03:02 compute-1 kernel: tap73d02031-13: entered promiscuous mode
Jan 26 09:03:02 compute-1 NetworkManager[55451]: <info>  [1769418182.9851] manager: (tap73d02031-13): new Tun device (/org/freedesktop/NetworkManager/Devices/94)
Jan 26 09:03:02 compute-1 ovn_controller[95352]: 2026-01-26T09:03:02Z|00263|binding|INFO|Claiming lport 73d02031-13f7-436d-bec5-c981a5c6c99b for this chassis.
Jan 26 09:03:02 compute-1 ovn_controller[95352]: 2026-01-26T09:03:02Z|00264|binding|INFO|73d02031-13f7-436d-bec5-c981a5c6c99b: Claiming fa:16:3e:b0:5c:01 10.100.0.28
Jan 26 09:03:02 compute-1 nova_compute[183083]: 2026-01-26 09:03:02.989 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:02 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:03:02.998 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b0:5c:01 10.100.0.28'], port_security=['fa:16:3e:b0:5c:01 10.100.0.28'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.28/28', 'neutron:device_id': 'd11cfdba-71df-4d49-a60f-29397352c308', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a7990128-24d3-4373-9624-cea49e6db86a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2580bb16c90849c4b5919eb271774a06', 'neutron:revision_number': '2', 'neutron:security_group_ids': '30ca606d-e02a-4090-8787-8ade3dd6e02d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b6e8b8dd-7a7a-42f2-93d9-90eaac5d38d1, chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=73d02031-13f7-436d-bec5-c981a5c6c99b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:03:02.999 104632 INFO neutron.agent.ovn.metadata.agent [-] Port 73d02031-13f7-436d-bec5-c981a5c6c99b in datapath a7990128-24d3-4373-9624-cea49e6db86a bound to our chassis
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:03:03.001 104632 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a7990128-24d3-4373-9624-cea49e6db86a
Jan 26 09:03:03 compute-1 ovn_controller[95352]: 2026-01-26T09:03:03Z|00265|binding|INFO|Setting lport 73d02031-13f7-436d-bec5-c981a5c6c99b ovn-installed in OVS
Jan 26 09:03:03 compute-1 ovn_controller[95352]: 2026-01-26T09:03:03Z|00266|binding|INFO|Setting lport 73d02031-13f7-436d-bec5-c981a5c6c99b up in Southbound
Jan 26 09:03:03 compute-1 nova_compute[183083]: 2026-01-26 09:03:03.005 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:03 compute-1 nova_compute[183083]: 2026-01-26 09:03:03.011 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:03:03.016 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[79493f85-013e-4790-9ff8-773e586d693d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:03:03.017 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa7990128-21 in ovnmeta-a7990128-24d3-4373-9624-cea49e6db86a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:03:03.019 212483 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa7990128-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:03:03.019 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[5c2b4097-677b-45f5-987f-d4fce7ad3163]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:03:03.020 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[437b1755-618f-455d-9d5b-4e4bb0211851]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:03:03.036 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[9529e2d1-9b27-4c21-88b9-20a45344c558]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:03:03 compute-1 systemd-udevd[221468]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 09:03:03 compute-1 systemd-machined[154360]: New machine qemu-16-instance-00000030.
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:03:03.063 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[ebca958a-21bf-4af1-921f-f2bab84a30a3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:03:03 compute-1 NetworkManager[55451]: <info>  [1769418183.0665] device (tap73d02031-13): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 09:03:03 compute-1 systemd[1]: Started Virtual Machine qemu-16-instance-00000030.
Jan 26 09:03:03 compute-1 NetworkManager[55451]: <info>  [1769418183.0675] device (tap73d02031-13): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 09:03:03 compute-1 podman[221448]: 2026-01-26 09:03:03.083003708 +0000 UTC m=+0.102085152 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:03:03.103 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[6af7e5e3-4110-4ace-a8da-966bc83e80c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:03:03 compute-1 systemd-udevd[221477]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 09:03:03 compute-1 NetworkManager[55451]: <info>  [1769418183.1120] manager: (tapa7990128-20): new Veth device (/org/freedesktop/NetworkManager/Devices/95)
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:03:03.111 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[3d11cf43-2793-4f18-91b5-1cd70756821b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:03:03.152 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[ace4e801-73c6-45a4-a5c6-9ae067cf2aa2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:03:03.156 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[f32d4b86-a52d-4809-ab45-38a844d09ec7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:03:03 compute-1 NetworkManager[55451]: <info>  [1769418183.1853] device (tapa7990128-20): carrier: link connected
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:03:03.189 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[05aa11a1-db09-465c-a7bc-fdb044bdbe8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:03:03.205 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[eb020ea6-4369-430e-9158-7cc8ebb2c1ee]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa7990128-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d3:fd:3e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 452379, 'reachable_time': 20812, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221512, 'error': None, 'target': 'ovnmeta-a7990128-24d3-4373-9624-cea49e6db86a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:03:03.226 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[f96ddd7f-c0b4-4568-802d-9c9b413e7d47]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed3:fd3e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 452379, 'tstamp': 452379}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221513, 'error': None, 'target': 'ovnmeta-a7990128-24d3-4373-9624-cea49e6db86a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:03:03.242 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[daa2951e-c073-4498-972e-94d6d437db51]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa7990128-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d3:fd:3e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 452379, 'reachable_time': 20812, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221514, 'error': None, 'target': 'ovnmeta-a7990128-24d3-4373-9624-cea49e6db86a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:03:03.276 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[991dc041-8f1b-44fc-97d6-71750f61f37b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:03:03.331 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[bc82cfdf-63d0-425a-9f4c-15ae4e1c3a82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:03:03.333 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa7990128-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:03:03.333 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:03:03.333 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa7990128-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:03:03 compute-1 nova_compute[183083]: 2026-01-26 09:03:03.335 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:03 compute-1 NetworkManager[55451]: <info>  [1769418183.3362] manager: (tapa7990128-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/96)
Jan 26 09:03:03 compute-1 kernel: tapa7990128-20: entered promiscuous mode
Jan 26 09:03:03 compute-1 nova_compute[183083]: 2026-01-26 09:03:03.339 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:03:03.341 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa7990128-20, col_values=(('external_ids', {'iface-id': 'e4a9eb60-6fc2-4712-a497-5aead39b5136'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:03:03 compute-1 nova_compute[183083]: 2026-01-26 09:03:03.343 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:03 compute-1 ovn_controller[95352]: 2026-01-26T09:03:03Z|00267|binding|INFO|Releasing lport e4a9eb60-6fc2-4712-a497-5aead39b5136 from this chassis (sb_readonly=0)
Jan 26 09:03:03 compute-1 nova_compute[183083]: 2026-01-26 09:03:03.345 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:03:03.345 104632 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a7990128-24d3-4373-9624-cea49e6db86a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a7990128-24d3-4373-9624-cea49e6db86a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 26 09:03:03 compute-1 nova_compute[183083]: 2026-01-26 09:03:03.356 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:03:03.355 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[ea1ea12a-386f-4019-ae67-51ed3aa9e731]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:03:03.358 104632 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]: global
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]:     log         /dev/log local0 debug
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]:     log-tag     haproxy-metadata-proxy-a7990128-24d3-4373-9624-cea49e6db86a
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]:     user        root
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]:     group       root
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]:     maxconn     1024
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]:     pidfile     /var/lib/neutron/external/pids/a7990128-24d3-4373-9624-cea49e6db86a.pid.haproxy
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]:     daemon
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]: 
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]: defaults
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]:     log global
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]:     mode http
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]:     option httplog
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]:     option dontlognull
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]:     option http-server-close
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]:     option forwardfor
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]:     retries                 3
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]:     timeout http-request    30s
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]:     timeout connect         30s
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]:     timeout client          32s
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]:     timeout server          32s
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]:     timeout http-keep-alive 30s
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]: 
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]: 
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]: listen listener
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]:     bind 169.254.169.254:80
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]:     http-request add-header X-OVN-Network-ID a7990128-24d3-4373-9624-cea49e6db86a
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 26 09:03:03 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:03:03.360 104632 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a7990128-24d3-4373-9624-cea49e6db86a', 'env', 'PROCESS_TAG=haproxy-a7990128-24d3-4373-9624-cea49e6db86a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a7990128-24d3-4373-9624-cea49e6db86a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 26 09:03:03 compute-1 nova_compute[183083]: 2026-01-26 09:03:03.668 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769418183.667204, d11cfdba-71df-4d49-a60f-29397352c308 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 09:03:03 compute-1 nova_compute[183083]: 2026-01-26 09:03:03.669 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: d11cfdba-71df-4d49-a60f-29397352c308] VM Started (Lifecycle Event)
Jan 26 09:03:03 compute-1 nova_compute[183083]: 2026-01-26 09:03:03.745 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:03:03 compute-1 nova_compute[183083]: 2026-01-26 09:03:03.751 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769418183.6696136, d11cfdba-71df-4d49-a60f-29397352c308 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 09:03:03 compute-1 nova_compute[183083]: 2026-01-26 09:03:03.751 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: d11cfdba-71df-4d49-a60f-29397352c308] VM Paused (Lifecycle Event)
Jan 26 09:03:03 compute-1 nova_compute[183083]: 2026-01-26 09:03:03.800 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:03:03 compute-1 nova_compute[183083]: 2026-01-26 09:03:03.804 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 09:03:03 compute-1 podman[221553]: 2026-01-26 09:03:03.75362965 +0000 UTC m=+0.040158525 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 09:03:03 compute-1 podman[221553]: 2026-01-26 09:03:03.853718395 +0000 UTC m=+0.140247210 container create da5df07fe6c3add65833c53811e185062c63eb02ff440018e2582eec7ce20cdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a7990128-24d3-4373-9624-cea49e6db86a, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:03:03 compute-1 nova_compute[183083]: 2026-01-26 09:03:03.873 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: d11cfdba-71df-4d49-a60f-29397352c308] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 09:03:03 compute-1 nova_compute[183083]: 2026-01-26 09:03:03.924 183087 DEBUG nova.compute.manager [req-cc72910d-f6cb-4b09-9964-28aff97947f6 req-22e80aea-e23e-4814-83b2-f93624bfa51e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Received event network-vif-plugged-73d02031-13f7-436d-bec5-c981a5c6c99b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:03:03 compute-1 nova_compute[183083]: 2026-01-26 09:03:03.925 183087 DEBUG oslo_concurrency.lockutils [req-cc72910d-f6cb-4b09-9964-28aff97947f6 req-22e80aea-e23e-4814-83b2-f93624bfa51e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "d11cfdba-71df-4d49-a60f-29397352c308-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:03:03 compute-1 nova_compute[183083]: 2026-01-26 09:03:03.926 183087 DEBUG oslo_concurrency.lockutils [req-cc72910d-f6cb-4b09-9964-28aff97947f6 req-22e80aea-e23e-4814-83b2-f93624bfa51e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "d11cfdba-71df-4d49-a60f-29397352c308-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:03:03 compute-1 nova_compute[183083]: 2026-01-26 09:03:03.927 183087 DEBUG oslo_concurrency.lockutils [req-cc72910d-f6cb-4b09-9964-28aff97947f6 req-22e80aea-e23e-4814-83b2-f93624bfa51e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "d11cfdba-71df-4d49-a60f-29397352c308-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:03:03 compute-1 nova_compute[183083]: 2026-01-26 09:03:03.928 183087 DEBUG nova.compute.manager [req-cc72910d-f6cb-4b09-9964-28aff97947f6 req-22e80aea-e23e-4814-83b2-f93624bfa51e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Processing event network-vif-plugged-73d02031-13f7-436d-bec5-c981a5c6c99b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 26 09:03:03 compute-1 nova_compute[183083]: 2026-01-26 09:03:03.929 183087 DEBUG nova.compute.manager [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 26 09:03:03 compute-1 nova_compute[183083]: 2026-01-26 09:03:03.935 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769418183.9340024, d11cfdba-71df-4d49-a60f-29397352c308 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 09:03:03 compute-1 nova_compute[183083]: 2026-01-26 09:03:03.937 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: d11cfdba-71df-4d49-a60f-29397352c308] VM Resumed (Lifecycle Event)
Jan 26 09:03:03 compute-1 nova_compute[183083]: 2026-01-26 09:03:03.940 183087 DEBUG nova.virt.libvirt.driver [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 26 09:03:03 compute-1 nova_compute[183083]: 2026-01-26 09:03:03.944 183087 INFO nova.virt.libvirt.driver [-] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Instance spawned successfully.
Jan 26 09:03:03 compute-1 nova_compute[183083]: 2026-01-26 09:03:03.945 183087 DEBUG nova.virt.libvirt.driver [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 26 09:03:03 compute-1 systemd[1]: Started libpod-conmon-da5df07fe6c3add65833c53811e185062c63eb02ff440018e2582eec7ce20cdf.scope.
Jan 26 09:03:03 compute-1 systemd[1]: Started libcrun container.
Jan 26 09:03:03 compute-1 nova_compute[183083]: 2026-01-26 09:03:03.990 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:03:03 compute-1 nova_compute[183083]: 2026-01-26 09:03:03.995 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 09:03:03 compute-1 nova_compute[183083]: 2026-01-26 09:03:03.998 183087 DEBUG nova.virt.libvirt.driver [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 09:03:03 compute-1 nova_compute[183083]: 2026-01-26 09:03:03.998 183087 DEBUG nova.virt.libvirt.driver [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 09:03:03 compute-1 nova_compute[183083]: 2026-01-26 09:03:03.999 183087 DEBUG nova.virt.libvirt.driver [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 09:03:04 compute-1 nova_compute[183083]: 2026-01-26 09:03:03.999 183087 DEBUG nova.virt.libvirt.driver [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 09:03:04 compute-1 nova_compute[183083]: 2026-01-26 09:03:04.000 183087 DEBUG nova.virt.libvirt.driver [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 09:03:04 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4530cb42aafbdc3bc2386ae57bae3f0dd63c2ee41fcc33ac3497061bf32f4a1b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 09:03:04 compute-1 nova_compute[183083]: 2026-01-26 09:03:04.000 183087 DEBUG nova.virt.libvirt.driver [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 09:03:04 compute-1 podman[221553]: 2026-01-26 09:03:04.015368009 +0000 UTC m=+0.301896814 container init da5df07fe6c3add65833c53811e185062c63eb02ff440018e2582eec7ce20cdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a7990128-24d3-4373-9624-cea49e6db86a, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 26 09:03:04 compute-1 podman[221553]: 2026-01-26 09:03:04.021772809 +0000 UTC m=+0.308301594 container start da5df07fe6c3add65833c53811e185062c63eb02ff440018e2582eec7ce20cdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a7990128-24d3-4373-9624-cea49e6db86a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 26 09:03:04 compute-1 nova_compute[183083]: 2026-01-26 09:03:04.030 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: d11cfdba-71df-4d49-a60f-29397352c308] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 09:03:04 compute-1 neutron-haproxy-ovnmeta-a7990128-24d3-4373-9624-cea49e6db86a[221568]: [NOTICE]   (221572) : New worker (221574) forked
Jan 26 09:03:04 compute-1 neutron-haproxy-ovnmeta-a7990128-24d3-4373-9624-cea49e6db86a[221568]: [NOTICE]   (221572) : Loading success.
Jan 26 09:03:04 compute-1 nova_compute[183083]: 2026-01-26 09:03:04.057 183087 INFO nova.compute.manager [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Took 6.49 seconds to spawn the instance on the hypervisor.
Jan 26 09:03:04 compute-1 nova_compute[183083]: 2026-01-26 09:03:04.057 183087 DEBUG nova.compute.manager [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:03:04 compute-1 nova_compute[183083]: 2026-01-26 09:03:04.118 183087 INFO nova.compute.manager [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Took 7.01 seconds to build instance.
Jan 26 09:03:04 compute-1 nova_compute[183083]: 2026-01-26 09:03:04.135 183087 DEBUG oslo_concurrency.lockutils [None req-fa51b692-dfb4-41f4-9826-89930c92ebae 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "d11cfdba-71df-4d49-a60f-29397352c308" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:03:04 compute-1 ovn_controller[95352]: 2026-01-26T09:03:04Z|00268|pinctrl|WARN|Dropped 685 log messages in last 49 seconds (most recently, 1 seconds ago) due to excessive rate
Jan 26 09:03:04 compute-1 ovn_controller[95352]: 2026-01-26T09:03:04Z|00269|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:03:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:03:05.317 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:03:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:03:05.318 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:03:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:03:05.319 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:03:05 compute-1 nova_compute[183083]: 2026-01-26 09:03:05.729 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:05 compute-1 nova_compute[183083]: 2026-01-26 09:03:05.995 183087 DEBUG nova.compute.manager [req-f6a3160b-ed20-4e48-becb-1a2d8ec13eb8 req-1306d4d2-42dc-467a-b8b6-7d79a56e9fdd 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Received event network-vif-plugged-73d02031-13f7-436d-bec5-c981a5c6c99b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:03:05 compute-1 nova_compute[183083]: 2026-01-26 09:03:05.995 183087 DEBUG oslo_concurrency.lockutils [req-f6a3160b-ed20-4e48-becb-1a2d8ec13eb8 req-1306d4d2-42dc-467a-b8b6-7d79a56e9fdd 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "d11cfdba-71df-4d49-a60f-29397352c308-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:03:05 compute-1 nova_compute[183083]: 2026-01-26 09:03:05.996 183087 DEBUG oslo_concurrency.lockutils [req-f6a3160b-ed20-4e48-becb-1a2d8ec13eb8 req-1306d4d2-42dc-467a-b8b6-7d79a56e9fdd 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "d11cfdba-71df-4d49-a60f-29397352c308-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:03:05 compute-1 nova_compute[183083]: 2026-01-26 09:03:05.997 183087 DEBUG oslo_concurrency.lockutils [req-f6a3160b-ed20-4e48-becb-1a2d8ec13eb8 req-1306d4d2-42dc-467a-b8b6-7d79a56e9fdd 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "d11cfdba-71df-4d49-a60f-29397352c308-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:03:05 compute-1 nova_compute[183083]: 2026-01-26 09:03:05.997 183087 DEBUG nova.compute.manager [req-f6a3160b-ed20-4e48-becb-1a2d8ec13eb8 req-1306d4d2-42dc-467a-b8b6-7d79a56e9fdd 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] No waiting events found dispatching network-vif-plugged-73d02031-13f7-436d-bec5-c981a5c6c99b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 09:03:05 compute-1 nova_compute[183083]: 2026-01-26 09:03:05.997 183087 WARNING nova.compute.manager [req-f6a3160b-ed20-4e48-becb-1a2d8ec13eb8 req-1306d4d2-42dc-467a-b8b6-7d79a56e9fdd 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Received unexpected event network-vif-plugged-73d02031-13f7-436d-bec5-c981a5c6c99b for instance with vm_state active and task_state None.
Jan 26 09:03:07 compute-1 nova_compute[183083]: 2026-01-26 09:03:07.628 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:08 compute-1 nova_compute[183083]: 2026-01-26 09:03:08.914 183087 DEBUG nova.compute.manager [req-e8065ebd-ab0b-46cd-8806-4a67fde3f146 req-44f55870-1bbe-44bc-a7ed-66e825c0b0b9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Received event network-changed-73d02031-13f7-436d-bec5-c981a5c6c99b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:03:08 compute-1 nova_compute[183083]: 2026-01-26 09:03:08.915 183087 DEBUG nova.compute.manager [req-e8065ebd-ab0b-46cd-8806-4a67fde3f146 req-44f55870-1bbe-44bc-a7ed-66e825c0b0b9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Refreshing instance network info cache due to event network-changed-73d02031-13f7-436d-bec5-c981a5c6c99b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 09:03:08 compute-1 nova_compute[183083]: 2026-01-26 09:03:08.915 183087 DEBUG oslo_concurrency.lockutils [req-e8065ebd-ab0b-46cd-8806-4a67fde3f146 req-44f55870-1bbe-44bc-a7ed-66e825c0b0b9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-d11cfdba-71df-4d49-a60f-29397352c308" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:03:08 compute-1 nova_compute[183083]: 2026-01-26 09:03:08.915 183087 DEBUG oslo_concurrency.lockutils [req-e8065ebd-ab0b-46cd-8806-4a67fde3f146 req-44f55870-1bbe-44bc-a7ed-66e825c0b0b9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-d11cfdba-71df-4d49-a60f-29397352c308" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:03:08 compute-1 nova_compute[183083]: 2026-01-26 09:03:08.916 183087 DEBUG nova.network.neutron [req-e8065ebd-ab0b-46cd-8806-4a67fde3f146 req-44f55870-1bbe-44bc-a7ed-66e825c0b0b9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Refreshing network info cache for port 73d02031-13f7-436d-bec5-c981a5c6c99b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 09:03:10 compute-1 nova_compute[183083]: 2026-01-26 09:03:10.267 183087 DEBUG nova.network.neutron [req-e8065ebd-ab0b-46cd-8806-4a67fde3f146 req-44f55870-1bbe-44bc-a7ed-66e825c0b0b9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Updated VIF entry in instance network info cache for port 73d02031-13f7-436d-bec5-c981a5c6c99b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 09:03:10 compute-1 nova_compute[183083]: 2026-01-26 09:03:10.267 183087 DEBUG nova.network.neutron [req-e8065ebd-ab0b-46cd-8806-4a67fde3f146 req-44f55870-1bbe-44bc-a7ed-66e825c0b0b9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Updating instance_info_cache with network_info: [{"id": "73d02031-13f7-436d-bec5-c981a5c6c99b", "address": "fa:16:3e:b0:5c:01", "network": {"id": "a7990128-24d3-4373-9624-cea49e6db86a", "bridge": "br-int", "label": "tempest-test-network--1014603776", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73d02031-13", "ovs_interfaceid": "73d02031-13f7-436d-bec5-c981a5c6c99b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:03:10 compute-1 nova_compute[183083]: 2026-01-26 09:03:10.287 183087 DEBUG oslo_concurrency.lockutils [req-e8065ebd-ab0b-46cd-8806-4a67fde3f146 req-44f55870-1bbe-44bc-a7ed-66e825c0b0b9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-d11cfdba-71df-4d49-a60f-29397352c308" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:03:10 compute-1 sshd-session[221583]: Accepted publickey for zuul from 38.102.83.66 port 50638 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:03:10 compute-1 systemd-logind[788]: New session 62 of user zuul.
Jan 26 09:03:10 compute-1 systemd[1]: Started Session 62 of User zuul.
Jan 26 09:03:10 compute-1 sshd-session[221583]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:03:10 compute-1 sshd-session[221587]: Accepted publickey for zuul from 38.102.83.66 port 50642 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:03:10 compute-1 systemd-logind[788]: New session 63 of user zuul.
Jan 26 09:03:10 compute-1 systemd[1]: Started Session 63 of User zuul.
Jan 26 09:03:10 compute-1 sshd-session[221587]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:03:10 compute-1 sudo[221591]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/mktemp
Jan 26 09:03:10 compute-1 sudo[221591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:03:10 compute-1 sudo[221591]: pam_unix(sudo:session): session closed for user root
Jan 26 09:03:10 compute-1 sudo[221616]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 tcpdump -s0 -Uni eth1 icmp and ether host fa:16:3e:b0:ef:ce -w /tmp/tmp.uH6uMfn0Lv
Jan 26 09:03:10 compute-1 sudo[221616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:03:10 compute-1 sshd-session[221590]: Connection closed by 38.102.83.66 port 50642
Jan 26 09:03:10 compute-1 nova_compute[183083]: 2026-01-26 09:03:10.782 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:10 compute-1 sshd-session[221587]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:03:10 compute-1 systemd[1]: session-63.scope: Deactivated successfully.
Jan 26 09:03:10 compute-1 systemd-logind[788]: Session 63 logged out. Waiting for processes to exit.
Jan 26 09:03:10 compute-1 systemd-logind[788]: Removed session 63.
Jan 26 09:03:11 compute-1 nova_compute[183083]: 2026-01-26 09:03:11.967 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:03:11 compute-1 nova_compute[183083]: 2026-01-26 09:03:11.967 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:03:11 compute-1 nova_compute[183083]: 2026-01-26 09:03:11.967 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:03:12 compute-1 nova_compute[183083]: 2026-01-26 09:03:12.093 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "refresh_cache-d11cfdba-71df-4d49-a60f-29397352c308" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:03:12 compute-1 nova_compute[183083]: 2026-01-26 09:03:12.094 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquired lock "refresh_cache-d11cfdba-71df-4d49-a60f-29397352c308" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:03:12 compute-1 nova_compute[183083]: 2026-01-26 09:03:12.094 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 26 09:03:12 compute-1 nova_compute[183083]: 2026-01-26 09:03:12.094 183087 DEBUG nova.objects.instance [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lazy-loading 'info_cache' on Instance uuid d11cfdba-71df-4d49-a60f-29397352c308 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 09:03:12 compute-1 nova_compute[183083]: 2026-01-26 09:03:12.629 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:13 compute-1 nova_compute[183083]: 2026-01-26 09:03:13.227 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Updating instance_info_cache with network_info: [{"id": "73d02031-13f7-436d-bec5-c981a5c6c99b", "address": "fa:16:3e:b0:5c:01", "network": {"id": "a7990128-24d3-4373-9624-cea49e6db86a", "bridge": "br-int", "label": "tempest-test-network--1014603776", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73d02031-13", "ovs_interfaceid": "73d02031-13f7-436d-bec5-c981a5c6c99b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:03:13 compute-1 nova_compute[183083]: 2026-01-26 09:03:13.284 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Releasing lock "refresh_cache-d11cfdba-71df-4d49-a60f-29397352c308" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:03:13 compute-1 nova_compute[183083]: 2026-01-26 09:03:13.284 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 26 09:03:13 compute-1 nova_compute[183083]: 2026-01-26 09:03:13.285 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:03:14 compute-1 nova_compute[183083]: 2026-01-26 09:03:14.265 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:03:15 compute-1 ovn_controller[95352]: 2026-01-26T09:03:15Z|00043|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b0:5c:01 10.100.0.28
Jan 26 09:03:15 compute-1 ovn_controller[95352]: 2026-01-26T09:03:15Z|00044|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b0:5c:01 10.100.0.28
Jan 26 09:03:15 compute-1 nova_compute[183083]: 2026-01-26 09:03:15.787 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:15 compute-1 nova_compute[183083]: 2026-01-26 09:03:15.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:03:15 compute-1 nova_compute[183083]: 2026-01-26 09:03:15.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:03:17 compute-1 nova_compute[183083]: 2026-01-26 09:03:17.631 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:17 compute-1 podman[221653]: 2026-01-26 09:03:17.8225331 +0000 UTC m=+0.086558294 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:03:17 compute-1 podman[221654]: 2026-01-26 09:03:17.826493232 +0000 UTC m=+0.079476134 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc.)
Jan 26 09:03:17 compute-1 nova_compute[183083]: 2026-01-26 09:03:17.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:03:17 compute-1 nova_compute[183083]: 2026-01-26 09:03:17.953 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:03:18 compute-1 sudo[220714]: pam_unix(sudo:session): session closed for user root
Jan 26 09:03:18 compute-1 nova_compute[183083]: 2026-01-26 09:03:18.947 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:03:19 compute-1 sshd-session[221693]: Accepted publickey for zuul from 38.102.83.66 port 46830 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:03:19 compute-1 systemd-logind[788]: New session 64 of user zuul.
Jan 26 09:03:19 compute-1 systemd[1]: Started Session 64 of User zuul.
Jan 26 09:03:19 compute-1 sshd-session[221693]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:03:19 compute-1 sudo[221697]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 cat /tmp/tmp.uH6uMfn0Lv
Jan 26 09:03:19 compute-1 nova_compute[183083]: 2026-01-26 09:03:19.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:03:19 compute-1 nova_compute[183083]: 2026-01-26 09:03:19.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:03:19 compute-1 sudo[221697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:03:19 compute-1 sudo[221697]: pam_unix(sudo:session): session closed for user root
Jan 26 09:03:20 compute-1 podman[221723]: 2026-01-26 09:03:20.066197617 +0000 UTC m=+0.086372059 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 09:03:20 compute-1 podman[221722]: 2026-01-26 09:03:20.084928696 +0000 UTC m=+0.108901116 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 26 09:03:20 compute-1 podman[221721]: 2026-01-26 09:03:20.087374815 +0000 UTC m=+0.115675127 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:03:20 compute-1 nova_compute[183083]: 2026-01-26 09:03:20.824 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:20 compute-1 nova_compute[183083]: 2026-01-26 09:03:20.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:03:20 compute-1 sshd-session[221792]: Accepted publickey for zuul from 38.102.83.66 port 46834 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:03:20 compute-1 nova_compute[183083]: 2026-01-26 09:03:20.978 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:03:20 compute-1 nova_compute[183083]: 2026-01-26 09:03:20.978 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:03:20 compute-1 nova_compute[183083]: 2026-01-26 09:03:20.979 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:03:20 compute-1 nova_compute[183083]: 2026-01-26 09:03:20.979 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:03:20 compute-1 systemd-logind[788]: New session 65 of user zuul.
Jan 26 09:03:20 compute-1 systemd[1]: Started Session 65 of User zuul.
Jan 26 09:03:21 compute-1 sshd-session[221792]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:03:21 compute-1 nova_compute[183083]: 2026-01-26 09:03:21.057 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d11cfdba-71df-4d49-a60f-29397352c308/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:03:21 compute-1 nova_compute[183083]: 2026-01-26 09:03:21.139 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d11cfdba-71df-4d49-a60f-29397352c308/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:03:21 compute-1 nova_compute[183083]: 2026-01-26 09:03:21.141 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d11cfdba-71df-4d49-a60f-29397352c308/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:03:21 compute-1 sshd-session[221797]: Accepted publickey for zuul from 38.102.83.66 port 46848 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:03:21 compute-1 systemd-logind[788]: New session 66 of user zuul.
Jan 26 09:03:21 compute-1 systemd[1]: Started Session 66 of User zuul.
Jan 26 09:03:21 compute-1 sshd-session[221797]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:03:21 compute-1 nova_compute[183083]: 2026-01-26 09:03:21.238 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d11cfdba-71df-4d49-a60f-29397352c308/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:03:21 compute-1 sudo[221807]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/mktemp
Jan 26 09:03:21 compute-1 sudo[221807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:03:21 compute-1 sudo[221807]: pam_unix(sudo:session): session closed for user root
Jan 26 09:03:21 compute-1 sshd-session[221804]: Connection closed by 38.102.83.66 port 46848
Jan 26 09:03:21 compute-1 sshd-session[221797]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:03:21 compute-1 sudo[221832]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 tcpdump -s0 -Uni eth1 icmp and ether host fa:16:3e:83:0d:53 -w /tmp/tmp.OzD50otnIU
Jan 26 09:03:21 compute-1 sudo[221832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:03:21 compute-1 systemd-logind[788]: Session 66 logged out. Waiting for processes to exit.
Jan 26 09:03:21 compute-1 systemd[1]: session-66.scope: Deactivated successfully.
Jan 26 09:03:21 compute-1 systemd-logind[788]: Removed session 66.
Jan 26 09:03:21 compute-1 nova_compute[183083]: 2026-01-26 09:03:21.472 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:03:21 compute-1 nova_compute[183083]: 2026-01-26 09:03:21.474 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13542MB free_disk=113.06499862670898GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:03:21 compute-1 nova_compute[183083]: 2026-01-26 09:03:21.474 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:03:21 compute-1 nova_compute[183083]: 2026-01-26 09:03:21.474 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:03:21 compute-1 nova_compute[183083]: 2026-01-26 09:03:21.658 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Instance d11cfdba-71df-4d49-a60f-29397352c308 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 09:03:21 compute-1 nova_compute[183083]: 2026-01-26 09:03:21.658 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:03:21 compute-1 nova_compute[183083]: 2026-01-26 09:03:21.659 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=640MB phys_disk=119GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:03:21 compute-1 nova_compute[183083]: 2026-01-26 09:03:21.749 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:03:21 compute-1 nova_compute[183083]: 2026-01-26 09:03:21.776 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:03:21 compute-1 nova_compute[183083]: 2026-01-26 09:03:21.805 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:03:21 compute-1 nova_compute[183083]: 2026-01-26 09:03:21.806 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.332s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:03:22 compute-1 nova_compute[183083]: 2026-01-26 09:03:22.634 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:25 compute-1 nova_compute[183083]: 2026-01-26 09:03:25.855 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:27 compute-1 nova_compute[183083]: 2026-01-26 09:03:27.638 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:30 compute-1 sshd-session[221858]: Accepted publickey for zuul from 38.102.83.66 port 52502 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:03:30 compute-1 systemd-logind[788]: New session 67 of user zuul.
Jan 26 09:03:30 compute-1 systemd[1]: Started Session 67 of User zuul.
Jan 26 09:03:30 compute-1 sshd-session[221858]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:03:30 compute-1 sudo[221862]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 cat /tmp/tmp.OzD50otnIU
Jan 26 09:03:30 compute-1 sudo[221862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:03:30 compute-1 sudo[221862]: pam_unix(sudo:session): session closed for user root
Jan 26 09:03:30 compute-1 nova_compute[183083]: 2026-01-26 09:03:30.889 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:31 compute-1 sshd-session[221888]: Accepted publickey for zuul from 38.102.83.66 port 52516 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:03:31 compute-1 systemd-logind[788]: New session 68 of user zuul.
Jan 26 09:03:31 compute-1 systemd[1]: Started Session 68 of User zuul.
Jan 26 09:03:31 compute-1 sshd-session[221888]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:03:31 compute-1 sshd-session[221892]: Accepted publickey for zuul from 38.102.83.66 port 52528 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:03:31 compute-1 systemd-logind[788]: New session 69 of user zuul.
Jan 26 09:03:31 compute-1 systemd[1]: Started Session 69 of User zuul.
Jan 26 09:03:31 compute-1 sshd-session[221892]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:03:31 compute-1 sudo[221896]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/mktemp
Jan 26 09:03:31 compute-1 sudo[221896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:03:31 compute-1 sudo[221896]: pam_unix(sudo:session): session closed for user root
Jan 26 09:03:32 compute-1 sudo[221921]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 tcpdump -s0 -Uni genev_sys_6081 icmp and ether host fa:16:3e:e0:2a:17 and ether host fa:16:3e:b0:5c:01 -w /tmp/tmp.f0ETVFuXUO
Jan 26 09:03:32 compute-1 sudo[221921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:03:32 compute-1 sshd-session[221895]: Connection closed by 38.102.83.66 port 52528
Jan 26 09:03:32 compute-1 sshd-session[221892]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:03:32 compute-1 systemd[1]: session-69.scope: Deactivated successfully.
Jan 26 09:03:32 compute-1 systemd-logind[788]: Session 69 logged out. Waiting for processes to exit.
Jan 26 09:03:32 compute-1 systemd-logind[788]: Removed session 69.
Jan 26 09:03:32 compute-1 nova_compute[183083]: 2026-01-26 09:03:32.641 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:33 compute-1 podman[221947]: 2026-01-26 09:03:33.825462497 +0000 UTC m=+0.082664345 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 26 09:03:35 compute-1 nova_compute[183083]: 2026-01-26 09:03:35.928 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:37 compute-1 nova_compute[183083]: 2026-01-26 09:03:37.645 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:38 compute-1 ovn_controller[95352]: 2026-01-26T09:03:38Z|00270|memory_trim|INFO|Detected inactivity (last active 30019 ms ago): trimming memory
Jan 26 09:03:40 compute-1 sshd-session[221973]: Accepted publickey for zuul from 38.102.83.66 port 33264 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:03:40 compute-1 systemd-logind[788]: New session 70 of user zuul.
Jan 26 09:03:40 compute-1 systemd[1]: Started Session 70 of User zuul.
Jan 26 09:03:40 compute-1 sshd-session[221973]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:03:40 compute-1 nova_compute[183083]: 2026-01-26 09:03:40.931 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:40 compute-1 sudo[221977]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 cat /tmp/tmp.f0ETVFuXUO
Jan 26 09:03:40 compute-1 sudo[221977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:03:41 compute-1 sudo[221977]: pam_unix(sudo:session): session closed for user root
Jan 26 09:03:42 compute-1 nova_compute[183083]: 2026-01-26 09:03:42.720 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:45 compute-1 nova_compute[183083]: 2026-01-26 09:03:45.732 183087 DEBUG nova.virt.libvirt.driver [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Creating tmpfile /var/lib/nova/instances/tmpu2ru3cb0 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Jan 26 09:03:45 compute-1 nova_compute[183083]: 2026-01-26 09:03:45.834 183087 DEBUG nova.compute.manager [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=114688,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpu2ru3cb0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Jan 26 09:03:45 compute-1 nova_compute[183083]: 2026-01-26 09:03:45.888 183087 DEBUG oslo_concurrency.lockutils [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:03:45 compute-1 nova_compute[183083]: 2026-01-26 09:03:45.889 183087 DEBUG oslo_concurrency.lockutils [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:03:45 compute-1 nova_compute[183083]: 2026-01-26 09:03:45.918 183087 INFO nova.compute.rpcapi [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66
Jan 26 09:03:45 compute-1 nova_compute[183083]: 2026-01-26 09:03:45.919 183087 DEBUG oslo_concurrency.lockutils [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:03:45 compute-1 nova_compute[183083]: 2026-01-26 09:03:45.935 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:47 compute-1 nova_compute[183083]: 2026-01-26 09:03:47.723 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:47 compute-1 nova_compute[183083]: 2026-01-26 09:03:47.849 183087 DEBUG nova.compute.manager [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=114688,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpu2ru3cb0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='bcd4a434-a1cf-402b-87c3-8d39bc284a82',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Jan 26 09:03:47 compute-1 nova_compute[183083]: 2026-01-26 09:03:47.927 183087 DEBUG oslo_concurrency.lockutils [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "refresh_cache-bcd4a434-a1cf-402b-87c3-8d39bc284a82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:03:47 compute-1 nova_compute[183083]: 2026-01-26 09:03:47.927 183087 DEBUG oslo_concurrency.lockutils [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquired lock "refresh_cache-bcd4a434-a1cf-402b-87c3-8d39bc284a82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:03:47 compute-1 nova_compute[183083]: 2026-01-26 09:03:47.928 183087 DEBUG nova.network.neutron [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 09:03:48 compute-1 podman[222003]: 2026-01-26 09:03:48.824269587 +0000 UTC m=+0.090081794 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 26 09:03:48 compute-1 podman[222004]: 2026-01-26 09:03:48.832697915 +0000 UTC m=+0.090207217 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.openshift.expose-services=, config_id=openstack_network_exporter, distribution-scope=public, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 26 09:03:50 compute-1 podman[222047]: 2026-01-26 09:03:50.830406417 +0000 UTC m=+0.071072836 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Jan 26 09:03:50 compute-1 podman[222048]: 2026-01-26 09:03:50.845240956 +0000 UTC m=+0.082866619 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 09:03:50 compute-1 podman[222045]: 2026-01-26 09:03:50.883337742 +0000 UTC m=+0.131788091 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 09:03:50 compute-1 nova_compute[183083]: 2026-01-26 09:03:50.937 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:51 compute-1 sshd-session[222044]: Invalid user solv from 2.57.122.238 port 58610
Jan 26 09:03:51 compute-1 sshd-session[222044]: Connection closed by invalid user solv 2.57.122.238 port 58610 [preauth]
Jan 26 09:03:52 compute-1 nova_compute[183083]: 2026-01-26 09:03:52.749 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:52 compute-1 nova_compute[183083]: 2026-01-26 09:03:52.936 183087 DEBUG nova.network.neutron [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Updating instance_info_cache with network_info: [{"id": "896fca22-d4bb-4060-89c0-72ac1b8f6dd7", "address": "fa:16:3e:e0:2a:17", "network": {"id": "a7990128-24d3-4373-9624-cea49e6db86a", "bridge": "br-int", "label": "tempest-test-network--1014603776", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap896fca22-d4", "ovs_interfaceid": "896fca22-d4bb-4060-89c0-72ac1b8f6dd7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:03:52 compute-1 nova_compute[183083]: 2026-01-26 09:03:52.955 183087 DEBUG oslo_concurrency.lockutils [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Releasing lock "refresh_cache-bcd4a434-a1cf-402b-87c3-8d39bc284a82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:03:52 compute-1 nova_compute[183083]: 2026-01-26 09:03:52.957 183087 DEBUG nova.virt.libvirt.driver [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=114688,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpu2ru3cb0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='bcd4a434-a1cf-402b-87c3-8d39bc284a82',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Jan 26 09:03:52 compute-1 nova_compute[183083]: 2026-01-26 09:03:52.957 183087 DEBUG nova.virt.libvirt.driver [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Creating instance directory: /var/lib/nova/instances/bcd4a434-a1cf-402b-87c3-8d39bc284a82 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Jan 26 09:03:52 compute-1 nova_compute[183083]: 2026-01-26 09:03:52.958 183087 DEBUG nova.virt.libvirt.driver [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Creating disk.info with the contents: {'/var/lib/nova/instances/bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk': 'qcow2', '/var/lib/nova/instances/bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Jan 26 09:03:52 compute-1 nova_compute[183083]: 2026-01-26 09:03:52.958 183087 DEBUG nova.virt.libvirt.driver [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Jan 26 09:03:52 compute-1 nova_compute[183083]: 2026-01-26 09:03:52.959 183087 DEBUG nova.objects.instance [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lazy-loading 'trusted_certs' on Instance uuid bcd4a434-a1cf-402b-87c3-8d39bc284a82 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 09:03:52 compute-1 nova_compute[183083]: 2026-01-26 09:03:52.980 183087 DEBUG oslo_concurrency.processutils [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.056 183087 DEBUG oslo_concurrency.processutils [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.057 183087 DEBUG oslo_concurrency.lockutils [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.058 183087 DEBUG oslo_concurrency.lockutils [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.069 183087 DEBUG oslo_concurrency.processutils [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.139 183087 DEBUG oslo_concurrency.processutils [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.142 183087 DEBUG oslo_concurrency.processutils [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.189 183087 DEBUG oslo_concurrency.processutils [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk 1073741824" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.190 183087 DEBUG oslo_concurrency.lockutils [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.190 183087 DEBUG oslo_concurrency.processutils [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.244 183087 DEBUG oslo_concurrency.processutils [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.246 183087 DEBUG nova.virt.disk.api [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Checking if we can resize image /var/lib/nova/instances/bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.246 183087 DEBUG oslo_concurrency.processutils [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.304 183087 DEBUG oslo_concurrency.processutils [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.306 183087 DEBUG nova.virt.disk.api [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Cannot resize image /var/lib/nova/instances/bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.307 183087 DEBUG nova.objects.instance [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lazy-loading 'migration_context' on Instance uuid bcd4a434-a1cf-402b-87c3-8d39bc284a82 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.328 183087 DEBUG oslo_concurrency.processutils [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.363 183087 DEBUG oslo_concurrency.processutils [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk.config 485376" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.364 183087 DEBUG nova.virt.libvirt.volume.remotefs [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk.config to /var/lib/nova/instances/bcd4a434-a1cf-402b-87c3-8d39bc284a82 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.365 183087 DEBUG oslo_concurrency.processutils [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk.config /var/lib/nova/instances/bcd4a434-a1cf-402b-87c3-8d39bc284a82 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.856 183087 DEBUG oslo_concurrency.processutils [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk.config /var/lib/nova/instances/bcd4a434-a1cf-402b-87c3-8d39bc284a82" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.857 183087 DEBUG nova.virt.libvirt.driver [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.858 183087 DEBUG nova.virt.libvirt.vif [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T09:02:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-server-test-1081491655',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1081491655',id=47,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCVUzFf6wSdyelY/UeY7cqTv8O0no1sfyU2c1QC2Iq0A42faUChCXJr9C1AIPerWETKRC47CWhTspqdq2Y/jF5MXoBNhFYSHEELzhDImF6CTLJqQZ+4txACe/VixshXUeg==',key_name='tempest-keypair-47432938',keypairs=<?>,launch_index=0,launched_at=2026-01-26T09:02:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2580bb16c90849c4b5919eb271774a06',ramdisk_id='',reservation_id='r-x2br1yxl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_
bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-OvnDvrTest-691788706',owner_user_name='tempest-OvnDvrTest-691788706-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T09:02:47Z,user_data=None,user_id='90104736f4ab4d81b09d1ff11e40f454',uuid=bcd4a434-a1cf-402b-87c3-8d39bc284a82,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "896fca22-d4bb-4060-89c0-72ac1b8f6dd7", "address": "fa:16:3e:e0:2a:17", "network": {"id": "a7990128-24d3-4373-9624-cea49e6db86a", "bridge": "br-int", "label": "tempest-test-network--1014603776", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap896fca22-d4", "ovs_interfaceid": "896fca22-d4bb-4060-89c0-72ac1b8f6dd7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.859 183087 DEBUG nova.network.os_vif_util [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Converting VIF {"id": "896fca22-d4bb-4060-89c0-72ac1b8f6dd7", "address": "fa:16:3e:e0:2a:17", "network": {"id": "a7990128-24d3-4373-9624-cea49e6db86a", "bridge": "br-int", "label": "tempest-test-network--1014603776", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap896fca22-d4", "ovs_interfaceid": "896fca22-d4bb-4060-89c0-72ac1b8f6dd7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.860 183087 DEBUG nova.network.os_vif_util [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e0:2a:17,bridge_name='br-int',has_traffic_filtering=True,id=896fca22-d4bb-4060-89c0-72ac1b8f6dd7,network=Network(a7990128-24d3-4373-9624-cea49e6db86a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap896fca22-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.860 183087 DEBUG os_vif [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e0:2a:17,bridge_name='br-int',has_traffic_filtering=True,id=896fca22-d4bb-4060-89c0-72ac1b8f6dd7,network=Network(a7990128-24d3-4373-9624-cea49e6db86a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap896fca22-d4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.861 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.862 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.863 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.866 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.866 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap896fca22-d4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.866 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap896fca22-d4, col_values=(('external_ids', {'iface-id': '896fca22-d4bb-4060-89c0-72ac1b8f6dd7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e0:2a:17', 'vm-uuid': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.881 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:53 compute-1 NetworkManager[55451]: <info>  [1769418233.8832] manager: (tap896fca22-d4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/97)
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.885 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.890 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.893 183087 INFO os_vif [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e0:2a:17,bridge_name='br-int',has_traffic_filtering=True,id=896fca22-d4bb-4060-89c0-72ac1b8f6dd7,network=Network(a7990128-24d3-4373-9624-cea49e6db86a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap896fca22-d4')
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.894 183087 DEBUG nova.virt.libvirt.driver [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Jan 26 09:03:53 compute-1 nova_compute[183083]: 2026-01-26 09:03:53.895 183087 DEBUG nova.compute.manager [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=114688,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpu2ru3cb0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='bcd4a434-a1cf-402b-87c3-8d39bc284a82',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Jan 26 09:03:54 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:03:54.922 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:03:54 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:03:54.923 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:03:54 compute-1 nova_compute[183083]: 2026-01-26 09:03:54.954 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:55 compute-1 nova_compute[183083]: 2026-01-26 09:03:55.270 183087 DEBUG nova.network.neutron [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Port 896fca22-d4bb-4060-89c0-72ac1b8f6dd7 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Jan 26 09:03:55 compute-1 nova_compute[183083]: 2026-01-26 09:03:55.272 183087 DEBUG nova.compute.manager [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=114688,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpu2ru3cb0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='bcd4a434-a1cf-402b-87c3-8d39bc284a82',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Jan 26 09:03:55 compute-1 systemd[1]: Starting libvirt proxy daemon...
Jan 26 09:03:55 compute-1 systemd[1]: Started libvirt proxy daemon.
Jan 26 09:03:55 compute-1 kernel: tap896fca22-d4: entered promiscuous mode
Jan 26 09:03:55 compute-1 NetworkManager[55451]: <info>  [1769418235.6140] manager: (tap896fca22-d4): new Tun device (/org/freedesktop/NetworkManager/Devices/98)
Jan 26 09:03:55 compute-1 ovn_controller[95352]: 2026-01-26T09:03:55Z|00271|binding|INFO|Claiming lport 896fca22-d4bb-4060-89c0-72ac1b8f6dd7 for this additional chassis.
Jan 26 09:03:55 compute-1 ovn_controller[95352]: 2026-01-26T09:03:55Z|00272|binding|INFO|896fca22-d4bb-4060-89c0-72ac1b8f6dd7: Claiming fa:16:3e:e0:2a:17 10.100.0.21
Jan 26 09:03:55 compute-1 nova_compute[183083]: 2026-01-26 09:03:55.615 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:55 compute-1 nova_compute[183083]: 2026-01-26 09:03:55.629 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:55 compute-1 ovn_controller[95352]: 2026-01-26T09:03:55Z|00273|binding|INFO|Setting lport 896fca22-d4bb-4060-89c0-72ac1b8f6dd7 ovn-installed in OVS
Jan 26 09:03:55 compute-1 nova_compute[183083]: 2026-01-26 09:03:55.631 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:55 compute-1 nova_compute[183083]: 2026-01-26 09:03:55.633 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:55 compute-1 systemd-udevd[222169]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 09:03:55 compute-1 systemd-machined[154360]: New machine qemu-17-instance-0000002f.
Jan 26 09:03:55 compute-1 NetworkManager[55451]: <info>  [1769418235.6774] device (tap896fca22-d4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 09:03:55 compute-1 systemd[1]: Started Virtual Machine qemu-17-instance-0000002f.
Jan 26 09:03:55 compute-1 NetworkManager[55451]: <info>  [1769418235.6785] device (tap896fca22-d4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 09:03:56 compute-1 nova_compute[183083]: 2026-01-26 09:03:56.317 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769418236.3165524, bcd4a434-a1cf-402b-87c3-8d39bc284a82 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 09:03:56 compute-1 nova_compute[183083]: 2026-01-26 09:03:56.318 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] VM Started (Lifecycle Event)
Jan 26 09:03:56 compute-1 nova_compute[183083]: 2026-01-26 09:03:56.340 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:03:57 compute-1 nova_compute[183083]: 2026-01-26 09:03:57.181 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769418237.1806538, bcd4a434-a1cf-402b-87c3-8d39bc284a82 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 09:03:57 compute-1 nova_compute[183083]: 2026-01-26 09:03:57.181 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] VM Resumed (Lifecycle Event)
Jan 26 09:03:57 compute-1 nova_compute[183083]: 2026-01-26 09:03:57.205 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:03:57 compute-1 nova_compute[183083]: 2026-01-26 09:03:57.208 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 09:03:57 compute-1 nova_compute[183083]: 2026-01-26 09:03:57.241 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com
Jan 26 09:03:57 compute-1 nova_compute[183083]: 2026-01-26 09:03:57.752 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:58 compute-1 nova_compute[183083]: 2026-01-26 09:03:58.924 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:03:59 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:03:59.925 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:04:00 compute-1 ovn_controller[95352]: 2026-01-26T09:04:00Z|00274|binding|INFO|Claiming lport 896fca22-d4bb-4060-89c0-72ac1b8f6dd7 for this chassis.
Jan 26 09:04:00 compute-1 ovn_controller[95352]: 2026-01-26T09:04:00Z|00275|binding|INFO|896fca22-d4bb-4060-89c0-72ac1b8f6dd7: Claiming fa:16:3e:e0:2a:17 10.100.0.21
Jan 26 09:04:00 compute-1 ovn_controller[95352]: 2026-01-26T09:04:00Z|00276|binding|INFO|Setting lport 896fca22-d4bb-4060-89c0-72ac1b8f6dd7 up in Southbound
Jan 26 09:04:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:04:00.275 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:2a:17 10.100.0.21'], port_security=['fa:16:3e:e0:2a:17 10.100.0.21'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.21/28', 'neutron:device_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a7990128-24d3-4373-9624-cea49e6db86a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2580bb16c90849c4b5919eb271774a06', 'neutron:revision_number': '11', 'neutron:security_group_ids': '30ca606d-e02a-4090-8787-8ade3dd6e02d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.178'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b6e8b8dd-7a7a-42f2-93d9-90eaac5d38d1, chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=896fca22-d4bb-4060-89c0-72ac1b8f6dd7) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:04:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:04:00.278 104632 INFO neutron.agent.ovn.metadata.agent [-] Port 896fca22-d4bb-4060-89c0-72ac1b8f6dd7 in datapath a7990128-24d3-4373-9624-cea49e6db86a bound to our chassis
Jan 26 09:04:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:04:00.281 104632 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a7990128-24d3-4373-9624-cea49e6db86a
Jan 26 09:04:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:04:00.302 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[a5327618-e977-4f66-b70b-b788e48a885a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:04:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:04:00.354 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[a78ce2db-f601-4f47-bbd8-e63ea8d3e7ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:04:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:04:00.359 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[2cb619cb-2a3c-4055-a403-b4a02b60f36e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:04:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:04:00.405 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[5683414c-737a-4e27-919f-c02c6e4f9dbc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:04:00 compute-1 nova_compute[183083]: 2026-01-26 09:04:00.409 183087 INFO nova.compute.manager [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Post operation of migration started
Jan 26 09:04:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:04:00.434 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[c546bba8-2df5-4705-bc1a-fc54c27fed93]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa7990128-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d3:fd:3e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 6, 'rx_bytes': 826, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 6, 'rx_bytes': 826, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 452379, 'reachable_time': 20812, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222205, 'error': None, 'target': 'ovnmeta-a7990128-24d3-4373-9624-cea49e6db86a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:04:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:04:00.458 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[e6e76a63-5767-4757-8b37-96a04715efe3]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa7990128-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 452390, 'tstamp': 452390}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222206, 'error': None, 'target': 'ovnmeta-a7990128-24d3-4373-9624-cea49e6db86a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.18'], ['IFA_LOCAL', '10.100.0.18'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tapa7990128-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 452393, 'tstamp': 452393}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222206, 'error': None, 'target': 'ovnmeta-a7990128-24d3-4373-9624-cea49e6db86a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:04:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:04:00.460 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa7990128-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:04:00 compute-1 nova_compute[183083]: 2026-01-26 09:04:00.463 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:04:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:04:00.464 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa7990128-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:04:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:04:00.465 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 09:04:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:04:00.465 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa7990128-20, col_values=(('external_ids', {'iface-id': 'e4a9eb60-6fc2-4712-a497-5aead39b5136'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:04:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:04:00.466 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 09:04:00 compute-1 nova_compute[183083]: 2026-01-26 09:04:00.842 183087 DEBUG oslo_concurrency.lockutils [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "refresh_cache-bcd4a434-a1cf-402b-87c3-8d39bc284a82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:04:00 compute-1 nova_compute[183083]: 2026-01-26 09:04:00.842 183087 DEBUG oslo_concurrency.lockutils [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquired lock "refresh_cache-bcd4a434-a1cf-402b-87c3-8d39bc284a82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:04:00 compute-1 nova_compute[183083]: 2026-01-26 09:04:00.843 183087 DEBUG nova.network.neutron [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 09:04:02 compute-1 nova_compute[183083]: 2026-01-26 09:04:02.756 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.747 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82', 'name': 'tempest-server-test-1081491655', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000002f', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '2580bb16c90849c4b5919eb271774a06', 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'hostId': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.751 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'd11cfdba-71df-4d49-a60f-29397352c308', 'name': 'tempest-server-test-697225216', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000030', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '2580bb16c90849c4b5919eb271774a06', 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'hostId': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.751 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.755 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for bcd4a434-a1cf-402b-87c3-8d39bc284a82 / tap896fca22-d4 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.755 12 DEBUG ceilometer.compute.pollsters [-] bcd4a434-a1cf-402b-87c3-8d39bc284a82/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.760 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for d11cfdba-71df-4d49-a60f-29397352c308 / tap73d02031-13 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.760 12 DEBUG ceilometer.compute.pollsters [-] d11cfdba-71df-4d49-a60f-29397352c308/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b1857d3-675c-4183-bfe0-26ee94ab3ce5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-0000002f-bcd4a434-a1cf-402b-87c3-8d39bc284a82-tap896fca22-d4', 'timestamp': '2026-01-26T09:04:03.752155', 'resource_metadata': {'display_name': 'tempest-server-test-1081491655', 'name': 'tap896fca22-d4', 'instance_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e0:2a:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap896fca22-d4'}, 'message_id': 'f64969ba-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.414159324, 'message_signature': '94f5854ea39be60b9527e3cbd86475e1fbf1efec17c3e9b1ad9e07730eac5206'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-00000030-d11cfdba-71df-4d49-a60f-29397352c308-tap73d02031-13', 'timestamp': '2026-01-26T09:04:03.752155', 'resource_metadata': {'display_name': 'tempest-server-test-697225216', 'name': 'tap73d02031-13', 'instance_id': 'd11cfdba-71df-4d49-a60f-29397352c308', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b0:5c:01', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap73d02031-13'}, 'message_id': 'f64a1964-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.418537596, 'message_signature': '717a96c03297bc6a7bf4b5cf37a5311826251d2899ca355f7079f5aeffe7cd7f'}]}, 'timestamp': '2026-01-26 09:04:03.761044', '_unique_id': '4156ff81234e48a6b169b7d44af7bab9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.762 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.764 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.781 12 DEBUG ceilometer.compute.pollsters [-] bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.782 12 DEBUG ceilometer.compute.pollsters [-] bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.800 12 DEBUG ceilometer.compute.pollsters [-] d11cfdba-71df-4d49-a60f-29397352c308/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.801 12 DEBUG ceilometer.compute.pollsters [-] d11cfdba-71df-4d49-a60f-29397352c308/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f7a3e7a-6071-4bea-b277-f51fce9ae2fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82-vda', 'timestamp': '2026-01-26T09:04:03.764689', 'resource_metadata': {'display_name': 'tempest-server-test-1081491655', 'name': 'instance-0000002f', 'instance_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f64d5840-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.426691074, 'message_signature': 'f9e9b6d5026d8d728f7c8ad79afb151741567c3f6c8cb0f08e6f268eaf2db5a5'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 
'bcd4a434-a1cf-402b-87c3-8d39bc284a82-sda', 'timestamp': '2026-01-26T09:04:03.764689', 'resource_metadata': {'display_name': 'tempest-server-test-1081491655', 'name': 'instance-0000002f', 'instance_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f64d6f42-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.426691074, 'message_signature': 'c56161b2678ac6f16619db9114e8c69406e5f27aeee254d1476b6d49f6c6858b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'd11cfdba-71df-4d49-a60f-29397352c308-vda', 'timestamp': '2026-01-26T09:04:03.764689', 'resource_metadata': {'display_name': 'tempest-server-test-697225216', 'name': 'instance-00000030', 'instance_id': 'd11cfdba-71df-4d49-a60f-29397352c308', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': 
'13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f6503ac4-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.44482278, 'message_signature': '70d6ce5d566330b57969f04497645aad59a67b8ebf1b1cbb2f028500ed9c867f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'd11cfdba-71df-4d49-a60f-29397352c308-sda', 'timestamp': '2026-01-26T09:04:03.764689', 'resource_metadata': {'display_name': 'tempest-server-test-697225216', 'name': 'instance-00000030', 'instance_id': 'd11cfdba-71df-4d49-a60f-29397352c308', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f6504e42-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.44482278, 'message_signature': '3df39b058cf0c00095b6ee5abb959d0ff02aca41233a8a7657b5f0a0fc562770'}]}, 'timestamp': '2026-01-26 09:04:03.801673', '_unique_id': '7c7aada9b3684e3fad62287d513d2c37'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.803 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.804 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.828 12 DEBUG ceilometer.compute.pollsters [-] bcd4a434-a1cf-402b-87c3-8d39bc284a82/cpu volume: 40000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.856 12 DEBUG ceilometer.compute.pollsters [-] d11cfdba-71df-4d49-a60f-29397352c308/cpu volume: 10630000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d108dc3-b387-4a89-afe6-730d9e57cc16', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 40000000, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82', 'timestamp': '2026-01-26T09:04:03.804732', 'resource_metadata': {'display_name': 'tempest-server-test-1081491655', 'name': 'instance-0000002f', 'instance_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'f6549074-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.490818395, 'message_signature': '391941eee955990652e45c864be704387bc758d5030292fb946609861c5108f2'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10630000000, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'd11cfdba-71df-4d49-a60f-29397352c308', 
'timestamp': '2026-01-26T09:04:03.804732', 'resource_metadata': {'display_name': 'tempest-server-test-697225216', 'name': 'instance-00000030', 'instance_id': 'd11cfdba-71df-4d49-a60f-29397352c308', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'f658b74e-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.518173949, 'message_signature': 'e22cfec873f9c02d89474d7cb1622067bd6ac05e2c1b21bc2ef789c2daffa82a'}]}, 'timestamp': '2026-01-26 09:04:03.856810', '_unique_id': 'f7b13e90c673420f9a2d3060599bc057'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.858 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.859 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.859 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.860 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-1081491655>, <NovaLikeServer: tempest-server-test-697225216>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1081491655>, <NovaLikeServer: tempest-server-test-697225216>]
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.860 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.860 12 DEBUG ceilometer.compute.pollsters [-] bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk.device.allocation volume: 30412800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.861 12 DEBUG ceilometer.compute.pollsters [-] bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.861 12 DEBUG ceilometer.compute.pollsters [-] d11cfdba-71df-4d49-a60f-29397352c308/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.861 12 DEBUG ceilometer.compute.pollsters [-] d11cfdba-71df-4d49-a60f-29397352c308/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '63ba3da0-9eec-4829-8c95-599313c14d71', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30412800, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82-vda', 'timestamp': '2026-01-26T09:04:03.860533', 'resource_metadata': {'display_name': 'tempest-server-test-1081491655', 'name': 'instance-0000002f', 'instance_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f6595c12-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.426691074, 'message_signature': '098484951b9cbd75a098092878ec8eb123e38d5fd865da80d41472b09467e3bf'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 
'bcd4a434-a1cf-402b-87c3-8d39bc284a82-sda', 'timestamp': '2026-01-26T09:04:03.860533', 'resource_metadata': {'display_name': 'tempest-server-test-1081491655', 'name': 'instance-0000002f', 'instance_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f6596ec8-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.426691074, 'message_signature': '20645d54df705c15192f0f6ae4ae6369acc6355f208ad435c4db3244732d95ac'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'd11cfdba-71df-4d49-a60f-29397352c308-vda', 'timestamp': '2026-01-26T09:04:03.860533', 'resource_metadata': {'display_name': 'tempest-server-test-697225216', 'name': 'instance-00000030', 'instance_id': 'd11cfdba-71df-4d49-a60f-29397352c308', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': 
'13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f6597f30-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.44482278, 'message_signature': 'dcb93e398402af2fc74f1a9eb99ee8b1dde7720ee1365fdab9e1669ff1ab6b53'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'd11cfdba-71df-4d49-a60f-29397352c308-sda', 'timestamp': '2026-01-26T09:04:03.860533', 'resource_metadata': {'display_name': 'tempest-server-test-697225216', 'name': 'instance-00000030', 'instance_id': 'd11cfdba-71df-4d49-a60f-29397352c308', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f65990d8-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.44482278, 'message_signature': '954f156ef62e1f4a2ae70f852d87424ecb9bea33cda1e6475c976f6110e8dced'}]}, 'timestamp': '2026-01-26 09:04:03.862347', '_unique_id': '2c9f428f8585436aa74316616863457c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.863 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.864 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.864 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.865 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1081491655>, <NovaLikeServer: tempest-server-test-697225216>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1081491655>, <NovaLikeServer: tempest-server-test-697225216>]
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.865 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.865 12 DEBUG ceilometer.compute.pollsters [-] bcd4a434-a1cf-402b-87c3-8d39bc284a82/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.865 12 DEBUG ceilometer.compute.pollsters [-] d11cfdba-71df-4d49-a60f-29397352c308/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee55a2b3-5821-4778-9085-a0888532fdde', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-0000002f-bcd4a434-a1cf-402b-87c3-8d39bc284a82-tap896fca22-d4', 'timestamp': '2026-01-26T09:04:03.865450', 'resource_metadata': {'display_name': 'tempest-server-test-1081491655', 'name': 'tap896fca22-d4', 'instance_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e0:2a:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap896fca22-d4'}, 'message_id': 'f65a1c2e-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.414159324, 'message_signature': '7c8a8f7270c782bb7f2efb886afb6c325c629896802c29b7c788cb112514edfc'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-00000030-d11cfdba-71df-4d49-a60f-29397352c308-tap73d02031-13', 'timestamp': '2026-01-26T09:04:03.865450', 'resource_metadata': {'display_name': 'tempest-server-test-697225216', 'name': 'tap73d02031-13', 'instance_id': 'd11cfdba-71df-4d49-a60f-29397352c308', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b0:5c:01', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap73d02031-13'}, 'message_id': 'f65a30ba-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.418537596, 'message_signature': '9dbd5fd6c613cc6a4941c83b91af0a708e9ee62bff2800b07cf645a90d8f3e82'}]}, 'timestamp': '2026-01-26 09:04:03.866456', '_unique_id': 'f9a4d1ef7a8544feb4d807c740aae304'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.867 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.868 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.869 12 DEBUG ceilometer.compute.pollsters [-] bcd4a434-a1cf-402b-87c3-8d39bc284a82/network.outgoing.packets volume: 20 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.869 12 DEBUG ceilometer.compute.pollsters [-] d11cfdba-71df-4d49-a60f-29397352c308/network.outgoing.packets volume: 43 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a7966d7f-16b7-426b-809b-6d7081fdec09', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 20, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-0000002f-bcd4a434-a1cf-402b-87c3-8d39bc284a82-tap896fca22-d4', 'timestamp': '2026-01-26T09:04:03.869002', 'resource_metadata': {'display_name': 'tempest-server-test-1081491655', 'name': 'tap896fca22-d4', 'instance_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e0:2a:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap896fca22-d4'}, 'message_id': 'f65aa87e-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.414159324, 'message_signature': 'b89cb5d4dbacc46a659cfda5e4f5eafc078f3a9e4e2a321e8a5ea8aa3a7f9bff'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 43, 'user_id': 
'90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-00000030-d11cfdba-71df-4d49-a60f-29397352c308-tap73d02031-13', 'timestamp': '2026-01-26T09:04:03.869002', 'resource_metadata': {'display_name': 'tempest-server-test-697225216', 'name': 'tap73d02031-13', 'instance_id': 'd11cfdba-71df-4d49-a60f-29397352c308', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b0:5c:01', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap73d02031-13'}, 'message_id': 'f65aba8a-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.418537596, 'message_signature': '40a10e61b42d6185ef46faed8ba9ee12711fd07d7949be0ce01beb414acb580b'}]}, 'timestamp': '2026-01-26 09:04:03.869982', '_unique_id': 'd0bdc5ed93e54e6b824d4a3a1c3e80bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.871 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.872 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.918 12 DEBUG ceilometer.compute.pollsters [-] bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:03.919 12 DEBUG ceilometer.compute.pollsters [-] bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:03 compute-1 ovn_controller[95352]: 2026-01-26T09:04:03Z|00277|pinctrl|WARN|Dropped 319 log messages in last 59 seconds (most recently, 3 seconds ago) due to excessive rate
Jan 26 09:04:03 compute-1 ovn_controller[95352]: 2026-01-26T09:04:03Z|00278|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:04:03 compute-1 nova_compute[183083]: 2026-01-26 09:04:03.971 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.013 12 DEBUG ceilometer.compute.pollsters [-] d11cfdba-71df-4d49-a60f-29397352c308/disk.device.read.bytes volume: 29477376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.014 12 DEBUG ceilometer.compute.pollsters [-] d11cfdba-71df-4d49-a60f-29397352c308/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '35431a64-e9fc-41fe-9dbd-e882e67eac23', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82-vda', 'timestamp': '2026-01-26T09:04:03.872428', 'resource_metadata': {'display_name': 'tempest-server-test-1081491655', 'name': 'instance-0000002f', 'instance_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f6623756-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.534436873, 'message_signature': '74aa25e16d461e06b6c4ecd579feab0005076645b27a8a91040d52c390ec2eac'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 
'bcd4a434-a1cf-402b-87c3-8d39bc284a82-sda', 'timestamp': '2026-01-26T09:04:03.872428', 'resource_metadata': {'display_name': 'tempest-server-test-1081491655', 'name': 'instance-0000002f', 'instance_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f6624e76-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.534436873, 'message_signature': '0d671ae95e2c529ae7cccc978c5e05060549a31111c82d8e57a8a955c957557b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29477376, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'd11cfdba-71df-4d49-a60f-29397352c308-vda', 'timestamp': '2026-01-26T09:04:03.872428', 'resource_metadata': {'display_name': 'tempest-server-test-697225216', 'name': 'instance-00000030', 'instance_id': 'd11cfdba-71df-4d49-a60f-29397352c308', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': 
'13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f670b100-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.581641641, 'message_signature': 'd4a549fc8880aa7c2b9fa12eca7b21e2df1c03360915b4bcdb61d2b0dec54c06'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'd11cfdba-71df-4d49-a60f-29397352c308-sda', 'timestamp': '2026-01-26T09:04:03.872428', 'resource_metadata': {'display_name': 'tempest-server-test-697225216', 'name': 'instance-00000030', 'instance_id': 'd11cfdba-71df-4d49-a60f-29397352c308', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f670d054-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.581641641, 'message_signature': '46950c8437c04905dca03892b9f8e0cab3615d8a5ea594df662d5d84bb6d6b47'}]}, 'timestamp': '2026-01-26 09:04:04.014808', '_unique_id': 'b01b413b2bac45ed9e15d5f2e6ee2cd9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.016 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.017 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.018 12 DEBUG ceilometer.compute.pollsters [-] bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.018 12 DEBUG ceilometer.compute.pollsters [-] bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.019 12 DEBUG ceilometer.compute.pollsters [-] d11cfdba-71df-4d49-a60f-29397352c308/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.019 12 DEBUG ceilometer.compute.pollsters [-] d11cfdba-71df-4d49-a60f-29397352c308/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce8a6e37-5732-4040-97fd-86751b23b97e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82-vda', 'timestamp': '2026-01-26T09:04:04.018040', 'resource_metadata': {'display_name': 'tempest-server-test-1081491655', 'name': 'instance-0000002f', 'instance_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f6716ac8-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.426691074, 'message_signature': 'f9f5e6a329d47cba831adf72b456273ea9265813dc15575ae7925c98e5d3f6b8'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 
'bcd4a434-a1cf-402b-87c3-8d39bc284a82-sda', 'timestamp': '2026-01-26T09:04:04.018040', 'resource_metadata': {'display_name': 'tempest-server-test-1081491655', 'name': 'instance-0000002f', 'instance_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f67181fc-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.426691074, 'message_signature': '56b11c7770ef7cd9837f16d0ad83e3fdc4b515514163710846dc2537ba9daef6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'd11cfdba-71df-4d49-a60f-29397352c308-vda', 'timestamp': '2026-01-26T09:04:04.018040', 'resource_metadata': {'display_name': 'tempest-server-test-697225216', 'name': 'instance-00000030', 'instance_id': 'd11cfdba-71df-4d49-a60f-29397352c308', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': 
'13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f6719660-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.44482278, 'message_signature': '83fbb42bdf67b740e1aaa1aa85360045bdde476555cf0834c52bea06ec2d5016'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'd11cfdba-71df-4d49-a60f-29397352c308-sda', 'timestamp': '2026-01-26T09:04:04.018040', 'resource_metadata': {'display_name': 'tempest-server-test-697225216', 'name': 'instance-00000030', 'instance_id': 'd11cfdba-71df-4d49-a60f-29397352c308', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f671a9de-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.44482278, 'message_signature': '631a5883fbe2fa3f53eb697b8837873203daf146132f070b12264405c1629688'}]}, 'timestamp': '2026-01-26 09:04:04.020356', '_unique_id': 'e259220bcff14cdea0c1105d53867788'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.021 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.022 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.023 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.023 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-1081491655>, <NovaLikeServer: tempest-server-test-697225216>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1081491655>, <NovaLikeServer: tempest-server-test-697225216>]
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.023 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.023 12 DEBUG ceilometer.compute.pollsters [-] bcd4a434-a1cf-402b-87c3-8d39bc284a82/network.incoming.bytes volume: 616 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.024 12 DEBUG ceilometer.compute.pollsters [-] d11cfdba-71df-4d49-a60f-29397352c308/network.incoming.bytes volume: 5487 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '79347a46-a97f-4296-a065-4bafa008bc6c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 616, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-0000002f-bcd4a434-a1cf-402b-87c3-8d39bc284a82-tap896fca22-d4', 'timestamp': '2026-01-26T09:04:04.023855', 'resource_metadata': {'display_name': 'tempest-server-test-1081491655', 'name': 'tap896fca22-d4', 'instance_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e0:2a:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap896fca22-d4'}, 'message_id': 'f6724c22-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.414159324, 'message_signature': 'f172da63bffab03839794340d7ce7bbfa72eba4ee8f1fcfc3a027836d3437d88'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5487, 'user_id': 
'90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-00000030-d11cfdba-71df-4d49-a60f-29397352c308-tap73d02031-13', 'timestamp': '2026-01-26T09:04:04.023855', 'resource_metadata': {'display_name': 'tempest-server-test-697225216', 'name': 'tap73d02031-13', 'instance_id': 'd11cfdba-71df-4d49-a60f-29397352c308', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b0:5c:01', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap73d02031-13'}, 'message_id': 'f6725faa-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.418537596, 'message_signature': 'a433ad764a202c4f54ae6945962a1e2ee7cab8b9c9b56c0313bedb49d93ed69d'}]}, 'timestamp': '2026-01-26 09:04:04.024958', '_unique_id': '502f481a7f5c42018cbcb2464167e73b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.026 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.027 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.027 12 DEBUG ceilometer.compute.pollsters [-] bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk.device.write.latency volume: 10669752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.027 12 DEBUG ceilometer.compute.pollsters [-] bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.028 12 DEBUG ceilometer.compute.pollsters [-] d11cfdba-71df-4d49-a60f-29397352c308/disk.device.write.latency volume: 4995937130 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.028 12 DEBUG ceilometer.compute.pollsters [-] d11cfdba-71df-4d49-a60f-29397352c308/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3090f92b-db88-4def-a72b-a32463d2dc54', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10669752, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82-vda', 'timestamp': '2026-01-26T09:04:04.027512', 'resource_metadata': {'display_name': 'tempest-server-test-1081491655', 'name': 'instance-0000002f', 'instance_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f672d5de-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.534436873, 'message_signature': '854c636b13ee39ee34782a8f53117be83179b97386d45032b15e26743737d666'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 
'bcd4a434-a1cf-402b-87c3-8d39bc284a82-sda', 'timestamp': '2026-01-26T09:04:04.027512', 'resource_metadata': {'display_name': 'tempest-server-test-1081491655', 'name': 'instance-0000002f', 'instance_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f672e7e0-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.534436873, 'message_signature': 'f137dbcaa9500df094954612e51dde53549cf8096326ca54f9aef361edfc3517'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4995937130, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'd11cfdba-71df-4d49-a60f-29397352c308-vda', 'timestamp': '2026-01-26T09:04:04.027512', 'resource_metadata': {'display_name': 'tempest-server-test-697225216', 'name': 'instance-00000030', 'instance_id': 'd11cfdba-71df-4d49-a60f-29397352c308', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': 
'13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f672f7b2-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.581641641, 'message_signature': '6f84978b2b3da90882d0311bccec3a84c08e99a31e90932dc0f4cafc5908d0db'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'd11cfdba-71df-4d49-a60f-29397352c308-sda', 'timestamp': '2026-01-26T09:04:04.027512', 'resource_metadata': {'display_name': 'tempest-server-test-697225216', 'name': 'instance-00000030', 'instance_id': 'd11cfdba-71df-4d49-a60f-29397352c308', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f67306d0-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.581641641, 'message_signature': '4ea72afeba75c09f8a55883eeca3012fbe95c76bb1152b9b7e57f87746f4010e'}]}, 'timestamp': '2026-01-26 09:04:04.029226', '_unique_id': '40d7713553be4d36a2ca1ef356462bee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.030 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.031 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.031 12 DEBUG ceilometer.compute.pollsters [-] bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.032 12 DEBUG ceilometer.compute.pollsters [-] bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.032 12 DEBUG ceilometer.compute.pollsters [-] d11cfdba-71df-4d49-a60f-29397352c308/disk.device.read.latency volume: 181371229 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.033 12 DEBUG ceilometer.compute.pollsters [-] d11cfdba-71df-4d49-a60f-29397352c308/disk.device.read.latency volume: 20932262 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1ca1dc0-9fe3-4813-9bea-64231ed5c203', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82-vda', 'timestamp': '2026-01-26T09:04:04.031723', 'resource_metadata': {'display_name': 'tempest-server-test-1081491655', 'name': 'instance-0000002f', 'instance_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f6737a52-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.534436873, 'message_signature': 'ac43df199248e0addef512ab3ce6fcdab03ac35c6bd5cbaab00b6f7e4738d9b1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 
'bcd4a434-a1cf-402b-87c3-8d39bc284a82-sda', 'timestamp': '2026-01-26T09:04:04.031723', 'resource_metadata': {'display_name': 'tempest-server-test-1081491655', 'name': 'instance-0000002f', 'instance_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f6738c68-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.534436873, 'message_signature': '2e3a5e968cc3769a0b74a7af2b9e4eb6f5528434f07522f21f01dc5161057335'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 181371229, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'd11cfdba-71df-4d49-a60f-29397352c308-vda', 'timestamp': '2026-01-26T09:04:04.031723', 'resource_metadata': {'display_name': 'tempest-server-test-697225216', 'name': 'instance-00000030', 'instance_id': 'd11cfdba-71df-4d49-a60f-29397352c308', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': 
'13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f6739be0-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.581641641, 'message_signature': '05f1a46ce8d35dd45d2ec92e2fe50a0ae70ed5ff9bf96569cd47c8b412b91923'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20932262, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'd11cfdba-71df-4d49-a60f-29397352c308-sda', 'timestamp': '2026-01-26T09:04:04.031723', 'resource_metadata': {'display_name': 'tempest-server-test-697225216', 'name': 'instance-00000030', 'instance_id': 'd11cfdba-71df-4d49-a60f-29397352c308', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f673ac52-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.581641641, 'message_signature': '8bba7d38f772fc914a9749ed18386e583aaa2285883278b6930c58bf57648d61'}]}, 'timestamp': '2026-01-26 09:04:04.033432', '_unique_id': '17117e13b82f4cdbbe2da29620f06e60'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.034 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.035 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.035 12 DEBUG ceilometer.compute.pollsters [-] bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.036 12 DEBUG ceilometer.compute.pollsters [-] bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.036 12 DEBUG ceilometer.compute.pollsters [-] d11cfdba-71df-4d49-a60f-29397352c308/disk.device.read.requests volume: 1070 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.037 12 DEBUG ceilometer.compute.pollsters [-] d11cfdba-71df-4d49-a60f-29397352c308/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b1cd5dd-1193-4041-abd4-cbfcb7dc900e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82-vda', 'timestamp': '2026-01-26T09:04:04.035895', 'resource_metadata': {'display_name': 'tempest-server-test-1081491655', 'name': 'instance-0000002f', 'instance_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f6741e76-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.534436873, 'message_signature': 'b0bd4f05477a6b2b6477113d78ad29a29218c3c857490b32261758ef9cea43d2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 
'bcd4a434-a1cf-402b-87c3-8d39bc284a82-sda', 'timestamp': '2026-01-26T09:04:04.035895', 'resource_metadata': {'display_name': 'tempest-server-test-1081491655', 'name': 'instance-0000002f', 'instance_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f6742efc-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.534436873, 'message_signature': 'be594db9cebc2284947d1943b40048f26acacf1da0bdc07f898fa31fcaa2a181'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1070, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'd11cfdba-71df-4d49-a60f-29397352c308-vda', 'timestamp': '2026-01-26T09:04:04.035895', 'resource_metadata': {'display_name': 'tempest-server-test-697225216', 'name': 'instance-00000030', 'instance_id': 'd11cfdba-71df-4d49-a60f-29397352c308', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': 
'13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f6743e6a-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.581641641, 'message_signature': '44f74d0d9b78caf6be92d83992efec380753c42bb47554ef0f8e63328511b431'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'd11cfdba-71df-4d49-a60f-29397352c308-sda', 'timestamp': '2026-01-26T09:04:04.035895', 'resource_metadata': {'display_name': 'tempest-server-test-697225216', 'name': 'instance-00000030', 'instance_id': 'd11cfdba-71df-4d49-a60f-29397352c308', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f6745288-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.581641641, 'message_signature': '93619baec5b9cb4efc7a9c68cf77ff53a63c4fe0d4d8fc3232f9c83a7139bc8a'}]}, 'timestamp': '2026-01-26 09:04:04.037700', '_unique_id': 'daa5179bffa4442eb196b242ca617c38'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.038 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.040 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.040 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.040 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1081491655>, <NovaLikeServer: tempest-server-test-697225216>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1081491655>, <NovaLikeServer: tempest-server-test-697225216>]
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.040 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.040 12 DEBUG ceilometer.compute.pollsters [-] bcd4a434-a1cf-402b-87c3-8d39bc284a82/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.041 12 DEBUG ceilometer.compute.pollsters [-] d11cfdba-71df-4d49-a60f-29397352c308/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7140cc9b-7cc4-4f25-a1ca-084b7877cad2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-0000002f-bcd4a434-a1cf-402b-87c3-8d39bc284a82-tap896fca22-d4', 'timestamp': '2026-01-26T09:04:04.040949', 'resource_metadata': {'display_name': 'tempest-server-test-1081491655', 'name': 'tap896fca22-d4', 'instance_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e0:2a:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap896fca22-d4'}, 'message_id': 'f674e43c-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.414159324, 'message_signature': '3e0f14debeb552ceb8b2530794b7f8e21f6844784c97929d1e7e3a3318c13706'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-00000030-d11cfdba-71df-4d49-a60f-29397352c308-tap73d02031-13', 'timestamp': '2026-01-26T09:04:04.040949', 'resource_metadata': {'display_name': 'tempest-server-test-697225216', 'name': 'tap73d02031-13', 'instance_id': 'd11cfdba-71df-4d49-a60f-29397352c308', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b0:5c:01', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap73d02031-13'}, 'message_id': 'f674f5c6-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.418537596, 'message_signature': '7137a189dde7ed660417ed77d9f3eefd4541c53d2b7eab69a5008b94a7867008'}]}, 'timestamp': '2026-01-26 09:04:04.041888', '_unique_id': '962e8835f8a14b038c3f19ea9d04afe5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.042 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.044 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.044 12 DEBUG ceilometer.compute.pollsters [-] bcd4a434-a1cf-402b-87c3-8d39bc284a82/memory.usage volume: 42.76171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.044 12 DEBUG ceilometer.compute.pollsters [-] d11cfdba-71df-4d49-a60f-29397352c308/memory.usage volume: 42.6640625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'be70ab11-782a-48ac-af42-7d300c90dcdc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.76171875, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82', 'timestamp': '2026-01-26T09:04:04.044260', 'resource_metadata': {'display_name': 'tempest-server-test-1081491655', 'name': 'instance-0000002f', 'instance_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'f67563ee-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.490818395, 'message_signature': '654f5f610634530d73134ae094dc08b4202894a3e483cafdfe1c56b5a598e2ee'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.6640625, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'd11cfdba-71df-4d49-a60f-29397352c308', 'timestamp': '2026-01-26T09:04:04.044260', 'resource_metadata': {'display_name': 'tempest-server-test-697225216', 'name': 'instance-00000030', 'instance_id': 'd11cfdba-71df-4d49-a60f-29397352c308', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'f6757438-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.518173949, 'message_signature': 'd93faf19050b6c318ba7e0ce1058f6234339614e0a0d1869419f99a15b7c32ba'}]}, 'timestamp': '2026-01-26 09:04:04.045152', '_unique_id': 'f43e4c1357fb40cab56c4c501bd71ded'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.046 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.047 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.047 12 DEBUG ceilometer.compute.pollsters [-] bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk.device.write.requests volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.047 12 DEBUG ceilometer.compute.pollsters [-] bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.048 12 DEBUG ceilometer.compute.pollsters [-] d11cfdba-71df-4d49-a60f-29397352c308/disk.device.write.requests volume: 321 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.048 12 DEBUG ceilometer.compute.pollsters [-] d11cfdba-71df-4d49-a60f-29397352c308/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '80c10217-e3e4-4f76-83ff-a645b3bdbab5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 14, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82-vda', 'timestamp': '2026-01-26T09:04:04.047490', 'resource_metadata': {'display_name': 'tempest-server-test-1081491655', 'name': 'instance-0000002f', 'instance_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f675e1f2-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.534436873, 'message_signature': '0ccf03be19a31280e440f51a21d2bb6e688383d61caba87517b397993e253db4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 
'bcd4a434-a1cf-402b-87c3-8d39bc284a82-sda', 'timestamp': '2026-01-26T09:04:04.047490', 'resource_metadata': {'display_name': 'tempest-server-test-1081491655', 'name': 'instance-0000002f', 'instance_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f675f354-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.534436873, 'message_signature': '4af1b452c9fbfbcdc015a8214f0c122e12d9d3ab820bb9e041b40acdc5fd21f6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 321, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'd11cfdba-71df-4d49-a60f-29397352c308-vda', 'timestamp': '2026-01-26T09:04:04.047490', 'resource_metadata': {'display_name': 'tempest-server-test-697225216', 'name': 'instance-00000030', 'instance_id': 'd11cfdba-71df-4d49-a60f-29397352c308', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': 
'13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f6760312-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.581641641, 'message_signature': '68b987c3062ce483ffb1e6e27328134e157dc378efb15bdaa1b62bd8c47c3f2c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'd11cfdba-71df-4d49-a60f-29397352c308-sda', 'timestamp': '2026-01-26T09:04:04.047490', 'resource_metadata': {'display_name': 'tempest-server-test-697225216', 'name': 'instance-00000030', 'instance_id': 'd11cfdba-71df-4d49-a60f-29397352c308', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f676121c-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.581641641, 'message_signature': 'bb8c3b882bb145b41994d92cea1059dc85d50151724c34dffc3302cdab586e50'}]}, 'timestamp': '2026-01-26 09:04:04.049212', '_unique_id': 'b288fbe499c74ca5838c29cd83af60ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.050 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.051 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.051 12 DEBUG ceilometer.compute.pollsters [-] bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk.device.write.bytes volume: 106496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.052 12 DEBUG ceilometer.compute.pollsters [-] bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.052 12 DEBUG ceilometer.compute.pollsters [-] d11cfdba-71df-4d49-a60f-29397352c308/disk.device.write.bytes volume: 72978432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.052 12 DEBUG ceilometer.compute.pollsters [-] d11cfdba-71df-4d49-a60f-29397352c308/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '99d4a161-21b3-4d2c-b304-6ad0b970ac6f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 106496, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82-vda', 'timestamp': '2026-01-26T09:04:04.051626', 'resource_metadata': {'display_name': 'tempest-server-test-1081491655', 'name': 'instance-0000002f', 'instance_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f676838c-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.534436873, 'message_signature': 'd4d5c94e0ae09279108c620414418d45523540915e53810d1cf7449626caa367'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 
'bcd4a434-a1cf-402b-87c3-8d39bc284a82-sda', 'timestamp': '2026-01-26T09:04:04.051626', 'resource_metadata': {'display_name': 'tempest-server-test-1081491655', 'name': 'instance-0000002f', 'instance_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f6769548-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.534436873, 'message_signature': '79d47a8a66bc96ce9c9aeb05659bca56590a89221a478ed0751a644a2da627ff'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72978432, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'd11cfdba-71df-4d49-a60f-29397352c308-vda', 'timestamp': '2026-01-26T09:04:04.051626', 'resource_metadata': {'display_name': 'tempest-server-test-697225216', 'name': 'instance-00000030', 'instance_id': 'd11cfdba-71df-4d49-a60f-29397352c308', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': 
'13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f676a4d4-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.581641641, 'message_signature': 'd683e326c773fcaa96882a30f8c0e7ff014fc68fabd818b1cdc45100d830315e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'd11cfdba-71df-4d49-a60f-29397352c308-sda', 'timestamp': '2026-01-26T09:04:04.051626', 'resource_metadata': {'display_name': 'tempest-server-test-697225216', 'name': 'instance-00000030', 'instance_id': 'd11cfdba-71df-4d49-a60f-29397352c308', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f676b528-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.581641641, 'message_signature': '8f5df04204b03f65b0fe23513d83d2b90bdbae052a05c3056d3ed479bf814ac2'}]}, 'timestamp': '2026-01-26 09:04:04.053322', '_unique_id': 'e3dee14e53824b1c801fe12d1781c796'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.054 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.055 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.055 12 DEBUG ceilometer.compute.pollsters [-] bcd4a434-a1cf-402b-87c3-8d39bc284a82/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.056 12 DEBUG ceilometer.compute.pollsters [-] d11cfdba-71df-4d49-a60f-29397352c308/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4fb3ea10-3a1c-4f47-ab96-5e23e9ac9da4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-0000002f-bcd4a434-a1cf-402b-87c3-8d39bc284a82-tap896fca22-d4', 'timestamp': '2026-01-26T09:04:04.055708', 'resource_metadata': {'display_name': 'tempest-server-test-1081491655', 'name': 'tap896fca22-d4', 'instance_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e0:2a:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap896fca22-d4'}, 'message_id': 'f6772346-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.414159324, 'message_signature': 'efa668996d083dee497f0e8801c98cfe991ea2c2e9136fe72606a0db69d83ebb'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-00000030-d11cfdba-71df-4d49-a60f-29397352c308-tap73d02031-13', 'timestamp': '2026-01-26T09:04:04.055708', 'resource_metadata': {'display_name': 'tempest-server-test-697225216', 'name': 'tap73d02031-13', 'instance_id': 'd11cfdba-71df-4d49-a60f-29397352c308', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b0:5c:01', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap73d02031-13'}, 'message_id': 'f677382c-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.418537596, 'message_signature': '6049fc7d75b0e2478e5bae36064515be0e249342cd8a9d239fa3e526f174b5dd'}]}, 'timestamp': '2026-01-26 09:04:04.056697', '_unique_id': 'facbcb7c66ad491bb0c1b99cf987e594'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.057 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.059 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.059 12 DEBUG ceilometer.compute.pollsters [-] bcd4a434-a1cf-402b-87c3-8d39bc284a82/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.059 12 DEBUG ceilometer.compute.pollsters [-] d11cfdba-71df-4d49-a60f-29397352c308/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '27e95094-fd01-476e-a9f4-21b11139f28b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-0000002f-bcd4a434-a1cf-402b-87c3-8d39bc284a82-tap896fca22-d4', 'timestamp': '2026-01-26T09:04:04.059166', 'resource_metadata': {'display_name': 'tempest-server-test-1081491655', 'name': 'tap896fca22-d4', 'instance_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e0:2a:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap896fca22-d4'}, 'message_id': 'f677a776-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.414159324, 'message_signature': '30f7b08a43bd5840d4d7d6b84d053ee79e95d067de36ba1d8db65cdc37c6b02d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-00000030-d11cfdba-71df-4d49-a60f-29397352c308-tap73d02031-13', 'timestamp': '2026-01-26T09:04:04.059166', 'resource_metadata': {'display_name': 'tempest-server-test-697225216', 'name': 'tap73d02031-13', 'instance_id': 'd11cfdba-71df-4d49-a60f-29397352c308', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b0:5c:01', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap73d02031-13'}, 'message_id': 'f677b2f2-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.418537596, 'message_signature': 'f0d0df2b89e021ad75b2a29971979da360577d1b2640b3947fda0c7a989cc955'}]}, 'timestamp': '2026-01-26 09:04:04.059764', '_unique_id': '059578661bac422cb889311f75e931f8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.060 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.061 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.061 12 DEBUG ceilometer.compute.pollsters [-] bcd4a434-a1cf-402b-87c3-8d39bc284a82/network.incoming.packets volume: 8 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.061 12 DEBUG ceilometer.compute.pollsters [-] d11cfdba-71df-4d49-a60f-29397352c308/network.incoming.packets volume: 44 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5ba1d587-8e2f-4a86-a969-8c8f3a6ab3bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 8, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-0000002f-bcd4a434-a1cf-402b-87c3-8d39bc284a82-tap896fca22-d4', 'timestamp': '2026-01-26T09:04:04.061362', 'resource_metadata': {'display_name': 'tempest-server-test-1081491655', 'name': 'tap896fca22-d4', 'instance_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e0:2a:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap896fca22-d4'}, 'message_id': 'f677fd3e-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.414159324, 'message_signature': 'ecccdc642b3edd1224c084de4353ebf63a9144d9299f69d224a8e5e7fd9677bf'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 44, 'user_id': 
'90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-00000030-d11cfdba-71df-4d49-a60f-29397352c308-tap73d02031-13', 'timestamp': '2026-01-26T09:04:04.061362', 'resource_metadata': {'display_name': 'tempest-server-test-697225216', 'name': 'tap73d02031-13', 'instance_id': 'd11cfdba-71df-4d49-a60f-29397352c308', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b0:5c:01', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap73d02031-13'}, 'message_id': 'f67808e2-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.418537596, 'message_signature': 'a98bff29038393af59392607bf2fe3044ab190b8f4588debd334197e280d2f4b'}]}, 'timestamp': '2026-01-26 09:04:04.061966', '_unique_id': '7f95c36e33db4a5ab0cfc4c7b1d7b740'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.062 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.063 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.063 12 DEBUG ceilometer.compute.pollsters [-] bcd4a434-a1cf-402b-87c3-8d39bc284a82/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.063 12 DEBUG ceilometer.compute.pollsters [-] d11cfdba-71df-4d49-a60f-29397352c308/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6708d35-2682-42b5-b3ae-68de5c3c6036', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-0000002f-bcd4a434-a1cf-402b-87c3-8d39bc284a82-tap896fca22-d4', 'timestamp': '2026-01-26T09:04:04.063605', 'resource_metadata': {'display_name': 'tempest-server-test-1081491655', 'name': 'tap896fca22-d4', 'instance_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e0:2a:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap896fca22-d4'}, 'message_id': 'f67854e6-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.414159324, 'message_signature': '880d6523ed22c78f85c8fc31aee117959bd650cb451971f9b92c88a1d99d7059'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-00000030-d11cfdba-71df-4d49-a60f-29397352c308-tap73d02031-13', 'timestamp': '2026-01-26T09:04:04.063605', 'resource_metadata': {'display_name': 'tempest-server-test-697225216', 'name': 'tap73d02031-13', 'instance_id': 'd11cfdba-71df-4d49-a60f-29397352c308', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b0:5c:01', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap73d02031-13'}, 'message_id': 'f678604e-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.418537596, 'message_signature': '92b8e5889f99dc44a1a4df0be94d2c2e1d923b403897283f517b27809575a0f9'}]}, 'timestamp': '2026-01-26 09:04:04.064231', '_unique_id': 'fedd62d51b4d4040ad1e21053b489c71'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:04:04 compute-1 nova_compute[183083]: 2026-01-26 09:04:04.070 183087 DEBUG nova.network.neutron [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Updating instance_info_cache with network_info: [{"id": "896fca22-d4bb-4060-89c0-72ac1b8f6dd7", "address": "fa:16:3e:e0:2a:17", "network": {"id": "a7990128-24d3-4373-9624-cea49e6db86a", "bridge": "br-int", "label": "tempest-test-network--1014603776", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap896fca22-d4", "ovs_interfaceid": "896fca22-d4bb-4060-89c0-72ac1b8f6dd7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.064 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.065 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.065 12 DEBUG ceilometer.compute.pollsters [-] bcd4a434-a1cf-402b-87c3-8d39bc284a82/network.outgoing.bytes volume: 1390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.066 12 DEBUG ceilometer.compute.pollsters [-] d11cfdba-71df-4d49-a60f-29397352c308/network.outgoing.bytes volume: 5332 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c901576a-0137-459c-be6d-085142926807', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1390, 'user_id': '90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-0000002f-bcd4a434-a1cf-402b-87c3-8d39bc284a82-tap896fca22-d4', 'timestamp': '2026-01-26T09:04:04.065743', 'resource_metadata': {'display_name': 'tempest-server-test-1081491655', 'name': 'tap896fca22-d4', 'instance_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e0:2a:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap896fca22-d4'}, 'message_id': 'f678a842-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.414159324, 'message_signature': '116b722eec8699d39ac96c4ced9840bf0870b5758c48b319c914865d7704200b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5332, 'user_id': 
'90104736f4ab4d81b09d1ff11e40f454', 'user_name': None, 'project_id': '2580bb16c90849c4b5919eb271774a06', 'project_name': None, 'resource_id': 'instance-00000030-d11cfdba-71df-4d49-a60f-29397352c308-tap73d02031-13', 'timestamp': '2026-01-26T09:04:04.065743', 'resource_metadata': {'display_name': 'tempest-server-test-697225216', 'name': 'tap73d02031-13', 'instance_id': 'd11cfdba-71df-4d49-a60f-29397352c308', 'instance_type': 'm1.nano', 'host': '3addb614330e0d0a1f784cc610ad41c7742ffd6fed8e7c9b8119c625', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'a7017323-cd35-470d-a718-faa6c6e97277', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}, 'image_ref': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b0:5c:01', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap73d02031-13'}, 'message_id': 'f678b724-fa95-11f0-b28a-fa163efc69df', 'monotonic_time': 4584.418537596, 'message_signature': 'e1340d2eedaa85ade2a66272d914b8fca18e3a19a6b424f7cc24e13be38d5761'}]}, 'timestamp': '2026-01-26 09:04:04.066437', '_unique_id': '8fb580ba85b64334a7655dbeb1570d40'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging     yield
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 26 09:04:04 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:04:04.067 12 ERROR oslo_messaging.notify.messaging 
Jan 26 09:04:04 compute-1 nova_compute[183083]: 2026-01-26 09:04:04.091 183087 DEBUG oslo_concurrency.lockutils [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Releasing lock "refresh_cache-bcd4a434-a1cf-402b-87c3-8d39bc284a82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:04:04 compute-1 nova_compute[183083]: 2026-01-26 09:04:04.114 183087 DEBUG oslo_concurrency.lockutils [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:04:04 compute-1 nova_compute[183083]: 2026-01-26 09:04:04.115 183087 DEBUG oslo_concurrency.lockutils [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:04:04 compute-1 nova_compute[183083]: 2026-01-26 09:04:04.115 183087 DEBUG oslo_concurrency.lockutils [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:04:04 compute-1 nova_compute[183083]: 2026-01-26 09:04:04.121 183087 INFO nova.virt.libvirt.driver [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Jan 26 09:04:04 compute-1 virtqemud[182752]: Domain id=17 name='instance-0000002f' uuid=bcd4a434-a1cf-402b-87c3-8d39bc284a82 is tainted: custom-monitor
Jan 26 09:04:04 compute-1 podman[222207]: 2026-01-26 09:04:04.840541898 +0000 UTC m=+0.091417404 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 26 09:04:05 compute-1 nova_compute[183083]: 2026-01-26 09:04:05.129 183087 INFO nova.virt.libvirt.driver [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Jan 26 09:04:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:04:05.318 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:04:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:04:05.318 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:04:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:04:05.319 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:04:06 compute-1 nova_compute[183083]: 2026-01-26 09:04:06.137 183087 INFO nova.virt.libvirt.driver [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Jan 26 09:04:06 compute-1 nova_compute[183083]: 2026-01-26 09:04:06.144 183087 DEBUG nova.compute.manager [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:04:06 compute-1 nova_compute[183083]: 2026-01-26 09:04:06.169 183087 DEBUG nova.objects.instance [None req-944793fe-c1d3-42d3-b267-3502b39c0285 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 26 09:04:07 compute-1 nova_compute[183083]: 2026-01-26 09:04:07.760 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:04:09 compute-1 nova_compute[183083]: 2026-01-26 09:04:09.013 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:04:10 compute-1 nova_compute[183083]: 2026-01-26 09:04:10.023 183087 DEBUG nova.virt.libvirt.driver [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Check if temp file /var/lib/nova/instances/tmprov6ne4d exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Jan 26 09:04:10 compute-1 nova_compute[183083]: 2026-01-26 09:04:10.024 183087 DEBUG nova.compute.manager [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=115712,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmprov6ne4d',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='d11cfdba-71df-4d49-a60f-29397352c308',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Jan 26 09:04:11 compute-1 nova_compute[183083]: 2026-01-26 09:04:11.343 183087 DEBUG oslo_concurrency.processutils [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d11cfdba-71df-4d49-a60f-29397352c308/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:04:11 compute-1 nova_compute[183083]: 2026-01-26 09:04:11.434 183087 DEBUG oslo_concurrency.processutils [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d11cfdba-71df-4d49-a60f-29397352c308/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:04:11 compute-1 nova_compute[183083]: 2026-01-26 09:04:11.435 183087 DEBUG oslo_concurrency.processutils [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d11cfdba-71df-4d49-a60f-29397352c308/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:04:11 compute-1 nova_compute[183083]: 2026-01-26 09:04:11.497 183087 DEBUG oslo_concurrency.processutils [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d11cfdba-71df-4d49-a60f-29397352c308/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:04:12 compute-1 nova_compute[183083]: 2026-01-26 09:04:12.784 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:04:13 compute-1 nova_compute[183083]: 2026-01-26 09:04:13.808 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:04:13 compute-1 nova_compute[183083]: 2026-01-26 09:04:13.808 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:04:13 compute-1 nova_compute[183083]: 2026-01-26 09:04:13.809 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:04:13 compute-1 sshd-session[222237]: Accepted publickey for nova from 192.168.122.100 port 36960 ssh2: ECDSA SHA256:Yk+gMnjvz6H0Gj+fLzXKoFN6HWaghuhSmd0pdmuIFmU
Jan 26 09:04:13 compute-1 systemd-logind[788]: New session 71 of user nova.
Jan 26 09:04:13 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Jan 26 09:04:13 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 26 09:04:13 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 26 09:04:13 compute-1 systemd[1]: Starting User Manager for UID 42436...
Jan 26 09:04:13 compute-1 systemd[222241]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 26 09:04:14 compute-1 nova_compute[183083]: 2026-01-26 09:04:14.017 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:04:14 compute-1 systemd[222241]: Queued start job for default target Main User Target.
Jan 26 09:04:14 compute-1 systemd[222241]: Created slice User Application Slice.
Jan 26 09:04:14 compute-1 systemd[222241]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 26 09:04:14 compute-1 systemd[222241]: Started Daily Cleanup of User's Temporary Directories.
Jan 26 09:04:14 compute-1 systemd[222241]: Reached target Paths.
Jan 26 09:04:14 compute-1 systemd[222241]: Reached target Timers.
Jan 26 09:04:14 compute-1 systemd[222241]: Starting D-Bus User Message Bus Socket...
Jan 26 09:04:14 compute-1 systemd[222241]: Starting Create User's Volatile Files and Directories...
Jan 26 09:04:14 compute-1 systemd[222241]: Listening on D-Bus User Message Bus Socket.
Jan 26 09:04:14 compute-1 systemd[222241]: Reached target Sockets.
Jan 26 09:04:14 compute-1 systemd[222241]: Finished Create User's Volatile Files and Directories.
Jan 26 09:04:14 compute-1 systemd[222241]: Reached target Basic System.
Jan 26 09:04:14 compute-1 systemd[222241]: Reached target Main User Target.
Jan 26 09:04:14 compute-1 systemd[222241]: Startup finished in 142ms.
Jan 26 09:04:14 compute-1 systemd[1]: Started User Manager for UID 42436.
Jan 26 09:04:14 compute-1 systemd[1]: Started Session 71 of User nova.
Jan 26 09:04:14 compute-1 sshd-session[222237]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 26 09:04:14 compute-1 sshd-session[222256]: Received disconnect from 192.168.122.100 port 36960:11: disconnected by user
Jan 26 09:04:14 compute-1 sshd-session[222256]: Disconnected from user nova 192.168.122.100 port 36960
Jan 26 09:04:14 compute-1 sshd-session[222237]: pam_unix(sshd:session): session closed for user nova
Jan 26 09:04:14 compute-1 systemd[1]: session-71.scope: Deactivated successfully.
Jan 26 09:04:14 compute-1 systemd-logind[788]: Session 71 logged out. Waiting for processes to exit.
Jan 26 09:04:14 compute-1 systemd-logind[788]: Removed session 71.
Jan 26 09:04:14 compute-1 nova_compute[183083]: 2026-01-26 09:04:14.966 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "refresh_cache-bcd4a434-a1cf-402b-87c3-8d39bc284a82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:04:14 compute-1 nova_compute[183083]: 2026-01-26 09:04:14.967 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquired lock "refresh_cache-bcd4a434-a1cf-402b-87c3-8d39bc284a82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:04:14 compute-1 nova_compute[183083]: 2026-01-26 09:04:14.968 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 26 09:04:14 compute-1 nova_compute[183083]: 2026-01-26 09:04:14.968 183087 DEBUG nova.objects.instance [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lazy-loading 'info_cache' on Instance uuid bcd4a434-a1cf-402b-87c3-8d39bc284a82 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 09:04:16 compute-1 nova_compute[183083]: 2026-01-26 09:04:16.233 183087 DEBUG nova.compute.manager [req-327c0d81-a06e-4637-9dcd-15f7c7517331 req-77f40bad-4a98-4dea-87a2-3f37e701f513 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Received event network-vif-unplugged-73d02031-13f7-436d-bec5-c981a5c6c99b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:04:16 compute-1 nova_compute[183083]: 2026-01-26 09:04:16.234 183087 DEBUG oslo_concurrency.lockutils [req-327c0d81-a06e-4637-9dcd-15f7c7517331 req-77f40bad-4a98-4dea-87a2-3f37e701f513 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "d11cfdba-71df-4d49-a60f-29397352c308-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:04:16 compute-1 nova_compute[183083]: 2026-01-26 09:04:16.234 183087 DEBUG oslo_concurrency.lockutils [req-327c0d81-a06e-4637-9dcd-15f7c7517331 req-77f40bad-4a98-4dea-87a2-3f37e701f513 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "d11cfdba-71df-4d49-a60f-29397352c308-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:04:16 compute-1 nova_compute[183083]: 2026-01-26 09:04:16.235 183087 DEBUG oslo_concurrency.lockutils [req-327c0d81-a06e-4637-9dcd-15f7c7517331 req-77f40bad-4a98-4dea-87a2-3f37e701f513 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "d11cfdba-71df-4d49-a60f-29397352c308-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:04:16 compute-1 nova_compute[183083]: 2026-01-26 09:04:16.235 183087 DEBUG nova.compute.manager [req-327c0d81-a06e-4637-9dcd-15f7c7517331 req-77f40bad-4a98-4dea-87a2-3f37e701f513 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] No waiting events found dispatching network-vif-unplugged-73d02031-13f7-436d-bec5-c981a5c6c99b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 09:04:16 compute-1 nova_compute[183083]: 2026-01-26 09:04:16.235 183087 DEBUG nova.compute.manager [req-327c0d81-a06e-4637-9dcd-15f7c7517331 req-77f40bad-4a98-4dea-87a2-3f37e701f513 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Received event network-vif-unplugged-73d02031-13f7-436d-bec5-c981a5c6c99b for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 26 09:04:17 compute-1 nova_compute[183083]: 2026-01-26 09:04:17.789 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:04:18 compute-1 nova_compute[183083]: 2026-01-26 09:04:18.150 183087 INFO nova.compute.manager [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Took 6.65 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.
Jan 26 09:04:18 compute-1 nova_compute[183083]: 2026-01-26 09:04:18.151 183087 DEBUG nova.compute.manager [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 26 09:04:18 compute-1 nova_compute[183083]: 2026-01-26 09:04:18.184 183087 DEBUG nova.compute.manager [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=115712,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmprov6ne4d',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='d11cfdba-71df-4d49-a60f-29397352c308',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(1ff6e901-c30d-47f8-a58d-94967b04bf1e),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Jan 26 09:04:18 compute-1 nova_compute[183083]: 2026-01-26 09:04:18.211 183087 DEBUG nova.objects.instance [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lazy-loading 'migration_context' on Instance uuid d11cfdba-71df-4d49-a60f-29397352c308 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 09:04:18 compute-1 nova_compute[183083]: 2026-01-26 09:04:18.213 183087 DEBUG nova.virt.libvirt.driver [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Jan 26 09:04:18 compute-1 nova_compute[183083]: 2026-01-26 09:04:18.215 183087 DEBUG nova.virt.libvirt.driver [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Jan 26 09:04:18 compute-1 nova_compute[183083]: 2026-01-26 09:04:18.216 183087 DEBUG nova.virt.libvirt.driver [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Jan 26 09:04:18 compute-1 nova_compute[183083]: 2026-01-26 09:04:18.233 183087 DEBUG nova.virt.libvirt.vif [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T09:02:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-server-test-697225216',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-697225216',id=48,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCVUzFf6wSdyelY/UeY7cqTv8O0no1sfyU2c1QC2Iq0A42faUChCXJr9C1AIPerWETKRC47CWhTspqdq2Y/jF5MXoBNhFYSHEELzhDImF6CTLJqQZ+4txACe/VixshXUeg==',key_name='tempest-keypair-47432938',keypairs=<?>,launch_index=0,launched_at=2026-01-26T09:03:04Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2580bb16c90849c4b5919eb271774a06',ramdisk_id='',reservation_id='r-sbe72h54',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-OvnDvrTest-691788706',owner_user_name='tempest-OvnDvrTest-691788706-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T09:03:04Z,user_data=None,user_id='90104736f4ab4d81b09d1ff11e40f454',uuid=d11cfdba-71df-4d49-a60f-29397352c308,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "73d02031-13f7-436d-bec5-c981a5c6c99b", "address": "fa:16:3e:b0:5c:01", "network": {"id": "a7990128-24d3-4373-9624-cea49e6db86a", "bridge": "br-int", "label": "tempest-test-network--1014603776", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap73d02031-13", "ovs_interfaceid": "73d02031-13f7-436d-bec5-c981a5c6c99b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 26 09:04:18 compute-1 nova_compute[183083]: 2026-01-26 09:04:18.234 183087 DEBUG nova.network.os_vif_util [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Converting VIF {"id": "73d02031-13f7-436d-bec5-c981a5c6c99b", "address": "fa:16:3e:b0:5c:01", "network": {"id": "a7990128-24d3-4373-9624-cea49e6db86a", "bridge": "br-int", "label": "tempest-test-network--1014603776", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap73d02031-13", "ovs_interfaceid": "73d02031-13f7-436d-bec5-c981a5c6c99b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 09:04:18 compute-1 nova_compute[183083]: 2026-01-26 09:04:18.235 183087 DEBUG nova.network.os_vif_util [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b0:5c:01,bridge_name='br-int',has_traffic_filtering=True,id=73d02031-13f7-436d-bec5-c981a5c6c99b,network=Network(a7990128-24d3-4373-9624-cea49e6db86a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73d02031-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 09:04:18 compute-1 nova_compute[183083]: 2026-01-26 09:04:18.236 183087 DEBUG nova.virt.libvirt.migration [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Updating guest XML with vif config: <interface type="ethernet">
Jan 26 09:04:18 compute-1 nova_compute[183083]:   <mac address="fa:16:3e:b0:5c:01"/>
Jan 26 09:04:18 compute-1 nova_compute[183083]:   <model type="virtio"/>
Jan 26 09:04:18 compute-1 nova_compute[183083]:   <driver name="vhost" rx_queue_size="512"/>
Jan 26 09:04:18 compute-1 nova_compute[183083]:   <mtu size="1342"/>
Jan 26 09:04:18 compute-1 nova_compute[183083]:   <target dev="tap73d02031-13"/>
Jan 26 09:04:18 compute-1 nova_compute[183083]: </interface>
Jan 26 09:04:18 compute-1 nova_compute[183083]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Jan 26 09:04:18 compute-1 nova_compute[183083]: 2026-01-26 09:04:18.237 183087 DEBUG nova.virt.libvirt.driver [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Jan 26 09:04:18 compute-1 nova_compute[183083]: 2026-01-26 09:04:18.302 183087 DEBUG nova.compute.manager [req-ffa06936-900b-4245-8238-a58269432da1 req-e01a9341-a69f-4e63-9556-0e9392c6ddfb 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Received event network-vif-plugged-73d02031-13f7-436d-bec5-c981a5c6c99b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:04:18 compute-1 nova_compute[183083]: 2026-01-26 09:04:18.303 183087 DEBUG oslo_concurrency.lockutils [req-ffa06936-900b-4245-8238-a58269432da1 req-e01a9341-a69f-4e63-9556-0e9392c6ddfb 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "d11cfdba-71df-4d49-a60f-29397352c308-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:04:18 compute-1 nova_compute[183083]: 2026-01-26 09:04:18.304 183087 DEBUG oslo_concurrency.lockutils [req-ffa06936-900b-4245-8238-a58269432da1 req-e01a9341-a69f-4e63-9556-0e9392c6ddfb 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "d11cfdba-71df-4d49-a60f-29397352c308-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:04:18 compute-1 nova_compute[183083]: 2026-01-26 09:04:18.305 183087 DEBUG oslo_concurrency.lockutils [req-ffa06936-900b-4245-8238-a58269432da1 req-e01a9341-a69f-4e63-9556-0e9392c6ddfb 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "d11cfdba-71df-4d49-a60f-29397352c308-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:04:18 compute-1 nova_compute[183083]: 2026-01-26 09:04:18.305 183087 DEBUG nova.compute.manager [req-ffa06936-900b-4245-8238-a58269432da1 req-e01a9341-a69f-4e63-9556-0e9392c6ddfb 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] No waiting events found dispatching network-vif-plugged-73d02031-13f7-436d-bec5-c981a5c6c99b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 09:04:18 compute-1 nova_compute[183083]: 2026-01-26 09:04:18.306 183087 WARNING nova.compute.manager [req-ffa06936-900b-4245-8238-a58269432da1 req-e01a9341-a69f-4e63-9556-0e9392c6ddfb 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Received unexpected event network-vif-plugged-73d02031-13f7-436d-bec5-c981a5c6c99b for instance with vm_state active and task_state migrating.
Jan 26 09:04:18 compute-1 nova_compute[183083]: 2026-01-26 09:04:18.306 183087 DEBUG nova.compute.manager [req-ffa06936-900b-4245-8238-a58269432da1 req-e01a9341-a69f-4e63-9556-0e9392c6ddfb 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Received event network-changed-73d02031-13f7-436d-bec5-c981a5c6c99b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:04:18 compute-1 nova_compute[183083]: 2026-01-26 09:04:18.307 183087 DEBUG nova.compute.manager [req-ffa06936-900b-4245-8238-a58269432da1 req-e01a9341-a69f-4e63-9556-0e9392c6ddfb 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Refreshing instance network info cache due to event network-changed-73d02031-13f7-436d-bec5-c981a5c6c99b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 09:04:18 compute-1 nova_compute[183083]: 2026-01-26 09:04:18.308 183087 DEBUG oslo_concurrency.lockutils [req-ffa06936-900b-4245-8238-a58269432da1 req-e01a9341-a69f-4e63-9556-0e9392c6ddfb 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-d11cfdba-71df-4d49-a60f-29397352c308" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:04:18 compute-1 nova_compute[183083]: 2026-01-26 09:04:18.308 183087 DEBUG oslo_concurrency.lockutils [req-ffa06936-900b-4245-8238-a58269432da1 req-e01a9341-a69f-4e63-9556-0e9392c6ddfb 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-d11cfdba-71df-4d49-a60f-29397352c308" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:04:18 compute-1 nova_compute[183083]: 2026-01-26 09:04:18.309 183087 DEBUG nova.network.neutron [req-ffa06936-900b-4245-8238-a58269432da1 req-e01a9341-a69f-4e63-9556-0e9392c6ddfb 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Refreshing network info cache for port 73d02031-13f7-436d-bec5-c981a5c6c99b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 09:04:18 compute-1 nova_compute[183083]: 2026-01-26 09:04:18.719 183087 DEBUG nova.virt.libvirt.migration [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 26 09:04:18 compute-1 nova_compute[183083]: 2026-01-26 09:04:18.720 183087 INFO nova.virt.libvirt.migration [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 26 09:04:18 compute-1 nova_compute[183083]: 2026-01-26 09:04:18.775 183087 INFO nova.virt.libvirt.driver [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 26 09:04:19 compute-1 nova_compute[183083]: 2026-01-26 09:04:19.020 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:04:19 compute-1 nova_compute[183083]: 2026-01-26 09:04:19.278 183087 DEBUG nova.virt.libvirt.migration [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 26 09:04:19 compute-1 nova_compute[183083]: 2026-01-26 09:04:19.279 183087 DEBUG nova.virt.libvirt.migration [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 26 09:04:19 compute-1 nova_compute[183083]: 2026-01-26 09:04:19.783 183087 DEBUG nova.virt.libvirt.migration [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 26 09:04:19 compute-1 nova_compute[183083]: 2026-01-26 09:04:19.783 183087 DEBUG nova.virt.libvirt.migration [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 26 09:04:19 compute-1 podman[222275]: 2026-01-26 09:04:19.798029958 +0000 UTC m=+0.062575749 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41, distribution-scope=public, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, vcs-type=git, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 26 09:04:19 compute-1 podman[222274]: 2026-01-26 09:04:19.823482509 +0000 UTC m=+0.086206309 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 09:04:20 compute-1 nova_compute[183083]: 2026-01-26 09:04:20.001 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Updating instance_info_cache with network_info: [{"id": "896fca22-d4bb-4060-89c0-72ac1b8f6dd7", "address": "fa:16:3e:e0:2a:17", "network": {"id": "a7990128-24d3-4373-9624-cea49e6db86a", "bridge": "br-int", "label": "tempest-test-network--1014603776", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap896fca22-d4", "ovs_interfaceid": "896fca22-d4bb-4060-89c0-72ac1b8f6dd7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:04:20 compute-1 nova_compute[183083]: 2026-01-26 09:04:20.019 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Releasing lock "refresh_cache-bcd4a434-a1cf-402b-87c3-8d39bc284a82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:04:20 compute-1 nova_compute[183083]: 2026-01-26 09:04:20.020 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 26 09:04:20 compute-1 nova_compute[183083]: 2026-01-26 09:04:20.021 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:04:20 compute-1 nova_compute[183083]: 2026-01-26 09:04:20.023 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:04:20 compute-1 nova_compute[183083]: 2026-01-26 09:04:20.024 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:04:20 compute-1 nova_compute[183083]: 2026-01-26 09:04:20.024 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:04:20 compute-1 nova_compute[183083]: 2026-01-26 09:04:20.025 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:04:20 compute-1 nova_compute[183083]: 2026-01-26 09:04:20.026 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:04:20 compute-1 nova_compute[183083]: 2026-01-26 09:04:20.026 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:04:20 compute-1 nova_compute[183083]: 2026-01-26 09:04:20.303 183087 DEBUG nova.virt.libvirt.migration [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 26 09:04:20 compute-1 nova_compute[183083]: 2026-01-26 09:04:20.304 183087 DEBUG nova.virt.libvirt.migration [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 26 09:04:20 compute-1 nova_compute[183083]: 2026-01-26 09:04:20.307 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769418260.3072886, d11cfdba-71df-4d49-a60f-29397352c308 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 09:04:20 compute-1 nova_compute[183083]: 2026-01-26 09:04:20.308 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: d11cfdba-71df-4d49-a60f-29397352c308] VM Paused (Lifecycle Event)
Jan 26 09:04:20 compute-1 nova_compute[183083]: 2026-01-26 09:04:20.331 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:04:20 compute-1 nova_compute[183083]: 2026-01-26 09:04:20.336 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 09:04:20 compute-1 nova_compute[183083]: 2026-01-26 09:04:20.358 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: d11cfdba-71df-4d49-a60f-29397352c308] During sync_power_state the instance has a pending task (migrating). Skip.
Jan 26 09:04:20 compute-1 kernel: tap73d02031-13 (unregistering): left promiscuous mode
Jan 26 09:04:20 compute-1 NetworkManager[55451]: <info>  [1769418260.4536] device (tap73d02031-13): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 09:04:20 compute-1 ovn_controller[95352]: 2026-01-26T09:04:20Z|00279|binding|INFO|Releasing lport 73d02031-13f7-436d-bec5-c981a5c6c99b from this chassis (sb_readonly=0)
Jan 26 09:04:20 compute-1 ovn_controller[95352]: 2026-01-26T09:04:20Z|00280|binding|INFO|Setting lport 73d02031-13f7-436d-bec5-c981a5c6c99b down in Southbound
Jan 26 09:04:20 compute-1 ovn_controller[95352]: 2026-01-26T09:04:20Z|00281|binding|INFO|Removing iface tap73d02031-13 ovn-installed in OVS
Jan 26 09:04:20 compute-1 nova_compute[183083]: 2026-01-26 09:04:20.468 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:04:20 compute-1 nova_compute[183083]: 2026-01-26 09:04:20.493 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:04:20 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:04:20.494 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b0:5c:01 10.100.0.28'], port_security=['fa:16:3e:b0:5c:01 10.100.0.28'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'b62a0d99-b340-4a52-961e-b6a31b1ea8c8'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.28/28', 'neutron:device_id': 'd11cfdba-71df-4d49-a60f-29397352c308', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a7990128-24d3-4373-9624-cea49e6db86a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2580bb16c90849c4b5919eb271774a06', 'neutron:revision_number': '8', 'neutron:security_group_ids': '30ca606d-e02a-4090-8787-8ade3dd6e02d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.233'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b6e8b8dd-7a7a-42f2-93d9-90eaac5d38d1, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=73d02031-13f7-436d-bec5-c981a5c6c99b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:04:20 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:04:20.497 104632 INFO neutron.agent.ovn.metadata.agent [-] Port 73d02031-13f7-436d-bec5-c981a5c6c99b in datapath a7990128-24d3-4373-9624-cea49e6db86a unbound from our chassis
Jan 26 09:04:20 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:04:20.499 104632 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a7990128-24d3-4373-9624-cea49e6db86a
Jan 26 09:04:20 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:04:20.524 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[96ecf24f-4232-4bb8-a242-b3410a413b23]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:04:20 compute-1 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000030.scope: Deactivated successfully.
Jan 26 09:04:20 compute-1 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000030.scope: Consumed 15.792s CPU time.
Jan 26 09:04:20 compute-1 systemd-machined[154360]: Machine qemu-16-instance-00000030 terminated.
Jan 26 09:04:20 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:04:20.567 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[7404125d-5252-4430-b65a-ff171c5ae732]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:04:20 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:04:20.574 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[b697d75f-1f91-4f53-845b-97be313e2d5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:04:20 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:04:20.611 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[f355eb1d-904b-45b4-a6b9-3072ed491709]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:04:20 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:04:20.642 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[be1afe2a-b428-4e90-8fc0-7dc41e35790f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa7990128-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d3:fd:3e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 8, 'rx_bytes': 1456, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 8, 'rx_bytes': 1456, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 452379, 'reachable_time': 20812, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222325, 'error': None, 'target': 'ovnmeta-a7990128-24d3-4373-9624-cea49e6db86a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:04:20 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:04:20.671 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[d3b15aec-a05b-4f2d-bd63-c6445692638c]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa7990128-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 452390, 'tstamp': 452390}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222331, 'error': None, 'target': 'ovnmeta-a7990128-24d3-4373-9624-cea49e6db86a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.18'], ['IFA_LOCAL', '10.100.0.18'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tapa7990128-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 452393, 'tstamp': 452393}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222331, 'error': None, 'target': 'ovnmeta-a7990128-24d3-4373-9624-cea49e6db86a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:04:20 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:04:20.673 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa7990128-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:04:20 compute-1 nova_compute[183083]: 2026-01-26 09:04:20.676 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:04:20 compute-1 nova_compute[183083]: 2026-01-26 09:04:20.681 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:04:20 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:04:20.682 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa7990128-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:04:20 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:04:20.682 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 09:04:20 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:04:20.683 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa7990128-20, col_values=(('external_ids', {'iface-id': 'e4a9eb60-6fc2-4712-a497-5aead39b5136'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:04:20 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:04:20.683 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 09:04:20 compute-1 nova_compute[183083]: 2026-01-26 09:04:20.698 183087 DEBUG nova.virt.libvirt.driver [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Jan 26 09:04:20 compute-1 nova_compute[183083]: 2026-01-26 09:04:20.699 183087 DEBUG nova.virt.libvirt.driver [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Jan 26 09:04:20 compute-1 nova_compute[183083]: 2026-01-26 09:04:20.699 183087 DEBUG nova.virt.libvirt.driver [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Jan 26 09:04:20 compute-1 nova_compute[183083]: 2026-01-26 09:04:20.808 183087 DEBUG nova.virt.libvirt.guest [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'd11cfdba-71df-4d49-a60f-29397352c308' (instance-00000030) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Jan 26 09:04:20 compute-1 nova_compute[183083]: 2026-01-26 09:04:20.809 183087 INFO nova.virt.libvirt.driver [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Migration operation has completed
Jan 26 09:04:20 compute-1 nova_compute[183083]: 2026-01-26 09:04:20.809 183087 INFO nova.compute.manager [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] _post_live_migration() is started..
Jan 26 09:04:20 compute-1 nova_compute[183083]: 2026-01-26 09:04:20.973 183087 DEBUG nova.network.neutron [req-ffa06936-900b-4245-8238-a58269432da1 req-e01a9341-a69f-4e63-9556-0e9392c6ddfb 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Updated VIF entry in instance network info cache for port 73d02031-13f7-436d-bec5-c981a5c6c99b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 09:04:20 compute-1 nova_compute[183083]: 2026-01-26 09:04:20.974 183087 DEBUG nova.network.neutron [req-ffa06936-900b-4245-8238-a58269432da1 req-e01a9341-a69f-4e63-9556-0e9392c6ddfb 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Updating instance_info_cache with network_info: [{"id": "73d02031-13f7-436d-bec5-c981a5c6c99b", "address": "fa:16:3e:b0:5c:01", "network": {"id": "a7990128-24d3-4373-9624-cea49e6db86a", "bridge": "br-int", "label": "tempest-test-network--1014603776", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73d02031-13", "ovs_interfaceid": "73d02031-13f7-436d-bec5-c981a5c6c99b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.012 183087 DEBUG oslo_concurrency.lockutils [req-ffa06936-900b-4245-8238-a58269432da1 req-e01a9341-a69f-4e63-9556-0e9392c6ddfb 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-d11cfdba-71df-4d49-a60f-29397352c308" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.117 183087 DEBUG nova.compute.manager [req-a2d51e9d-54bf-4186-b651-df18abe97626 req-4c65c3e8-3490-4573-a84a-e72c9ce10d62 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Received event network-vif-unplugged-73d02031-13f7-436d-bec5-c981a5c6c99b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.117 183087 DEBUG oslo_concurrency.lockutils [req-a2d51e9d-54bf-4186-b651-df18abe97626 req-4c65c3e8-3490-4573-a84a-e72c9ce10d62 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "d11cfdba-71df-4d49-a60f-29397352c308-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.118 183087 DEBUG oslo_concurrency.lockutils [req-a2d51e9d-54bf-4186-b651-df18abe97626 req-4c65c3e8-3490-4573-a84a-e72c9ce10d62 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "d11cfdba-71df-4d49-a60f-29397352c308-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.118 183087 DEBUG oslo_concurrency.lockutils [req-a2d51e9d-54bf-4186-b651-df18abe97626 req-4c65c3e8-3490-4573-a84a-e72c9ce10d62 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "d11cfdba-71df-4d49-a60f-29397352c308-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.118 183087 DEBUG nova.compute.manager [req-a2d51e9d-54bf-4186-b651-df18abe97626 req-4c65c3e8-3490-4573-a84a-e72c9ce10d62 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] No waiting events found dispatching network-vif-unplugged-73d02031-13f7-436d-bec5-c981a5c6c99b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.118 183087 DEBUG nova.compute.manager [req-a2d51e9d-54bf-4186-b651-df18abe97626 req-4c65c3e8-3490-4573-a84a-e72c9ce10d62 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Received event network-vif-unplugged-73d02031-13f7-436d-bec5-c981a5c6c99b for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.344 183087 DEBUG nova.compute.manager [req-91370569-0361-44fd-bdd6-820bf7fc842c req-c1dbbe17-768a-4ce4-b61b-544afe851417 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Received event network-vif-unplugged-73d02031-13f7-436d-bec5-c981a5c6c99b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.345 183087 DEBUG oslo_concurrency.lockutils [req-91370569-0361-44fd-bdd6-820bf7fc842c req-c1dbbe17-768a-4ce4-b61b-544afe851417 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "d11cfdba-71df-4d49-a60f-29397352c308-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.345 183087 DEBUG oslo_concurrency.lockutils [req-91370569-0361-44fd-bdd6-820bf7fc842c req-c1dbbe17-768a-4ce4-b61b-544afe851417 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "d11cfdba-71df-4d49-a60f-29397352c308-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.345 183087 DEBUG oslo_concurrency.lockutils [req-91370569-0361-44fd-bdd6-820bf7fc842c req-c1dbbe17-768a-4ce4-b61b-544afe851417 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "d11cfdba-71df-4d49-a60f-29397352c308-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.346 183087 DEBUG nova.compute.manager [req-91370569-0361-44fd-bdd6-820bf7fc842c req-c1dbbe17-768a-4ce4-b61b-544afe851417 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] No waiting events found dispatching network-vif-unplugged-73d02031-13f7-436d-bec5-c981a5c6c99b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.347 183087 DEBUG nova.compute.manager [req-91370569-0361-44fd-bdd6-820bf7fc842c req-c1dbbe17-768a-4ce4-b61b-544afe851417 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Received event network-vif-unplugged-73d02031-13f7-436d-bec5-c981a5c6c99b for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.467 183087 DEBUG nova.network.neutron [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Activated binding for port 73d02031-13f7-436d-bec5-c981a5c6c99b and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.468 183087 DEBUG nova.compute.manager [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "73d02031-13f7-436d-bec5-c981a5c6c99b", "address": "fa:16:3e:b0:5c:01", "network": {"id": "a7990128-24d3-4373-9624-cea49e6db86a", "bridge": "br-int", "label": "tempest-test-network--1014603776", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73d02031-13", "ovs_interfaceid": "73d02031-13f7-436d-bec5-c981a5c6c99b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.469 183087 DEBUG nova.virt.libvirt.vif [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T09:02:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-server-test-697225216',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-697225216',id=48,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCVUzFf6wSdyelY/UeY7cqTv8O0no1sfyU2c1QC2Iq0A42faUChCXJr9C1AIPerWETKRC47CWhTspqdq2Y/jF5MXoBNhFYSHEELzhDImF6CTLJqQZ+4txACe/VixshXUeg==',key_name='tempest-keypair-47432938',keypairs=<?>,launch_index=0,launched_at=2026-01-26T09:03:04Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2580bb16c90849c4b5919eb271774a06',ramdisk_id='',reservation_id='r-sbe72h54',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-OvnDvrTest-691788706',owner_user_name='tempest-OvnDvrTest-691788706-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T09:04:08Z,user_data=None,user_id='90104736f4ab4d81b09d1ff11e40f454',uuid=d11cfdba-71df-4d49-a60f-29397352c308,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "73d02031-13f7-436d-bec5-c981a5c6c99b", "address": "fa:16:3e:b0:5c:01", "network": {"id": "a7990128-24d3-4373-9624-cea49e6db86a", "bridge": "br-int", "label": "tempest-test-network--1014603776", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73d02031-13", "ovs_interfaceid": "73d02031-13f7-436d-bec5-c981a5c6c99b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.469 183087 DEBUG nova.network.os_vif_util [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Converting VIF {"id": "73d02031-13f7-436d-bec5-c981a5c6c99b", "address": "fa:16:3e:b0:5c:01", "network": {"id": "a7990128-24d3-4373-9624-cea49e6db86a", "bridge": "br-int", "label": "tempest-test-network--1014603776", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73d02031-13", "ovs_interfaceid": "73d02031-13f7-436d-bec5-c981a5c6c99b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.470 183087 DEBUG nova.network.os_vif_util [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b0:5c:01,bridge_name='br-int',has_traffic_filtering=True,id=73d02031-13f7-436d-bec5-c981a5c6c99b,network=Network(a7990128-24d3-4373-9624-cea49e6db86a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73d02031-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.470 183087 DEBUG os_vif [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b0:5c:01,bridge_name='br-int',has_traffic_filtering=True,id=73d02031-13f7-436d-bec5-c981a5c6c99b,network=Network(a7990128-24d3-4373-9624-cea49e6db86a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73d02031-13') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.472 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.473 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap73d02031-13, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.512 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.515 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.518 183087 INFO os_vif [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b0:5c:01,bridge_name='br-int',has_traffic_filtering=True,id=73d02031-13f7-436d-bec5-c981a5c6c99b,network=Network(a7990128-24d3-4373-9624-cea49e6db86a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73d02031-13')
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.518 183087 DEBUG oslo_concurrency.lockutils [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.519 183087 DEBUG oslo_concurrency.lockutils [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.519 183087 DEBUG oslo_concurrency.lockutils [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.519 183087 DEBUG nova.compute.manager [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.520 183087 INFO nova.virt.libvirt.driver [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Deleting instance files /var/lib/nova/instances/d11cfdba-71df-4d49-a60f-29397352c308_del
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.521 183087 INFO nova.virt.libvirt.driver [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Deletion of /var/lib/nova/instances/d11cfdba-71df-4d49-a60f-29397352c308_del complete
Jan 26 09:04:21 compute-1 podman[222346]: 2026-01-26 09:04:21.812124795 +0000 UTC m=+0.063565796 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 09:04:21 compute-1 podman[222347]: 2026-01-26 09:04:21.848067539 +0000 UTC m=+0.089558832 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 09:04:21 compute-1 podman[222345]: 2026-01-26 09:04:21.89861157 +0000 UTC m=+0.149326471 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.975 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.976 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.976 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:04:21 compute-1 nova_compute[183083]: 2026-01-26 09:04:21.977 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:04:22 compute-1 nova_compute[183083]: 2026-01-26 09:04:22.071 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:04:22 compute-1 nova_compute[183083]: 2026-01-26 09:04:22.150 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:04:22 compute-1 nova_compute[183083]: 2026-01-26 09:04:22.152 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:04:22 compute-1 nova_compute[183083]: 2026-01-26 09:04:22.241 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:04:22 compute-1 nova_compute[183083]: 2026-01-26 09:04:22.435 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:04:22 compute-1 nova_compute[183083]: 2026-01-26 09:04:22.436 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13526MB free_disk=113.06478500366211GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:04:22 compute-1 nova_compute[183083]: 2026-01-26 09:04:22.437 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:04:22 compute-1 nova_compute[183083]: 2026-01-26 09:04:22.437 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:04:22 compute-1 nova_compute[183083]: 2026-01-26 09:04:22.495 183087 INFO nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Updating resource usage from migration 1ff6e901-c30d-47f8-a58d-94967b04bf1e
Jan 26 09:04:22 compute-1 nova_compute[183083]: 2026-01-26 09:04:22.536 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Instance bcd4a434-a1cf-402b-87c3-8d39bc284a82 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 09:04:22 compute-1 nova_compute[183083]: 2026-01-26 09:04:22.537 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Migration 1ff6e901-c30d-47f8-a58d-94967b04bf1e is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 26 09:04:22 compute-1 nova_compute[183083]: 2026-01-26 09:04:22.537 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:04:22 compute-1 nova_compute[183083]: 2026-01-26 09:04:22.537 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=768MB phys_disk=119GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:04:22 compute-1 nova_compute[183083]: 2026-01-26 09:04:22.592 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:04:22 compute-1 nova_compute[183083]: 2026-01-26 09:04:22.606 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:04:22 compute-1 nova_compute[183083]: 2026-01-26 09:04:22.629 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:04:22 compute-1 nova_compute[183083]: 2026-01-26 09:04:22.629 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:04:22 compute-1 nova_compute[183083]: 2026-01-26 09:04:22.792 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:04:23 compute-1 nova_compute[183083]: 2026-01-26 09:04:23.199 183087 DEBUG nova.compute.manager [req-2a2cf9fe-89ac-4acb-a3c1-b9983dcd544d req-2b643d1d-b8ed-46a9-bc9f-21a191bfaa25 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Received event network-vif-plugged-73d02031-13f7-436d-bec5-c981a5c6c99b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:04:23 compute-1 nova_compute[183083]: 2026-01-26 09:04:23.199 183087 DEBUG oslo_concurrency.lockutils [req-2a2cf9fe-89ac-4acb-a3c1-b9983dcd544d req-2b643d1d-b8ed-46a9-bc9f-21a191bfaa25 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "d11cfdba-71df-4d49-a60f-29397352c308-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:04:23 compute-1 nova_compute[183083]: 2026-01-26 09:04:23.200 183087 DEBUG oslo_concurrency.lockutils [req-2a2cf9fe-89ac-4acb-a3c1-b9983dcd544d req-2b643d1d-b8ed-46a9-bc9f-21a191bfaa25 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "d11cfdba-71df-4d49-a60f-29397352c308-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:04:23 compute-1 nova_compute[183083]: 2026-01-26 09:04:23.200 183087 DEBUG oslo_concurrency.lockutils [req-2a2cf9fe-89ac-4acb-a3c1-b9983dcd544d req-2b643d1d-b8ed-46a9-bc9f-21a191bfaa25 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "d11cfdba-71df-4d49-a60f-29397352c308-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:04:23 compute-1 nova_compute[183083]: 2026-01-26 09:04:23.200 183087 DEBUG nova.compute.manager [req-2a2cf9fe-89ac-4acb-a3c1-b9983dcd544d req-2b643d1d-b8ed-46a9-bc9f-21a191bfaa25 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] No waiting events found dispatching network-vif-plugged-73d02031-13f7-436d-bec5-c981a5c6c99b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 09:04:23 compute-1 nova_compute[183083]: 2026-01-26 09:04:23.200 183087 WARNING nova.compute.manager [req-2a2cf9fe-89ac-4acb-a3c1-b9983dcd544d req-2b643d1d-b8ed-46a9-bc9f-21a191bfaa25 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Received unexpected event network-vif-plugged-73d02031-13f7-436d-bec5-c981a5c6c99b for instance with vm_state active and task_state migrating.
Jan 26 09:04:23 compute-1 nova_compute[183083]: 2026-01-26 09:04:23.201 183087 DEBUG nova.compute.manager [req-2a2cf9fe-89ac-4acb-a3c1-b9983dcd544d req-2b643d1d-b8ed-46a9-bc9f-21a191bfaa25 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Received event network-vif-plugged-73d02031-13f7-436d-bec5-c981a5c6c99b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:04:23 compute-1 nova_compute[183083]: 2026-01-26 09:04:23.201 183087 DEBUG oslo_concurrency.lockutils [req-2a2cf9fe-89ac-4acb-a3c1-b9983dcd544d req-2b643d1d-b8ed-46a9-bc9f-21a191bfaa25 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "d11cfdba-71df-4d49-a60f-29397352c308-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:04:23 compute-1 nova_compute[183083]: 2026-01-26 09:04:23.201 183087 DEBUG oslo_concurrency.lockutils [req-2a2cf9fe-89ac-4acb-a3c1-b9983dcd544d req-2b643d1d-b8ed-46a9-bc9f-21a191bfaa25 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "d11cfdba-71df-4d49-a60f-29397352c308-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:04:23 compute-1 nova_compute[183083]: 2026-01-26 09:04:23.201 183087 DEBUG oslo_concurrency.lockutils [req-2a2cf9fe-89ac-4acb-a3c1-b9983dcd544d req-2b643d1d-b8ed-46a9-bc9f-21a191bfaa25 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "d11cfdba-71df-4d49-a60f-29397352c308-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:04:23 compute-1 nova_compute[183083]: 2026-01-26 09:04:23.201 183087 DEBUG nova.compute.manager [req-2a2cf9fe-89ac-4acb-a3c1-b9983dcd544d req-2b643d1d-b8ed-46a9-bc9f-21a191bfaa25 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] No waiting events found dispatching network-vif-plugged-73d02031-13f7-436d-bec5-c981a5c6c99b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 09:04:23 compute-1 nova_compute[183083]: 2026-01-26 09:04:23.202 183087 WARNING nova.compute.manager [req-2a2cf9fe-89ac-4acb-a3c1-b9983dcd544d req-2b643d1d-b8ed-46a9-bc9f-21a191bfaa25 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Received unexpected event network-vif-plugged-73d02031-13f7-436d-bec5-c981a5c6c99b for instance with vm_state active and task_state migrating.
Jan 26 09:04:23 compute-1 nova_compute[183083]: 2026-01-26 09:04:23.202 183087 DEBUG nova.compute.manager [req-2a2cf9fe-89ac-4acb-a3c1-b9983dcd544d req-2b643d1d-b8ed-46a9-bc9f-21a191bfaa25 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Received event network-vif-plugged-73d02031-13f7-436d-bec5-c981a5c6c99b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:04:23 compute-1 nova_compute[183083]: 2026-01-26 09:04:23.202 183087 DEBUG oslo_concurrency.lockutils [req-2a2cf9fe-89ac-4acb-a3c1-b9983dcd544d req-2b643d1d-b8ed-46a9-bc9f-21a191bfaa25 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "d11cfdba-71df-4d49-a60f-29397352c308-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:04:23 compute-1 nova_compute[183083]: 2026-01-26 09:04:23.202 183087 DEBUG oslo_concurrency.lockutils [req-2a2cf9fe-89ac-4acb-a3c1-b9983dcd544d req-2b643d1d-b8ed-46a9-bc9f-21a191bfaa25 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "d11cfdba-71df-4d49-a60f-29397352c308-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:04:23 compute-1 nova_compute[183083]: 2026-01-26 09:04:23.202 183087 DEBUG oslo_concurrency.lockutils [req-2a2cf9fe-89ac-4acb-a3c1-b9983dcd544d req-2b643d1d-b8ed-46a9-bc9f-21a191bfaa25 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "d11cfdba-71df-4d49-a60f-29397352c308-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:04:23 compute-1 nova_compute[183083]: 2026-01-26 09:04:23.203 183087 DEBUG nova.compute.manager [req-2a2cf9fe-89ac-4acb-a3c1-b9983dcd544d req-2b643d1d-b8ed-46a9-bc9f-21a191bfaa25 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] No waiting events found dispatching network-vif-plugged-73d02031-13f7-436d-bec5-c981a5c6c99b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 09:04:23 compute-1 nova_compute[183083]: 2026-01-26 09:04:23.203 183087 WARNING nova.compute.manager [req-2a2cf9fe-89ac-4acb-a3c1-b9983dcd544d req-2b643d1d-b8ed-46a9-bc9f-21a191bfaa25 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Received unexpected event network-vif-plugged-73d02031-13f7-436d-bec5-c981a5c6c99b for instance with vm_state active and task_state migrating.
Jan 26 09:04:23 compute-1 nova_compute[183083]: 2026-01-26 09:04:23.203 183087 DEBUG nova.compute.manager [req-2a2cf9fe-89ac-4acb-a3c1-b9983dcd544d req-2b643d1d-b8ed-46a9-bc9f-21a191bfaa25 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Received event network-vif-plugged-73d02031-13f7-436d-bec5-c981a5c6c99b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:04:23 compute-1 nova_compute[183083]: 2026-01-26 09:04:23.203 183087 DEBUG oslo_concurrency.lockutils [req-2a2cf9fe-89ac-4acb-a3c1-b9983dcd544d req-2b643d1d-b8ed-46a9-bc9f-21a191bfaa25 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "d11cfdba-71df-4d49-a60f-29397352c308-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:04:23 compute-1 nova_compute[183083]: 2026-01-26 09:04:23.203 183087 DEBUG oslo_concurrency.lockutils [req-2a2cf9fe-89ac-4acb-a3c1-b9983dcd544d req-2b643d1d-b8ed-46a9-bc9f-21a191bfaa25 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "d11cfdba-71df-4d49-a60f-29397352c308-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:04:23 compute-1 nova_compute[183083]: 2026-01-26 09:04:23.204 183087 DEBUG oslo_concurrency.lockutils [req-2a2cf9fe-89ac-4acb-a3c1-b9983dcd544d req-2b643d1d-b8ed-46a9-bc9f-21a191bfaa25 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "d11cfdba-71df-4d49-a60f-29397352c308-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:04:23 compute-1 nova_compute[183083]: 2026-01-26 09:04:23.204 183087 DEBUG nova.compute.manager [req-2a2cf9fe-89ac-4acb-a3c1-b9983dcd544d req-2b643d1d-b8ed-46a9-bc9f-21a191bfaa25 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] No waiting events found dispatching network-vif-plugged-73d02031-13f7-436d-bec5-c981a5c6c99b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 09:04:23 compute-1 nova_compute[183083]: 2026-01-26 09:04:23.204 183087 WARNING nova.compute.manager [req-2a2cf9fe-89ac-4acb-a3c1-b9983dcd544d req-2b643d1d-b8ed-46a9-bc9f-21a191bfaa25 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Received unexpected event network-vif-plugged-73d02031-13f7-436d-bec5-c981a5c6c99b for instance with vm_state active and task_state migrating.
Jan 26 09:04:24 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Jan 26 09:04:24 compute-1 systemd[222241]: Activating special unit Exit the Session...
Jan 26 09:04:24 compute-1 systemd[222241]: Stopped target Main User Target.
Jan 26 09:04:24 compute-1 systemd[222241]: Stopped target Basic System.
Jan 26 09:04:24 compute-1 systemd[222241]: Stopped target Paths.
Jan 26 09:04:24 compute-1 systemd[222241]: Stopped target Sockets.
Jan 26 09:04:24 compute-1 systemd[222241]: Stopped target Timers.
Jan 26 09:04:24 compute-1 systemd[222241]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 26 09:04:24 compute-1 systemd[222241]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 26 09:04:24 compute-1 systemd[222241]: Closed D-Bus User Message Bus Socket.
Jan 26 09:04:24 compute-1 systemd[222241]: Stopped Create User's Volatile Files and Directories.
Jan 26 09:04:24 compute-1 systemd[222241]: Removed slice User Application Slice.
Jan 26 09:04:24 compute-1 systemd[222241]: Reached target Shutdown.
Jan 26 09:04:24 compute-1 systemd[222241]: Finished Exit the Session.
Jan 26 09:04:24 compute-1 systemd[222241]: Reached target Exit the Session.
Jan 26 09:04:24 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Jan 26 09:04:24 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Jan 26 09:04:24 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 26 09:04:24 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 26 09:04:24 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 26 09:04:24 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 26 09:04:24 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Jan 26 09:04:26 compute-1 nova_compute[183083]: 2026-01-26 09:04:26.514 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:04:27 compute-1 nova_compute[183083]: 2026-01-26 09:04:27.061 183087 DEBUG oslo_concurrency.lockutils [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "d11cfdba-71df-4d49-a60f-29397352c308-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:04:27 compute-1 nova_compute[183083]: 2026-01-26 09:04:27.061 183087 DEBUG oslo_concurrency.lockutils [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "d11cfdba-71df-4d49-a60f-29397352c308-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:04:27 compute-1 nova_compute[183083]: 2026-01-26 09:04:27.062 183087 DEBUG oslo_concurrency.lockutils [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "d11cfdba-71df-4d49-a60f-29397352c308-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:04:27 compute-1 nova_compute[183083]: 2026-01-26 09:04:27.087 183087 DEBUG oslo_concurrency.lockutils [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:04:27 compute-1 nova_compute[183083]: 2026-01-26 09:04:27.088 183087 DEBUG oslo_concurrency.lockutils [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:04:27 compute-1 nova_compute[183083]: 2026-01-26 09:04:27.088 183087 DEBUG oslo_concurrency.lockutils [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:04:27 compute-1 nova_compute[183083]: 2026-01-26 09:04:27.089 183087 DEBUG nova.compute.resource_tracker [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:04:27 compute-1 nova_compute[183083]: 2026-01-26 09:04:27.161 183087 DEBUG oslo_concurrency.processutils [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:04:27 compute-1 nova_compute[183083]: 2026-01-26 09:04:27.238 183087 DEBUG oslo_concurrency.processutils [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:04:27 compute-1 nova_compute[183083]: 2026-01-26 09:04:27.241 183087 DEBUG oslo_concurrency.processutils [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:04:27 compute-1 nova_compute[183083]: 2026-01-26 09:04:27.302 183087 DEBUG oslo_concurrency.processutils [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:04:27 compute-1 nova_compute[183083]: 2026-01-26 09:04:27.485 183087 WARNING nova.virt.libvirt.driver [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:04:27 compute-1 nova_compute[183083]: 2026-01-26 09:04:27.486 183087 DEBUG nova.compute.resource_tracker [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13522MB free_disk=113.06478500366211GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": 
"0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:04:27 compute-1 nova_compute[183083]: 2026-01-26 09:04:27.486 183087 DEBUG oslo_concurrency.lockutils [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:04:27 compute-1 nova_compute[183083]: 2026-01-26 09:04:27.487 183087 DEBUG oslo_concurrency.lockutils [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:04:27 compute-1 nova_compute[183083]: 2026-01-26 09:04:27.535 183087 DEBUG nova.compute.resource_tracker [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Migration for instance d11cfdba-71df-4d49-a60f-29397352c308 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 26 09:04:27 compute-1 nova_compute[183083]: 2026-01-26 09:04:27.551 183087 DEBUG nova.compute.resource_tracker [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Jan 26 09:04:27 compute-1 nova_compute[183083]: 2026-01-26 09:04:27.568 183087 DEBUG nova.compute.resource_tracker [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Instance bcd4a434-a1cf-402b-87c3-8d39bc284a82 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 09:04:27 compute-1 nova_compute[183083]: 2026-01-26 09:04:27.569 183087 DEBUG nova.compute.resource_tracker [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Migration 1ff6e901-c30d-47f8-a58d-94967b04bf1e is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 26 09:04:27 compute-1 nova_compute[183083]: 2026-01-26 09:04:27.569 183087 DEBUG nova.compute.resource_tracker [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:04:27 compute-1 nova_compute[183083]: 2026-01-26 09:04:27.569 183087 DEBUG nova.compute.resource_tracker [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=640MB phys_disk=119GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:04:27 compute-1 nova_compute[183083]: 2026-01-26 09:04:27.630 183087 DEBUG nova.compute.provider_tree [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:04:27 compute-1 nova_compute[183083]: 2026-01-26 09:04:27.642 183087 DEBUG nova.scheduler.client.report [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:04:27 compute-1 nova_compute[183083]: 2026-01-26 09:04:27.673 183087 DEBUG nova.compute.resource_tracker [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:04:27 compute-1 nova_compute[183083]: 2026-01-26 09:04:27.674 183087 DEBUG oslo_concurrency.lockutils [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:04:27 compute-1 nova_compute[183083]: 2026-01-26 09:04:27.680 183087 INFO nova.compute.manager [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Jan 26 09:04:27 compute-1 nova_compute[183083]: 2026-01-26 09:04:27.777 183087 INFO nova.scheduler.client.report [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Deleted allocation for migration 1ff6e901-c30d-47f8-a58d-94967b04bf1e
Jan 26 09:04:27 compute-1 nova_compute[183083]: 2026-01-26 09:04:27.778 183087 DEBUG nova.virt.libvirt.driver [None req-a09564b1-1541-4a77-bfd7-5b9556cde698 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Jan 26 09:04:27 compute-1 nova_compute[183083]: 2026-01-26 09:04:27.794 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:04:30 compute-1 ovn_controller[95352]: 2026-01-26T09:04:30Z|00282|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 26 09:04:30 compute-1 sshd-session[222427]: Accepted publickey for zuul from 38.102.83.66 port 46626 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:04:30 compute-1 systemd-logind[788]: New session 73 of user zuul.
Jan 26 09:04:30 compute-1 systemd[1]: Started Session 73 of User zuul.
Jan 26 09:04:30 compute-1 sshd-session[222427]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:04:30 compute-1 sshd-session[222431]: Accepted publickey for zuul from 38.102.83.66 port 46636 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:04:30 compute-1 systemd-logind[788]: New session 74 of user zuul.
Jan 26 09:04:30 compute-1 systemd[1]: Started Session 74 of User zuul.
Jan 26 09:04:30 compute-1 sshd-session[222431]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:04:30 compute-1 sudo[222435]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/mktemp
Jan 26 09:04:30 compute-1 sudo[222435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:04:30 compute-1 sudo[222435]: pam_unix(sudo:session): session closed for user root
Jan 26 09:04:30 compute-1 sudo[222460]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 tcpdump -s0 -Uni eth1 icmp and ether host fa:16:3e:b0:ef:ce -w /tmp/tmp.3CshN8EHUr
Jan 26 09:04:30 compute-1 sudo[222460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:04:31 compute-1 sshd-session[222434]: Connection closed by 38.102.83.66 port 46636
Jan 26 09:04:31 compute-1 sshd-session[222431]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:04:31 compute-1 systemd[1]: session-74.scope: Deactivated successfully.
Jan 26 09:04:31 compute-1 systemd-logind[788]: Session 74 logged out. Waiting for processes to exit.
Jan 26 09:04:31 compute-1 systemd-logind[788]: Removed session 74.
Jan 26 09:04:31 compute-1 nova_compute[183083]: 2026-01-26 09:04:31.518 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:04:32 compute-1 nova_compute[183083]: 2026-01-26 09:04:32.797 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:04:35 compute-1 nova_compute[183083]: 2026-01-26 09:04:35.696 183087 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769418260.6945846, d11cfdba-71df-4d49-a60f-29397352c308 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 09:04:35 compute-1 nova_compute[183083]: 2026-01-26 09:04:35.697 183087 INFO nova.compute.manager [-] [instance: d11cfdba-71df-4d49-a60f-29397352c308] VM Stopped (Lifecycle Event)
Jan 26 09:04:35 compute-1 nova_compute[183083]: 2026-01-26 09:04:35.717 183087 DEBUG nova.compute.manager [None req-7084db72-db42-4ad6-886f-a61cfd38145a - - - - - -] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:04:35 compute-1 podman[222486]: 2026-01-26 09:04:35.844962799 +0000 UTC m=+0.088367929 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 26 09:04:36 compute-1 nova_compute[183083]: 2026-01-26 09:04:36.521 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:04:37 compute-1 nova_compute[183083]: 2026-01-26 09:04:37.800 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:04:39 compute-1 sshd-session[222511]: Accepted publickey for zuul from 38.102.83.66 port 43694 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:04:39 compute-1 systemd-logind[788]: New session 75 of user zuul.
Jan 26 09:04:39 compute-1 systemd[1]: Started Session 75 of User zuul.
Jan 26 09:04:39 compute-1 sshd-session[222511]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:04:40 compute-1 sudo[222515]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 cat /tmp/tmp.3CshN8EHUr
Jan 26 09:04:40 compute-1 sudo[222515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:04:40 compute-1 sudo[222515]: pam_unix(sudo:session): session closed for user root
Jan 26 09:04:41 compute-1 sshd-session[222541]: Accepted publickey for zuul from 38.102.83.66 port 43706 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:04:41 compute-1 systemd-logind[788]: New session 76 of user zuul.
Jan 26 09:04:41 compute-1 systemd[1]: Started Session 76 of User zuul.
Jan 26 09:04:41 compute-1 sshd-session[222541]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:04:41 compute-1 sshd-session[222545]: Accepted publickey for zuul from 38.102.83.66 port 43712 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:04:41 compute-1 systemd-logind[788]: New session 77 of user zuul.
Jan 26 09:04:41 compute-1 systemd[1]: Started Session 77 of User zuul.
Jan 26 09:04:41 compute-1 sshd-session[222545]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:04:41 compute-1 sudo[222549]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/mktemp
Jan 26 09:04:41 compute-1 sudo[222549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:04:41 compute-1 sudo[222549]: pam_unix(sudo:session): session closed for user root
Jan 26 09:04:41 compute-1 nova_compute[183083]: 2026-01-26 09:04:41.523 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:04:41 compute-1 sudo[222574]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 tcpdump -s0 -Uni eth1 icmp and ether host fa:16:3e:83:0d:53 -w /tmp/tmp.a0eVW96QuN
Jan 26 09:04:41 compute-1 sudo[222574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:04:41 compute-1 sshd-session[222548]: Connection closed by 38.102.83.66 port 43712
Jan 26 09:04:41 compute-1 sshd-session[222545]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:04:41 compute-1 systemd[1]: session-77.scope: Deactivated successfully.
Jan 26 09:04:41 compute-1 systemd-logind[788]: Session 77 logged out. Waiting for processes to exit.
Jan 26 09:04:41 compute-1 systemd-logind[788]: Removed session 77.
Jan 26 09:04:42 compute-1 nova_compute[183083]: 2026-01-26 09:04:42.802 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:04:46 compute-1 nova_compute[183083]: 2026-01-26 09:04:46.533 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:04:47 compute-1 nova_compute[183083]: 2026-01-26 09:04:47.803 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:04:50 compute-1 sshd-session[222600]: Accepted publickey for zuul from 38.102.83.66 port 54446 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:04:50 compute-1 systemd-logind[788]: New session 78 of user zuul.
Jan 26 09:04:50 compute-1 systemd[1]: Started Session 78 of User zuul.
Jan 26 09:04:50 compute-1 sshd-session[222600]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:04:50 compute-1 podman[222602]: 2026-01-26 09:04:50.674230942 +0000 UTC m=+0.093798500 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Jan 26 09:04:50 compute-1 podman[222604]: 2026-01-26 09:04:50.678414609 +0000 UTC m=+0.096984740 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, container_name=openstack_network_exporter, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 26 09:04:50 compute-1 sudo[222635]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 cat /tmp/tmp.a0eVW96QuN
Jan 26 09:04:50 compute-1 sudo[222635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:04:50 compute-1 sudo[222635]: pam_unix(sudo:session): session closed for user root
Jan 26 09:04:51 compute-1 nova_compute[183083]: 2026-01-26 09:04:51.535 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:04:51 compute-1 sshd-session[222668]: Accepted publickey for zuul from 38.102.83.66 port 54458 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:04:51 compute-1 systemd-logind[788]: New session 79 of user zuul.
Jan 26 09:04:51 compute-1 systemd[1]: Started Session 79 of User zuul.
Jan 26 09:04:51 compute-1 sshd-session[222668]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:04:51 compute-1 podman[222672]: 2026-01-26 09:04:51.978509075 +0000 UTC m=+0.092304889 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 09:04:51 compute-1 podman[222671]: 2026-01-26 09:04:51.98119213 +0000 UTC m=+0.109469538 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:04:52 compute-1 sshd-session[222694]: Accepted publickey for zuul from 38.102.83.66 port 54472 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:04:52 compute-1 systemd-logind[788]: New session 80 of user zuul.
Jan 26 09:04:52 compute-1 systemd[1]: Started Session 80 of User zuul.
Jan 26 09:04:52 compute-1 sshd-session[222694]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:04:52 compute-1 podman[222707]: 2026-01-26 09:04:52.124929304 +0000 UTC m=+0.140532956 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Jan 26 09:04:52 compute-1 sudo[222746]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/mktemp
Jan 26 09:04:52 compute-1 sudo[222746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:04:52 compute-1 sudo[222746]: pam_unix(sudo:session): session closed for user root
Jan 26 09:04:52 compute-1 sudo[222772]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 tcpdump -s0 -Uni genev_sys_6081 icmp and ether host fa:16:3e:e0:2a:17 and ether host fa:16:3e:b0:5c:01 -w /tmp/tmp.MoyjUwK79l
Jan 26 09:04:52 compute-1 sudo[222772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:04:52 compute-1 sshd-session[222738]: Connection closed by 38.102.83.66 port 54472
Jan 26 09:04:52 compute-1 sshd-session[222694]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:04:52 compute-1 systemd[1]: session-80.scope: Deactivated successfully.
Jan 26 09:04:52 compute-1 systemd-logind[788]: Session 80 logged out. Waiting for processes to exit.
Jan 26 09:04:52 compute-1 systemd-logind[788]: Removed session 80.
Jan 26 09:04:52 compute-1 nova_compute[183083]: 2026-01-26 09:04:52.805 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:04:56 compute-1 nova_compute[183083]: 2026-01-26 09:04:56.538 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:04:57 compute-1 nova_compute[183083]: 2026-01-26 09:04:57.808 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:01 compute-1 sshd-session[222801]: Accepted publickey for zuul from 38.102.83.66 port 38992 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:05:01 compute-1 systemd-logind[788]: New session 81 of user zuul.
Jan 26 09:05:01 compute-1 systemd[1]: Started Session 81 of User zuul.
Jan 26 09:05:01 compute-1 sshd-session[222801]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:05:01 compute-1 sudo[222805]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 cat /tmp/tmp.MoyjUwK79l
Jan 26 09:05:01 compute-1 sudo[222805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:05:01 compute-1 sudo[222805]: pam_unix(sudo:session): session closed for user root
Jan 26 09:05:01 compute-1 nova_compute[183083]: 2026-01-26 09:05:01.541 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:02 compute-1 nova_compute[183083]: 2026-01-26 09:05:02.810 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:05:05.319 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:05:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:05:05.320 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:05:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:05:05.321 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:05:06 compute-1 sshd-session[222831]: Accepted publickey for zuul from 38.102.83.66 port 60898 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:05:06 compute-1 systemd-logind[788]: New session 82 of user zuul.
Jan 26 09:05:06 compute-1 systemd[1]: Started Session 82 of User zuul.
Jan 26 09:05:06 compute-1 nova_compute[183083]: 2026-01-26 09:05:06.544 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:06 compute-1 sshd-session[222831]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:05:06 compute-1 podman[222833]: 2026-01-26 09:05:06.619563111 +0000 UTC m=+0.087049982 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 26 09:05:06 compute-1 sudo[222852]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 rm -f /tmp/tmp.MoyjUwK79l
Jan 26 09:05:06 compute-1 sudo[222852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:05:06 compute-1 sudo[222852]: pam_unix(sudo:session): session closed for user root
Jan 26 09:05:06 compute-1 sshd-session[222843]: Connection closed by 38.102.83.66 port 60898
Jan 26 09:05:06 compute-1 sshd-session[222831]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:05:06 compute-1 systemd[1]: session-82.scope: Deactivated successfully.
Jan 26 09:05:06 compute-1 systemd-logind[788]: Session 82 logged out. Waiting for processes to exit.
Jan 26 09:05:06 compute-1 systemd-logind[788]: Removed session 82.
Jan 26 09:05:07 compute-1 nova_compute[183083]: 2026-01-26 09:05:07.814 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:10 compute-1 sudo[221616]: pam_unix(sudo:session): session closed for user root
Jan 26 09:05:11 compute-1 nova_compute[183083]: 2026-01-26 09:05:11.548 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:12 compute-1 nova_compute[183083]: 2026-01-26 09:05:12.817 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:13 compute-1 sshd-session[222886]: Accepted publickey for zuul from 38.102.83.66 port 44796 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:05:13 compute-1 systemd-logind[788]: New session 83 of user zuul.
Jan 26 09:05:13 compute-1 systemd[1]: Started Session 83 of User zuul.
Jan 26 09:05:13 compute-1 sshd-session[222886]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:05:13 compute-1 sudo[222890]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 rm -f /tmp/tmp.a0eVW96QuN
Jan 26 09:05:13 compute-1 sudo[222890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:05:13 compute-1 sudo[222890]: pam_unix(sudo:session): session closed for user root
Jan 26 09:05:14 compute-1 sshd-session[222889]: Connection closed by 38.102.83.66 port 44796
Jan 26 09:05:14 compute-1 sshd-session[222886]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:05:14 compute-1 systemd[1]: session-83.scope: Deactivated successfully.
Jan 26 09:05:14 compute-1 systemd-logind[788]: Session 83 logged out. Waiting for processes to exit.
Jan 26 09:05:14 compute-1 systemd-logind[788]: Removed session 83.
Jan 26 09:05:15 compute-1 nova_compute[183083]: 2026-01-26 09:05:15.630 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:05:15 compute-1 nova_compute[183083]: 2026-01-26 09:05:15.631 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:05:15 compute-1 nova_compute[183083]: 2026-01-26 09:05:15.662 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: d11cfdba-71df-4d49-a60f-29397352c308] Skipping network cache update for instance because it has been migrated to another host. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9902
Jan 26 09:05:15 compute-1 nova_compute[183083]: 2026-01-26 09:05:15.663 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 09:05:15 compute-1 nova_compute[183083]: 2026-01-26 09:05:15.663 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:05:15 compute-1 nova_compute[183083]: 2026-01-26 09:05:15.979 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:05:16 compute-1 nova_compute[183083]: 2026-01-26 09:05:16.551 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:17 compute-1 nova_compute[183083]: 2026-01-26 09:05:17.866 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:17 compute-1 nova_compute[183083]: 2026-01-26 09:05:17.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:05:17 compute-1 nova_compute[183083]: 2026-01-26 09:05:17.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:05:19 compute-1 nova_compute[183083]: 2026-01-26 09:05:19.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:05:20 compute-1 ovn_controller[95352]: 2026-01-26T09:05:20Z|00283|pinctrl|WARN|Dropped 199 log messages in last 77 seconds (most recently, 22 seconds ago) due to excessive rate
Jan 26 09:05:20 compute-1 ovn_controller[95352]: 2026-01-26T09:05:20Z|00284|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:05:20 compute-1 podman[222917]: 2026-01-26 09:05:20.829816108 +0000 UTC m=+0.090078757 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, config_id=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public, version=9.6, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, release=1755695350)
Jan 26 09:05:20 compute-1 podman[222916]: 2026-01-26 09:05:20.868682613 +0000 UTC m=+0.124254911 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:05:20 compute-1 nova_compute[183083]: 2026-01-26 09:05:20.946 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:05:20 compute-1 nova_compute[183083]: 2026-01-26 09:05:20.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:05:21 compute-1 sshd-session[222957]: Accepted publickey for zuul from 38.102.83.66 port 44802 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:05:21 compute-1 systemd-logind[788]: New session 84 of user zuul.
Jan 26 09:05:21 compute-1 systemd[1]: Started Session 84 of User zuul.
Jan 26 09:05:21 compute-1 sshd-session[222957]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:05:21 compute-1 sudo[221832]: pam_unix(sudo:session): session closed for user root
Jan 26 09:05:21 compute-1 sudo[222961]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 rm -f /tmp/tmp.3CshN8EHUr
Jan 26 09:05:21 compute-1 sudo[222961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:05:21 compute-1 sudo[222961]: pam_unix(sudo:session): session closed for user root
Jan 26 09:05:21 compute-1 nova_compute[183083]: 2026-01-26 09:05:21.580 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:21 compute-1 sshd-session[222960]: Connection closed by 38.102.83.66 port 44802
Jan 26 09:05:21 compute-1 sshd-session[222957]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:05:21 compute-1 systemd[1]: session-84.scope: Deactivated successfully.
Jan 26 09:05:21 compute-1 systemd-logind[788]: Session 84 logged out. Waiting for processes to exit.
Jan 26 09:05:21 compute-1 systemd-logind[788]: Removed session 84.
Jan 26 09:05:21 compute-1 nova_compute[183083]: 2026-01-26 09:05:21.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:05:21 compute-1 nova_compute[183083]: 2026-01-26 09:05:21.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:05:22 compute-1 podman[222988]: 2026-01-26 09:05:22.827958229 +0000 UTC m=+0.084646445 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 26 09:05:22 compute-1 podman[222989]: 2026-01-26 09:05:22.832426874 +0000 UTC m=+0.083599146 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 09:05:22 compute-1 nova_compute[183083]: 2026-01-26 09:05:22.869 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:22 compute-1 podman[222987]: 2026-01-26 09:05:22.870214979 +0000 UTC m=+0.130095524 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, 
org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 26 09:05:22 compute-1 nova_compute[183083]: 2026-01-26 09:05:22.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:05:22 compute-1 nova_compute[183083]: 2026-01-26 09:05:22.995 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:05:22 compute-1 nova_compute[183083]: 2026-01-26 09:05:22.996 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:05:22 compute-1 nova_compute[183083]: 2026-01-26 09:05:22.996 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:05:22 compute-1 nova_compute[183083]: 2026-01-26 09:05:22.996 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:05:23 compute-1 nova_compute[183083]: 2026-01-26 09:05:23.092 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:05:23 compute-1 nova_compute[183083]: 2026-01-26 09:05:23.194 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk --force-share --output=json" returned: 0 in 0.102s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:05:23 compute-1 nova_compute[183083]: 2026-01-26 09:05:23.195 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:05:23 compute-1 nova_compute[183083]: 2026-01-26 09:05:23.254 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcd4a434-a1cf-402b-87c3-8d39bc284a82/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:05:23 compute-1 nova_compute[183083]: 2026-01-26 09:05:23.418 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:05:23 compute-1 nova_compute[183083]: 2026-01-26 09:05:23.420 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13514MB free_disk=113.0648422241211GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:05:23 compute-1 nova_compute[183083]: 2026-01-26 09:05:23.420 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:05:23 compute-1 nova_compute[183083]: 2026-01-26 09:05:23.420 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:05:23 compute-1 nova_compute[183083]: 2026-01-26 09:05:23.488 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Instance bcd4a434-a1cf-402b-87c3-8d39bc284a82 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 09:05:23 compute-1 nova_compute[183083]: 2026-01-26 09:05:23.488 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:05:23 compute-1 nova_compute[183083]: 2026-01-26 09:05:23.489 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=640MB phys_disk=119GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:05:23 compute-1 nova_compute[183083]: 2026-01-26 09:05:23.504 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Refreshing inventories for resource provider 5203935e-446c-4e03-93fa-4c60d651e045 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 26 09:05:23 compute-1 nova_compute[183083]: 2026-01-26 09:05:23.522 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Updating ProviderTree inventory for provider 5203935e-446c-4e03-93fa-4c60d651e045 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 26 09:05:23 compute-1 nova_compute[183083]: 2026-01-26 09:05:23.523 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Updating inventory in ProviderTree for provider 5203935e-446c-4e03-93fa-4c60d651e045 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 26 09:05:23 compute-1 nova_compute[183083]: 2026-01-26 09:05:23.551 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Refreshing aggregate associations for resource provider 5203935e-446c-4e03-93fa-4c60d651e045, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 26 09:05:23 compute-1 nova_compute[183083]: 2026-01-26 09:05:23.574 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Refreshing trait associations for resource provider 5203935e-446c-4e03-93fa-4c60d651e045, traits: COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE4A,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SVM,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AVX,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX2,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_FMA3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,HW_CPU_X86_SSE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE41,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_BMI,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AESNI,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 26 09:05:23 compute-1 nova_compute[183083]: 2026-01-26 09:05:23.609 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:05:23 compute-1 nova_compute[183083]: 2026-01-26 09:05:23.622 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:05:23 compute-1 nova_compute[183083]: 2026-01-26 09:05:23.623 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:05:23 compute-1 nova_compute[183083]: 2026-01-26 09:05:23.623 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:05:26 compute-1 nova_compute[183083]: 2026-01-26 09:05:26.618 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:27 compute-1 nova_compute[183083]: 2026-01-26 09:05:27.916 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:28 compute-1 sshd-session[223060]: Accepted publickey for zuul from 38.102.83.66 port 59340 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:05:28 compute-1 systemd-logind[788]: New session 85 of user zuul.
Jan 26 09:05:28 compute-1 systemd[1]: Started Session 85 of User zuul.
Jan 26 09:05:28 compute-1 sshd-session[223060]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:05:28 compute-1 sudo[223064]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 rm -f /tmp/tmp.f0ETVFuXUO
Jan 26 09:05:28 compute-1 sudo[223064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:05:28 compute-1 sudo[223064]: pam_unix(sudo:session): session closed for user root
Jan 26 09:05:28 compute-1 sshd-session[223063]: Connection closed by 38.102.83.66 port 59340
Jan 26 09:05:28 compute-1 sshd-session[223060]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:05:28 compute-1 systemd[1]: session-85.scope: Deactivated successfully.
Jan 26 09:05:28 compute-1 systemd-logind[788]: Session 85 logged out. Waiting for processes to exit.
Jan 26 09:05:28 compute-1 systemd-logind[788]: Removed session 85.
Jan 26 09:05:31 compute-1 nova_compute[183083]: 2026-01-26 09:05:31.665 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:32 compute-1 sudo[221921]: pam_unix(sudo:session): session closed for user root
Jan 26 09:05:32 compute-1 nova_compute[183083]: 2026-01-26 09:05:32.958 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:35 compute-1 sshd-session[223090]: Accepted publickey for zuul from 38.102.83.66 port 33588 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:05:35 compute-1 systemd-logind[788]: New session 86 of user zuul.
Jan 26 09:05:35 compute-1 systemd[1]: Started Session 86 of User zuul.
Jan 26 09:05:35 compute-1 sshd-session[223090]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:05:36 compute-1 sudo[223094]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 rm -f /tmp/tmp.OzD50otnIU
Jan 26 09:05:36 compute-1 sudo[223094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:05:36 compute-1 sudo[223094]: pam_unix(sudo:session): session closed for user root
Jan 26 09:05:36 compute-1 sshd-session[223093]: Connection closed by 38.102.83.66 port 33588
Jan 26 09:05:36 compute-1 sshd-session[223090]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:05:36 compute-1 systemd[1]: session-86.scope: Deactivated successfully.
Jan 26 09:05:36 compute-1 systemd-logind[788]: Session 86 logged out. Waiting for processes to exit.
Jan 26 09:05:36 compute-1 systemd-logind[788]: Removed session 86.
Jan 26 09:05:36 compute-1 nova_compute[183083]: 2026-01-26 09:05:36.668 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:36 compute-1 podman[223120]: 2026-01-26 09:05:36.839842883 +0000 UTC m=+0.092182885 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 26 09:05:37 compute-1 nova_compute[183083]: 2026-01-26 09:05:37.961 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:41 compute-1 nova_compute[183083]: 2026-01-26 09:05:41.702 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:43 compute-1 nova_compute[183083]: 2026-01-26 09:05:43.008 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:43 compute-1 sshd-session[223144]: Accepted publickey for zuul from 38.102.83.66 port 33596 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:05:43 compute-1 systemd-logind[788]: New session 87 of user zuul.
Jan 26 09:05:43 compute-1 systemd[1]: Started Session 87 of User zuul.
Jan 26 09:05:43 compute-1 sshd-session[223144]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:05:43 compute-1 sudo[223148]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 rm -f /tmp/tmp.uH6uMfn0Lv
Jan 26 09:05:43 compute-1 sudo[223148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:05:43 compute-1 sudo[223148]: pam_unix(sudo:session): session closed for user root
Jan 26 09:05:43 compute-1 sshd-session[223147]: Connection closed by 38.102.83.66 port 33596
Jan 26 09:05:43 compute-1 sshd-session[223144]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:05:43 compute-1 systemd[1]: session-87.scope: Deactivated successfully.
Jan 26 09:05:43 compute-1 systemd-logind[788]: Session 87 logged out. Waiting for processes to exit.
Jan 26 09:05:43 compute-1 systemd-logind[788]: Removed session 87.
Jan 26 09:05:46 compute-1 nova_compute[183083]: 2026-01-26 09:05:46.737 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:47 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:05:47.804 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:05:47 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:05:47.806 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:05:47 compute-1 nova_compute[183083]: 2026-01-26 09:05:47.848 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:48 compute-1 nova_compute[183083]: 2026-01-26 09:05:48.009 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:49 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:05:49.809 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:05:51 compute-1 nova_compute[183083]: 2026-01-26 09:05:51.531 183087 DEBUG oslo_concurrency.lockutils [None req-5629f1b2-e79c-4725-ba78-bbdd286966b3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "bcd4a434-a1cf-402b-87c3-8d39bc284a82" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:05:51 compute-1 nova_compute[183083]: 2026-01-26 09:05:51.532 183087 DEBUG oslo_concurrency.lockutils [None req-5629f1b2-e79c-4725-ba78-bbdd286966b3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "bcd4a434-a1cf-402b-87c3-8d39bc284a82" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:05:51 compute-1 nova_compute[183083]: 2026-01-26 09:05:51.532 183087 DEBUG oslo_concurrency.lockutils [None req-5629f1b2-e79c-4725-ba78-bbdd286966b3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "bcd4a434-a1cf-402b-87c3-8d39bc284a82-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:05:51 compute-1 nova_compute[183083]: 2026-01-26 09:05:51.533 183087 DEBUG oslo_concurrency.lockutils [None req-5629f1b2-e79c-4725-ba78-bbdd286966b3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "bcd4a434-a1cf-402b-87c3-8d39bc284a82-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:05:51 compute-1 nova_compute[183083]: 2026-01-26 09:05:51.533 183087 DEBUG oslo_concurrency.lockutils [None req-5629f1b2-e79c-4725-ba78-bbdd286966b3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "bcd4a434-a1cf-402b-87c3-8d39bc284a82-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:05:51 compute-1 nova_compute[183083]: 2026-01-26 09:05:51.535 183087 INFO nova.compute.manager [None req-5629f1b2-e79c-4725-ba78-bbdd286966b3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Terminating instance
Jan 26 09:05:51 compute-1 nova_compute[183083]: 2026-01-26 09:05:51.537 183087 DEBUG nova.compute.manager [None req-5629f1b2-e79c-4725-ba78-bbdd286966b3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 09:05:51 compute-1 kernel: tap896fca22-d4 (unregistering): left promiscuous mode
Jan 26 09:05:51 compute-1 NetworkManager[55451]: <info>  [1769418351.5766] device (tap896fca22-d4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 09:05:51 compute-1 nova_compute[183083]: 2026-01-26 09:05:51.586 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:51 compute-1 ovn_controller[95352]: 2026-01-26T09:05:51Z|00285|binding|INFO|Releasing lport 896fca22-d4bb-4060-89c0-72ac1b8f6dd7 from this chassis (sb_readonly=0)
Jan 26 09:05:51 compute-1 ovn_controller[95352]: 2026-01-26T09:05:51Z|00286|binding|INFO|Setting lport 896fca22-d4bb-4060-89c0-72ac1b8f6dd7 down in Southbound
Jan 26 09:05:51 compute-1 ovn_controller[95352]: 2026-01-26T09:05:51Z|00287|binding|INFO|Removing iface tap896fca22-d4 ovn-installed in OVS
Jan 26 09:05:51 compute-1 nova_compute[183083]: 2026-01-26 09:05:51.590 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:51 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:05:51.605 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:2a:17 10.100.0.21'], port_security=['fa:16:3e:e0:2a:17 10.100.0.21'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.21/28', 'neutron:device_id': 'bcd4a434-a1cf-402b-87c3-8d39bc284a82', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a7990128-24d3-4373-9624-cea49e6db86a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2580bb16c90849c4b5919eb271774a06', 'neutron:revision_number': '13', 'neutron:security_group_ids': '30ca606d-e02a-4090-8787-8ade3dd6e02d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b6e8b8dd-7a7a-42f2-93d9-90eaac5d38d1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=896fca22-d4bb-4060-89c0-72ac1b8f6dd7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:05:51 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:05:51.607 104632 INFO neutron.agent.ovn.metadata.agent [-] Port 896fca22-d4bb-4060-89c0-72ac1b8f6dd7 in datapath a7990128-24d3-4373-9624-cea49e6db86a unbound from our chassis
Jan 26 09:05:51 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:05:51.610 104632 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a7990128-24d3-4373-9624-cea49e6db86a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 26 09:05:51 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:05:51.612 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[632c0553-0033-4fde-ab99-2e5648851050]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:05:51 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:05:51.613 104632 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a7990128-24d3-4373-9624-cea49e6db86a namespace which is not needed anymore
Jan 26 09:05:51 compute-1 nova_compute[183083]: 2026-01-26 09:05:51.622 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:51 compute-1 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d0000002f.scope: Deactivated successfully.
Jan 26 09:05:51 compute-1 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d0000002f.scope: Consumed 7.330s CPU time.
Jan 26 09:05:51 compute-1 systemd-machined[154360]: Machine qemu-17-instance-0000002f terminated.
Jan 26 09:05:51 compute-1 podman[223179]: 2026-01-26 09:05:51.708526447 +0000 UTC m=+0.087595847 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, io.openshift.expose-services=, architecture=x86_64, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter)
Jan 26 09:05:51 compute-1 podman[223177]: 2026-01-26 09:05:51.739489272 +0000 UTC m=+0.118553762 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 26 09:05:51 compute-1 nova_compute[183083]: 2026-01-26 09:05:51.739 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:51 compute-1 NetworkManager[55451]: <info>  [1769418351.7669] manager: (tap896fca22-d4): new Tun device (/org/freedesktop/NetworkManager/Devices/99)
Jan 26 09:05:51 compute-1 nova_compute[183083]: 2026-01-26 09:05:51.769 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:51 compute-1 neutron-haproxy-ovnmeta-a7990128-24d3-4373-9624-cea49e6db86a[221568]: [NOTICE]   (221572) : haproxy version is 2.8.14-c23fe91
Jan 26 09:05:51 compute-1 neutron-haproxy-ovnmeta-a7990128-24d3-4373-9624-cea49e6db86a[221568]: [NOTICE]   (221572) : path to executable is /usr/sbin/haproxy
Jan 26 09:05:51 compute-1 neutron-haproxy-ovnmeta-a7990128-24d3-4373-9624-cea49e6db86a[221568]: [WARNING]  (221572) : Exiting Master process...
Jan 26 09:05:51 compute-1 neutron-haproxy-ovnmeta-a7990128-24d3-4373-9624-cea49e6db86a[221568]: [ALERT]    (221572) : Current worker (221574) exited with code 143 (Terminated)
Jan 26 09:05:51 compute-1 neutron-haproxy-ovnmeta-a7990128-24d3-4373-9624-cea49e6db86a[221568]: [WARNING]  (221572) : All workers exited. Exiting... (0)
Jan 26 09:05:51 compute-1 nova_compute[183083]: 2026-01-26 09:05:51.774 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:51 compute-1 systemd[1]: libpod-da5df07fe6c3add65833c53811e185062c63eb02ff440018e2582eec7ce20cdf.scope: Deactivated successfully.
Jan 26 09:05:51 compute-1 podman[223233]: 2026-01-26 09:05:51.780695893 +0000 UTC m=+0.051354706 container died da5df07fe6c3add65833c53811e185062c63eb02ff440018e2582eec7ce20cdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a7990128-24d3-4373-9624-cea49e6db86a, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:05:51 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-da5df07fe6c3add65833c53811e185062c63eb02ff440018e2582eec7ce20cdf-userdata-shm.mount: Deactivated successfully.
Jan 26 09:05:51 compute-1 systemd[1]: var-lib-containers-storage-overlay-4530cb42aafbdc3bc2386ae57bae3f0dd63c2ee41fcc33ac3497061bf32f4a1b-merged.mount: Deactivated successfully.
Jan 26 09:05:51 compute-1 podman[223233]: 2026-01-26 09:05:51.82321346 +0000 UTC m=+0.093872273 container cleanup da5df07fe6c3add65833c53811e185062c63eb02ff440018e2582eec7ce20cdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a7990128-24d3-4373-9624-cea49e6db86a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 26 09:05:51 compute-1 nova_compute[183083]: 2026-01-26 09:05:51.825 183087 INFO nova.virt.libvirt.driver [-] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Instance destroyed successfully.
Jan 26 09:05:51 compute-1 nova_compute[183083]: 2026-01-26 09:05:51.826 183087 DEBUG nova.objects.instance [None req-5629f1b2-e79c-4725-ba78-bbdd286966b3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lazy-loading 'resources' on Instance uuid bcd4a434-a1cf-402b-87c3-8d39bc284a82 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 09:05:51 compute-1 systemd[1]: libpod-conmon-da5df07fe6c3add65833c53811e185062c63eb02ff440018e2582eec7ce20cdf.scope: Deactivated successfully.
Jan 26 09:05:51 compute-1 nova_compute[183083]: 2026-01-26 09:05:51.844 183087 DEBUG nova.virt.libvirt.vif [None req-5629f1b2-e79c-4725-ba78-bbdd286966b3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-26T09:02:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-server-test-1081491655',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1081491655',id=47,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCVUzFf6wSdyelY/UeY7cqTv8O0no1sfyU2c1QC2Iq0A42faUChCXJr9C1AIPerWETKRC47CWhTspqdq2Y/jF5MXoBNhFYSHEELzhDImF6CTLJqQZ+4txACe/VixshXUeg==',key_name='tempest-keypair-47432938',keypairs=<?>,launch_index=0,launched_at=2026-01-26T09:02:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2580bb16c90849c4b5919eb271774a06',ramdisk_id='',reservation_id='r-x2br1yxl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',clean_attempts='1',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input
_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-OvnDvrTest-691788706',owner_user_name='tempest-OvnDvrTest-691788706-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T09:04:06Z,user_data=None,user_id='90104736f4ab4d81b09d1ff11e40f454',uuid=bcd4a434-a1cf-402b-87c3-8d39bc284a82,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "896fca22-d4bb-4060-89c0-72ac1b8f6dd7", "address": "fa:16:3e:e0:2a:17", "network": {"id": "a7990128-24d3-4373-9624-cea49e6db86a", "bridge": "br-int", "label": "tempest-test-network--1014603776", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap896fca22-d4", "ovs_interfaceid": "896fca22-d4bb-4060-89c0-72ac1b8f6dd7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 09:05:51 compute-1 nova_compute[183083]: 2026-01-26 09:05:51.845 183087 DEBUG nova.network.os_vif_util [None req-5629f1b2-e79c-4725-ba78-bbdd286966b3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Converting VIF {"id": "896fca22-d4bb-4060-89c0-72ac1b8f6dd7", "address": "fa:16:3e:e0:2a:17", "network": {"id": "a7990128-24d3-4373-9624-cea49e6db86a", "bridge": "br-int", "label": "tempest-test-network--1014603776", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap896fca22-d4", "ovs_interfaceid": "896fca22-d4bb-4060-89c0-72ac1b8f6dd7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 09:05:51 compute-1 nova_compute[183083]: 2026-01-26 09:05:51.846 183087 DEBUG nova.network.os_vif_util [None req-5629f1b2-e79c-4725-ba78-bbdd286966b3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e0:2a:17,bridge_name='br-int',has_traffic_filtering=True,id=896fca22-d4bb-4060-89c0-72ac1b8f6dd7,network=Network(a7990128-24d3-4373-9624-cea49e6db86a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap896fca22-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 09:05:51 compute-1 nova_compute[183083]: 2026-01-26 09:05:51.846 183087 DEBUG os_vif [None req-5629f1b2-e79c-4725-ba78-bbdd286966b3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e0:2a:17,bridge_name='br-int',has_traffic_filtering=True,id=896fca22-d4bb-4060-89c0-72ac1b8f6dd7,network=Network(a7990128-24d3-4373-9624-cea49e6db86a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap896fca22-d4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 09:05:51 compute-1 nova_compute[183083]: 2026-01-26 09:05:51.847 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:51 compute-1 nova_compute[183083]: 2026-01-26 09:05:51.848 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap896fca22-d4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:05:51 compute-1 nova_compute[183083]: 2026-01-26 09:05:51.849 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:51 compute-1 nova_compute[183083]: 2026-01-26 09:05:51.851 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:51 compute-1 nova_compute[183083]: 2026-01-26 09:05:51.854 183087 INFO os_vif [None req-5629f1b2-e79c-4725-ba78-bbdd286966b3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e0:2a:17,bridge_name='br-int',has_traffic_filtering=True,id=896fca22-d4bb-4060-89c0-72ac1b8f6dd7,network=Network(a7990128-24d3-4373-9624-cea49e6db86a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap896fca22-d4')
Jan 26 09:05:51 compute-1 nova_compute[183083]: 2026-01-26 09:05:51.854 183087 INFO nova.virt.libvirt.driver [None req-5629f1b2-e79c-4725-ba78-bbdd286966b3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Deleting instance files /var/lib/nova/instances/bcd4a434-a1cf-402b-87c3-8d39bc284a82_del
Jan 26 09:05:51 compute-1 nova_compute[183083]: 2026-01-26 09:05:51.855 183087 INFO nova.virt.libvirt.driver [None req-5629f1b2-e79c-4725-ba78-bbdd286966b3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Deletion of /var/lib/nova/instances/bcd4a434-a1cf-402b-87c3-8d39bc284a82_del complete
Jan 26 09:05:51 compute-1 nova_compute[183083]: 2026-01-26 09:05:51.899 183087 INFO nova.compute.manager [None req-5629f1b2-e79c-4725-ba78-bbdd286966b3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Took 0.36 seconds to destroy the instance on the hypervisor.
Jan 26 09:05:51 compute-1 nova_compute[183083]: 2026-01-26 09:05:51.899 183087 DEBUG oslo.service.loopingcall [None req-5629f1b2-e79c-4725-ba78-bbdd286966b3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 26 09:05:51 compute-1 nova_compute[183083]: 2026-01-26 09:05:51.900 183087 DEBUG nova.compute.manager [-] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 09:05:51 compute-1 nova_compute[183083]: 2026-01-26 09:05:51.900 183087 DEBUG nova.network.neutron [-] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 09:05:51 compute-1 podman[223277]: 2026-01-26 09:05:51.90164698 +0000 UTC m=+0.052767044 container remove da5df07fe6c3add65833c53811e185062c63eb02ff440018e2582eec7ce20cdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a7990128-24d3-4373-9624-cea49e6db86a, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 26 09:05:51 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:05:51.910 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[f40242ac-0529-471e-9a59-9cdf2e9406af]: (4, ('Mon Jan 26 09:05:51 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a7990128-24d3-4373-9624-cea49e6db86a (da5df07fe6c3add65833c53811e185062c63eb02ff440018e2582eec7ce20cdf)\nda5df07fe6c3add65833c53811e185062c63eb02ff440018e2582eec7ce20cdf\nMon Jan 26 09:05:51 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a7990128-24d3-4373-9624-cea49e6db86a (da5df07fe6c3add65833c53811e185062c63eb02ff440018e2582eec7ce20cdf)\nda5df07fe6c3add65833c53811e185062c63eb02ff440018e2582eec7ce20cdf\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:05:51 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:05:51.912 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[af5e94f3-c882-4b07-9b5b-d89be7187221]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:05:51 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:05:51.913 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa7990128-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:05:51 compute-1 kernel: tapa7990128-20: left promiscuous mode
Jan 26 09:05:51 compute-1 nova_compute[183083]: 2026-01-26 09:05:51.915 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:51 compute-1 nova_compute[183083]: 2026-01-26 09:05:51.939 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:51 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:05:51.943 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[c03a2563-e74e-475b-b422-430e29a5967f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:05:51 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:05:51.956 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[53bb3e21-52d9-4974-b640-eb9b93354902]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:05:51 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:05:51.957 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[f888ad46-9904-4a2e-b1f1-7206d65a9686]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:05:51 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:05:51.973 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[af1dc170-635d-4c22-aff5-265091bbc3be]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 452370, 'reachable_time': 19446, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223294, 'error': None, 'target': 'ovnmeta-a7990128-24d3-4373-9624-cea49e6db86a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:05:51 compute-1 systemd[1]: run-netns-ovnmeta\x2da7990128\x2d24d3\x2d4373\x2d9624\x2dcea49e6db86a.mount: Deactivated successfully.
Jan 26 09:05:51 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:05:51.976 105024 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a7990128-24d3-4373-9624-cea49e6db86a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 26 09:05:51 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:05:51.976 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[8f9edf55-0d97-41e7-bc93-5e0ec60d61e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:05:52 compute-1 nova_compute[183083]: 2026-01-26 09:05:52.290 183087 DEBUG nova.compute.manager [req-0cb9b21d-562e-4133-8cd8-e0b0eef4ee3f req-453353c5-80d9-4b2d-a73b-726db82a83d9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Received event network-changed-896fca22-d4bb-4060-89c0-72ac1b8f6dd7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:05:52 compute-1 nova_compute[183083]: 2026-01-26 09:05:52.291 183087 DEBUG nova.compute.manager [req-0cb9b21d-562e-4133-8cd8-e0b0eef4ee3f req-453353c5-80d9-4b2d-a73b-726db82a83d9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Refreshing instance network info cache due to event network-changed-896fca22-d4bb-4060-89c0-72ac1b8f6dd7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 09:05:52 compute-1 nova_compute[183083]: 2026-01-26 09:05:52.291 183087 DEBUG oslo_concurrency.lockutils [req-0cb9b21d-562e-4133-8cd8-e0b0eef4ee3f req-453353c5-80d9-4b2d-a73b-726db82a83d9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-bcd4a434-a1cf-402b-87c3-8d39bc284a82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:05:52 compute-1 nova_compute[183083]: 2026-01-26 09:05:52.292 183087 DEBUG oslo_concurrency.lockutils [req-0cb9b21d-562e-4133-8cd8-e0b0eef4ee3f req-453353c5-80d9-4b2d-a73b-726db82a83d9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-bcd4a434-a1cf-402b-87c3-8d39bc284a82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:05:52 compute-1 nova_compute[183083]: 2026-01-26 09:05:52.292 183087 DEBUG nova.network.neutron [req-0cb9b21d-562e-4133-8cd8-e0b0eef4ee3f req-453353c5-80d9-4b2d-a73b-726db82a83d9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Refreshing network info cache for port 896fca22-d4bb-4060-89c0-72ac1b8f6dd7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 09:05:52 compute-1 nova_compute[183083]: 2026-01-26 09:05:52.445 183087 DEBUG nova.network.neutron [-] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:05:52 compute-1 nova_compute[183083]: 2026-01-26 09:05:52.457 183087 INFO nova.compute.manager [-] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Took 0.56 seconds to deallocate network for instance.
Jan 26 09:05:52 compute-1 nova_compute[183083]: 2026-01-26 09:05:52.484 183087 INFO nova.network.neutron [req-0cb9b21d-562e-4133-8cd8-e0b0eef4ee3f req-453353c5-80d9-4b2d-a73b-726db82a83d9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Port 896fca22-d4bb-4060-89c0-72ac1b8f6dd7 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Jan 26 09:05:52 compute-1 nova_compute[183083]: 2026-01-26 09:05:52.484 183087 DEBUG nova.network.neutron [req-0cb9b21d-562e-4133-8cd8-e0b0eef4ee3f req-453353c5-80d9-4b2d-a73b-726db82a83d9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:05:52 compute-1 nova_compute[183083]: 2026-01-26 09:05:52.499 183087 DEBUG oslo_concurrency.lockutils [None req-5629f1b2-e79c-4725-ba78-bbdd286966b3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:05:52 compute-1 nova_compute[183083]: 2026-01-26 09:05:52.500 183087 DEBUG oslo_concurrency.lockutils [None req-5629f1b2-e79c-4725-ba78-bbdd286966b3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:05:52 compute-1 nova_compute[183083]: 2026-01-26 09:05:52.502 183087 DEBUG oslo_concurrency.lockutils [req-0cb9b21d-562e-4133-8cd8-e0b0eef4ee3f req-453353c5-80d9-4b2d-a73b-726db82a83d9 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-bcd4a434-a1cf-402b-87c3-8d39bc284a82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:05:52 compute-1 nova_compute[183083]: 2026-01-26 09:05:52.568 183087 DEBUG nova.compute.provider_tree [None req-5629f1b2-e79c-4725-ba78-bbdd286966b3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:05:52 compute-1 nova_compute[183083]: 2026-01-26 09:05:52.586 183087 DEBUG nova.scheduler.client.report [None req-5629f1b2-e79c-4725-ba78-bbdd286966b3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:05:52 compute-1 nova_compute[183083]: 2026-01-26 09:05:52.609 183087 DEBUG oslo_concurrency.lockutils [None req-5629f1b2-e79c-4725-ba78-bbdd286966b3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:05:52 compute-1 nova_compute[183083]: 2026-01-26 09:05:52.650 183087 INFO nova.scheduler.client.report [None req-5629f1b2-e79c-4725-ba78-bbdd286966b3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Deleted allocations for instance bcd4a434-a1cf-402b-87c3-8d39bc284a82
Jan 26 09:05:52 compute-1 nova_compute[183083]: 2026-01-26 09:05:52.710 183087 DEBUG oslo_concurrency.lockutils [None req-5629f1b2-e79c-4725-ba78-bbdd286966b3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "bcd4a434-a1cf-402b-87c3-8d39bc284a82" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:05:53 compute-1 nova_compute[183083]: 2026-01-26 09:05:53.012 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:53 compute-1 nova_compute[183083]: 2026-01-26 09:05:53.647 183087 DEBUG nova.compute.manager [req-e8e916b8-4648-4289-acb6-6e5ce8f4fd17 req-7816d305-5167-47bc-9170-854392e53f17 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Received event network-vif-unplugged-896fca22-d4bb-4060-89c0-72ac1b8f6dd7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:05:53 compute-1 nova_compute[183083]: 2026-01-26 09:05:53.648 183087 DEBUG oslo_concurrency.lockutils [req-e8e916b8-4648-4289-acb6-6e5ce8f4fd17 req-7816d305-5167-47bc-9170-854392e53f17 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "bcd4a434-a1cf-402b-87c3-8d39bc284a82-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:05:53 compute-1 nova_compute[183083]: 2026-01-26 09:05:53.649 183087 DEBUG oslo_concurrency.lockutils [req-e8e916b8-4648-4289-acb6-6e5ce8f4fd17 req-7816d305-5167-47bc-9170-854392e53f17 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "bcd4a434-a1cf-402b-87c3-8d39bc284a82-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:05:53 compute-1 nova_compute[183083]: 2026-01-26 09:05:53.650 183087 DEBUG oslo_concurrency.lockutils [req-e8e916b8-4648-4289-acb6-6e5ce8f4fd17 req-7816d305-5167-47bc-9170-854392e53f17 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "bcd4a434-a1cf-402b-87c3-8d39bc284a82-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:05:53 compute-1 nova_compute[183083]: 2026-01-26 09:05:53.650 183087 DEBUG nova.compute.manager [req-e8e916b8-4648-4289-acb6-6e5ce8f4fd17 req-7816d305-5167-47bc-9170-854392e53f17 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] No waiting events found dispatching network-vif-unplugged-896fca22-d4bb-4060-89c0-72ac1b8f6dd7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 09:05:53 compute-1 nova_compute[183083]: 2026-01-26 09:05:53.651 183087 WARNING nova.compute.manager [req-e8e916b8-4648-4289-acb6-6e5ce8f4fd17 req-7816d305-5167-47bc-9170-854392e53f17 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Received unexpected event network-vif-unplugged-896fca22-d4bb-4060-89c0-72ac1b8f6dd7 for instance with vm_state deleted and task_state None.
Jan 26 09:05:53 compute-1 nova_compute[183083]: 2026-01-26 09:05:53.651 183087 DEBUG nova.compute.manager [req-e8e916b8-4648-4289-acb6-6e5ce8f4fd17 req-7816d305-5167-47bc-9170-854392e53f17 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Received event network-vif-plugged-896fca22-d4bb-4060-89c0-72ac1b8f6dd7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:05:53 compute-1 nova_compute[183083]: 2026-01-26 09:05:53.652 183087 DEBUG oslo_concurrency.lockutils [req-e8e916b8-4648-4289-acb6-6e5ce8f4fd17 req-7816d305-5167-47bc-9170-854392e53f17 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "bcd4a434-a1cf-402b-87c3-8d39bc284a82-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:05:53 compute-1 nova_compute[183083]: 2026-01-26 09:05:53.652 183087 DEBUG oslo_concurrency.lockutils [req-e8e916b8-4648-4289-acb6-6e5ce8f4fd17 req-7816d305-5167-47bc-9170-854392e53f17 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "bcd4a434-a1cf-402b-87c3-8d39bc284a82-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:05:53 compute-1 nova_compute[183083]: 2026-01-26 09:05:53.652 183087 DEBUG oslo_concurrency.lockutils [req-e8e916b8-4648-4289-acb6-6e5ce8f4fd17 req-7816d305-5167-47bc-9170-854392e53f17 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "bcd4a434-a1cf-402b-87c3-8d39bc284a82-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:05:53 compute-1 nova_compute[183083]: 2026-01-26 09:05:53.653 183087 DEBUG nova.compute.manager [req-e8e916b8-4648-4289-acb6-6e5ce8f4fd17 req-7816d305-5167-47bc-9170-854392e53f17 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] No waiting events found dispatching network-vif-plugged-896fca22-d4bb-4060-89c0-72ac1b8f6dd7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 09:05:53 compute-1 nova_compute[183083]: 2026-01-26 09:05:53.653 183087 WARNING nova.compute.manager [req-e8e916b8-4648-4289-acb6-6e5ce8f4fd17 req-7816d305-5167-47bc-9170-854392e53f17 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Received unexpected event network-vif-plugged-896fca22-d4bb-4060-89c0-72ac1b8f6dd7 for instance with vm_state deleted and task_state None.
Jan 26 09:05:53 compute-1 podman[223296]: 2026-01-26 09:05:53.845750212 +0000 UTC m=+0.102099102 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Jan 26 09:05:53 compute-1 podman[223297]: 2026-01-26 09:05:53.850472844 +0000 UTC m=+0.093259116 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 09:05:53 compute-1 podman[223295]: 2026-01-26 09:05:53.878854786 +0000 UTC m=+0.130688030 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Jan 26 09:05:54 compute-1 nova_compute[183083]: 2026-01-26 09:05:54.412 183087 DEBUG nova.compute.manager [req-1e9d28bf-b2d6-4301-95db-37adca6af7c1 req-c8cd9c17-496a-4269-87da-033386dbcb63 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Received event network-vif-deleted-896fca22-d4bb-4060-89c0-72ac1b8f6dd7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:05:56 compute-1 nova_compute[183083]: 2026-01-26 09:05:56.851 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:57 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 26 09:05:57 compute-1 nova_compute[183083]: 2026-01-26 09:05:57.515 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:05:58 compute-1 nova_compute[183083]: 2026-01-26 09:05:58.014 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:01 compute-1 nova_compute[183083]: 2026-01-26 09:06:01.793 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:01 compute-1 nova_compute[183083]: 2026-01-26 09:06:01.853 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:03 compute-1 nova_compute[183083]: 2026-01-26 09:06:03.053 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:06:03.745 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:06:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:06:03.746 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:06:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:06:03.746 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:06:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:06:03.746 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:06:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:06:03.746 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:06:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:06:03.746 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:06:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:06:03.746 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:06:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:06:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:06:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:06:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:06:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:06:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:06:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:06:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:06:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:06:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:06:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:06:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:06:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:06:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:06:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:06:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:06:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:06:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:06:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:06:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:06:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:06:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:06:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:06:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:06:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:06:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:06:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:06:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:06:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:06:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:06:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:06:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:06:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:06:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:06:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:06:03.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:06:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:06:05.320 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:06:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:06:05.321 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:06:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:06:05.321 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:06:06 compute-1 sshd-session[223364]: Invalid user ethereum from 2.57.122.238 port 43486
Jan 26 09:06:06 compute-1 sshd-session[223364]: Connection closed by invalid user ethereum 2.57.122.238 port 43486 [preauth]
Jan 26 09:06:06 compute-1 nova_compute[183083]: 2026-01-26 09:06:06.307 183087 DEBUG oslo_concurrency.lockutils [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Acquiring lock "6602f0c7-96fd-4c40-a71d-8909ba310a73" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:06:06 compute-1 nova_compute[183083]: 2026-01-26 09:06:06.308 183087 DEBUG oslo_concurrency.lockutils [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Lock "6602f0c7-96fd-4c40-a71d-8909ba310a73" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:06:06 compute-1 nova_compute[183083]: 2026-01-26 09:06:06.334 183087 DEBUG nova.compute.manager [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 09:06:06 compute-1 nova_compute[183083]: 2026-01-26 09:06:06.427 183087 DEBUG oslo_concurrency.lockutils [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:06:06 compute-1 nova_compute[183083]: 2026-01-26 09:06:06.428 183087 DEBUG oslo_concurrency.lockutils [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:06:06 compute-1 nova_compute[183083]: 2026-01-26 09:06:06.440 183087 DEBUG nova.virt.hardware [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 09:06:06 compute-1 nova_compute[183083]: 2026-01-26 09:06:06.441 183087 INFO nova.compute.claims [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Claim successful on node compute-1.ctlplane.example.com
Jan 26 09:06:06 compute-1 nova_compute[183083]: 2026-01-26 09:06:06.544 183087 DEBUG nova.compute.provider_tree [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:06:06 compute-1 nova_compute[183083]: 2026-01-26 09:06:06.558 183087 DEBUG nova.scheduler.client.report [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:06:06 compute-1 nova_compute[183083]: 2026-01-26 09:06:06.583 183087 DEBUG oslo_concurrency.lockutils [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:06:06 compute-1 nova_compute[183083]: 2026-01-26 09:06:06.583 183087 DEBUG nova.compute.manager [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 09:06:06 compute-1 nova_compute[183083]: 2026-01-26 09:06:06.635 183087 DEBUG nova.compute.manager [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 09:06:06 compute-1 nova_compute[183083]: 2026-01-26 09:06:06.635 183087 DEBUG nova.network.neutron [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 09:06:06 compute-1 nova_compute[183083]: 2026-01-26 09:06:06.656 183087 INFO nova.virt.libvirt.driver [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 09:06:06 compute-1 nova_compute[183083]: 2026-01-26 09:06:06.675 183087 DEBUG nova.compute.manager [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 09:06:06 compute-1 nova_compute[183083]: 2026-01-26 09:06:06.795 183087 DEBUG nova.compute.manager [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 09:06:06 compute-1 nova_compute[183083]: 2026-01-26 09:06:06.798 183087 DEBUG nova.virt.libvirt.driver [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 09:06:06 compute-1 nova_compute[183083]: 2026-01-26 09:06:06.799 183087 INFO nova.virt.libvirt.driver [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Creating image(s)
Jan 26 09:06:06 compute-1 nova_compute[183083]: 2026-01-26 09:06:06.800 183087 DEBUG oslo_concurrency.lockutils [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Acquiring lock "/var/lib/nova/instances/6602f0c7-96fd-4c40-a71d-8909ba310a73/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:06:06 compute-1 nova_compute[183083]: 2026-01-26 09:06:06.800 183087 DEBUG oslo_concurrency.lockutils [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Lock "/var/lib/nova/instances/6602f0c7-96fd-4c40-a71d-8909ba310a73/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:06:06 compute-1 nova_compute[183083]: 2026-01-26 09:06:06.802 183087 DEBUG oslo_concurrency.lockutils [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Lock "/var/lib/nova/instances/6602f0c7-96fd-4c40-a71d-8909ba310a73/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:06:06 compute-1 nova_compute[183083]: 2026-01-26 09:06:06.826 183087 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769418351.8145614, bcd4a434-a1cf-402b-87c3-8d39bc284a82 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 09:06:06 compute-1 nova_compute[183083]: 2026-01-26 09:06:06.826 183087 INFO nova.compute.manager [-] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] VM Stopped (Lifecycle Event)
Jan 26 09:06:06 compute-1 nova_compute[183083]: 2026-01-26 09:06:06.830 183087 DEBUG oslo_concurrency.processutils [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:06:06 compute-1 nova_compute[183083]: 2026-01-26 09:06:06.861 183087 DEBUG nova.compute.manager [None req-bf8637d8-dab5-4917-b31f-8a027dbb49ed - - - - - -] [instance: bcd4a434-a1cf-402b-87c3-8d39bc284a82] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:06:06 compute-1 nova_compute[183083]: 2026-01-26 09:06:06.862 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:06 compute-1 nova_compute[183083]: 2026-01-26 09:06:06.928 183087 DEBUG oslo_concurrency.processutils [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:06:06 compute-1 nova_compute[183083]: 2026-01-26 09:06:06.929 183087 DEBUG oslo_concurrency.lockutils [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Acquiring lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:06:06 compute-1 nova_compute[183083]: 2026-01-26 09:06:06.930 183087 DEBUG oslo_concurrency.lockutils [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:06:06 compute-1 nova_compute[183083]: 2026-01-26 09:06:06.954 183087 DEBUG oslo_concurrency.processutils [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:06:07 compute-1 nova_compute[183083]: 2026-01-26 09:06:07.034 183087 DEBUG oslo_concurrency.processutils [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:06:07 compute-1 nova_compute[183083]: 2026-01-26 09:06:07.036 183087 DEBUG oslo_concurrency.processutils [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/6602f0c7-96fd-4c40-a71d-8909ba310a73/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:06:07 compute-1 nova_compute[183083]: 2026-01-26 09:06:07.066 183087 DEBUG nova.policy [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f62d05840e2a48b2a2e3a2c53715fc82', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '21d2dd4efd74429aab05a84f55aaa4f9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 09:06:07 compute-1 nova_compute[183083]: 2026-01-26 09:06:07.089 183087 DEBUG oslo_concurrency.processutils [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/6602f0c7-96fd-4c40-a71d-8909ba310a73/disk 1073741824" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:06:07 compute-1 nova_compute[183083]: 2026-01-26 09:06:07.090 183087 DEBUG oslo_concurrency.lockutils [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:06:07 compute-1 nova_compute[183083]: 2026-01-26 09:06:07.090 183087 DEBUG oslo_concurrency.processutils [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:06:07 compute-1 nova_compute[183083]: 2026-01-26 09:06:07.179 183087 DEBUG oslo_concurrency.processutils [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:06:07 compute-1 nova_compute[183083]: 2026-01-26 09:06:07.181 183087 DEBUG nova.virt.disk.api [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Checking if we can resize image /var/lib/nova/instances/6602f0c7-96fd-4c40-a71d-8909ba310a73/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 26 09:06:07 compute-1 nova_compute[183083]: 2026-01-26 09:06:07.182 183087 DEBUG oslo_concurrency.processutils [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6602f0c7-96fd-4c40-a71d-8909ba310a73/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:06:07 compute-1 nova_compute[183083]: 2026-01-26 09:06:07.247 183087 DEBUG oslo_concurrency.processutils [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6602f0c7-96fd-4c40-a71d-8909ba310a73/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:06:07 compute-1 nova_compute[183083]: 2026-01-26 09:06:07.249 183087 DEBUG nova.virt.disk.api [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Cannot resize image /var/lib/nova/instances/6602f0c7-96fd-4c40-a71d-8909ba310a73/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 26 09:06:07 compute-1 nova_compute[183083]: 2026-01-26 09:06:07.249 183087 DEBUG nova.objects.instance [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Lazy-loading 'migration_context' on Instance uuid 6602f0c7-96fd-4c40-a71d-8909ba310a73 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 09:06:07 compute-1 nova_compute[183083]: 2026-01-26 09:06:07.267 183087 DEBUG nova.virt.libvirt.driver [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 26 09:06:07 compute-1 nova_compute[183083]: 2026-01-26 09:06:07.268 183087 DEBUG nova.virt.libvirt.driver [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Ensure instance console log exists: /var/lib/nova/instances/6602f0c7-96fd-4c40-a71d-8909ba310a73/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 26 09:06:07 compute-1 nova_compute[183083]: 2026-01-26 09:06:07.268 183087 DEBUG oslo_concurrency.lockutils [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:06:07 compute-1 nova_compute[183083]: 2026-01-26 09:06:07.269 183087 DEBUG oslo_concurrency.lockutils [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:06:07 compute-1 nova_compute[183083]: 2026-01-26 09:06:07.269 183087 DEBUG oslo_concurrency.lockutils [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:06:07 compute-1 ovn_controller[95352]: 2026-01-26T09:06:07Z|00288|pinctrl|WARN|Dropped 581 log messages in last 47 seconds (most recently, 6 seconds ago) due to excessive rate
Jan 26 09:06:07 compute-1 ovn_controller[95352]: 2026-01-26T09:06:07Z|00289|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:06:07 compute-1 nova_compute[183083]: 2026-01-26 09:06:07.807 183087 DEBUG nova.network.neutron [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Successfully created port: cea8b6c2-596d-48d7-8c77-8fcc98c0b60b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 09:06:07 compute-1 podman[223381]: 2026-01-26 09:06:07.848685098 +0000 UTC m=+0.096714912 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 09:06:08 compute-1 nova_compute[183083]: 2026-01-26 09:06:08.056 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:08 compute-1 nova_compute[183083]: 2026-01-26 09:06:08.455 183087 DEBUG nova.network.neutron [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Successfully updated port: cea8b6c2-596d-48d7-8c77-8fcc98c0b60b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 09:06:08 compute-1 nova_compute[183083]: 2026-01-26 09:06:08.478 183087 DEBUG oslo_concurrency.lockutils [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Acquiring lock "refresh_cache-6602f0c7-96fd-4c40-a71d-8909ba310a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:06:08 compute-1 nova_compute[183083]: 2026-01-26 09:06:08.478 183087 DEBUG oslo_concurrency.lockutils [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Acquired lock "refresh_cache-6602f0c7-96fd-4c40-a71d-8909ba310a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:06:08 compute-1 nova_compute[183083]: 2026-01-26 09:06:08.478 183087 DEBUG nova.network.neutron [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 09:06:08 compute-1 nova_compute[183083]: 2026-01-26 09:06:08.561 183087 DEBUG nova.compute.manager [req-a0f08bd4-ed4e-4b75-823b-6cd20384b400 req-c69dece7-3382-41fe-a4f0-6acc220800a2 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Received event network-changed-cea8b6c2-596d-48d7-8c77-8fcc98c0b60b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:06:08 compute-1 nova_compute[183083]: 2026-01-26 09:06:08.561 183087 DEBUG nova.compute.manager [req-a0f08bd4-ed4e-4b75-823b-6cd20384b400 req-c69dece7-3382-41fe-a4f0-6acc220800a2 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Refreshing instance network info cache due to event network-changed-cea8b6c2-596d-48d7-8c77-8fcc98c0b60b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 09:06:08 compute-1 nova_compute[183083]: 2026-01-26 09:06:08.562 183087 DEBUG oslo_concurrency.lockutils [req-a0f08bd4-ed4e-4b75-823b-6cd20384b400 req-c69dece7-3382-41fe-a4f0-6acc220800a2 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-6602f0c7-96fd-4c40-a71d-8909ba310a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:06:08 compute-1 nova_compute[183083]: 2026-01-26 09:06:08.618 183087 DEBUG nova.network.neutron [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.342 183087 DEBUG nova.network.neutron [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Updating instance_info_cache with network_info: [{"id": "cea8b6c2-596d-48d7-8c77-8fcc98c0b60b", "address": "fa:16:3e:fd:4a:3f", "network": {"id": "a8b7bcc5-afbc-4c9d-9487-2698bb9a045f", "bridge": "br-int", "label": "tempest-test-network--1324066625", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcea8b6c2-59", "ovs_interfaceid": "cea8b6c2-596d-48d7-8c77-8fcc98c0b60b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.361 183087 DEBUG oslo_concurrency.lockutils [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Releasing lock "refresh_cache-6602f0c7-96fd-4c40-a71d-8909ba310a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.361 183087 DEBUG nova.compute.manager [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Instance network_info: |[{"id": "cea8b6c2-596d-48d7-8c77-8fcc98c0b60b", "address": "fa:16:3e:fd:4a:3f", "network": {"id": "a8b7bcc5-afbc-4c9d-9487-2698bb9a045f", "bridge": "br-int", "label": "tempest-test-network--1324066625", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcea8b6c2-59", "ovs_interfaceid": "cea8b6c2-596d-48d7-8c77-8fcc98c0b60b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.362 183087 DEBUG oslo_concurrency.lockutils [req-a0f08bd4-ed4e-4b75-823b-6cd20384b400 req-c69dece7-3382-41fe-a4f0-6acc220800a2 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-6602f0c7-96fd-4c40-a71d-8909ba310a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.363 183087 DEBUG nova.network.neutron [req-a0f08bd4-ed4e-4b75-823b-6cd20384b400 req-c69dece7-3382-41fe-a4f0-6acc220800a2 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Refreshing network info cache for port cea8b6c2-596d-48d7-8c77-8fcc98c0b60b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.367 183087 DEBUG nova.virt.libvirt.driver [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Start _get_guest_xml network_info=[{"id": "cea8b6c2-596d-48d7-8c77-8fcc98c0b60b", "address": "fa:16:3e:fd:4a:3f", "network": {"id": "a8b7bcc5-afbc-4c9d-9487-2698bb9a045f", "bridge": "br-int", "label": "tempest-test-network--1324066625", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcea8b6c2-59", "ovs_interfaceid": "cea8b6c2-596d-48d7-8c77-8fcc98c0b60b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T08:43:05Z,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aa88264c487a43b2855a58eb0dd042c9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T08:43:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encryption_options': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.375 183087 WARNING nova.virt.libvirt.driver [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.386 183087 DEBUG nova.virt.libvirt.host [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.387 183087 DEBUG nova.virt.libvirt.host [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.390 183087 DEBUG nova.virt.libvirt.host [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.391 183087 DEBUG nova.virt.libvirt.host [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.392 183087 DEBUG nova.virt.libvirt.driver [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.392 183087 DEBUG nova.virt.hardware [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T08:43:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a7017323-cd35-470d-a718-faa6c6e97277',id=2,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T08:43:05Z,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aa88264c487a43b2855a58eb0dd042c9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T08:43:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.393 183087 DEBUG nova.virt.hardware [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.394 183087 DEBUG nova.virt.hardware [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.394 183087 DEBUG nova.virt.hardware [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.394 183087 DEBUG nova.virt.hardware [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.395 183087 DEBUG nova.virt.hardware [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.395 183087 DEBUG nova.virt.hardware [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.396 183087 DEBUG nova.virt.hardware [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.396 183087 DEBUG nova.virt.hardware [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.397 183087 DEBUG nova.virt.hardware [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.397 183087 DEBUG nova.virt.hardware [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.404 183087 DEBUG nova.virt.libvirt.vif [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T09:06:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-server-test-1740506536',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1740506536',id=49,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIFn3fPhxboXbgrEFzlTwh4C8Ll+n3HIgPrVu0g3HSyFNAN7PcvrywcLTWeUOpJlGU28isdDSO43TrUU7yocpXOGs6my4rqjK3p4DmJix6rCv9t8FEPa+QZ5iu6nhBpQaw==',key_name='tempest-keypair-test-421938435',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='21d2dd4efd74429aab05a84f55aaa4f9',ramdisk_id='',reservation_id='r-j426xds6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-OvnDvrTest-1505353326',owner_user_name='tempest-OvnDvrTest-1505353326-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T09:06:06Z,user_data=None,user_id='f62d05840e2a48b2a2e3a2c53715fc82',uuid=6602f0c7-96fd-4c40-a71d-8909ba310a73,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cea8b6c2-596d-48d7-8c77-8fcc98c0b60b", "address": "fa:16:3e:fd:4a:3f", "network": {"id": "a8b7bcc5-afbc-4c9d-9487-2698bb9a045f", "bridge": "br-int", "label": "tempest-test-network--1324066625", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcea8b6c2-59", "ovs_interfaceid": "cea8b6c2-596d-48d7-8c77-8fcc98c0b60b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.404 183087 DEBUG nova.network.os_vif_util [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Converting VIF {"id": "cea8b6c2-596d-48d7-8c77-8fcc98c0b60b", "address": "fa:16:3e:fd:4a:3f", "network": {"id": "a8b7bcc5-afbc-4c9d-9487-2698bb9a045f", "bridge": "br-int", "label": "tempest-test-network--1324066625", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcea8b6c2-59", "ovs_interfaceid": "cea8b6c2-596d-48d7-8c77-8fcc98c0b60b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.406 183087 DEBUG nova.network.os_vif_util [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:4a:3f,bridge_name='br-int',has_traffic_filtering=True,id=cea8b6c2-596d-48d7-8c77-8fcc98c0b60b,network=Network(a8b7bcc5-afbc-4c9d-9487-2698bb9a045f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcea8b6c2-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.408 183087 DEBUG nova.objects.instance [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6602f0c7-96fd-4c40-a71d-8909ba310a73 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.422 183087 DEBUG nova.virt.libvirt.driver [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] End _get_guest_xml xml=<domain type="kvm">
Jan 26 09:06:09 compute-1 nova_compute[183083]:   <uuid>6602f0c7-96fd-4c40-a71d-8909ba310a73</uuid>
Jan 26 09:06:09 compute-1 nova_compute[183083]:   <name>instance-00000031</name>
Jan 26 09:06:09 compute-1 nova_compute[183083]:   <memory>131072</memory>
Jan 26 09:06:09 compute-1 nova_compute[183083]:   <vcpu>1</vcpu>
Jan 26 09:06:09 compute-1 nova_compute[183083]:   <metadata>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 09:06:09 compute-1 nova_compute[183083]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:       <nova:name>tempest-server-test-1740506536</nova:name>
Jan 26 09:06:09 compute-1 nova_compute[183083]:       <nova:creationTime>2026-01-26 09:06:09</nova:creationTime>
Jan 26 09:06:09 compute-1 nova_compute[183083]:       <nova:flavor name="m1.nano">
Jan 26 09:06:09 compute-1 nova_compute[183083]:         <nova:memory>128</nova:memory>
Jan 26 09:06:09 compute-1 nova_compute[183083]:         <nova:disk>1</nova:disk>
Jan 26 09:06:09 compute-1 nova_compute[183083]:         <nova:swap>0</nova:swap>
Jan 26 09:06:09 compute-1 nova_compute[183083]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 09:06:09 compute-1 nova_compute[183083]:         <nova:vcpus>1</nova:vcpus>
Jan 26 09:06:09 compute-1 nova_compute[183083]:       </nova:flavor>
Jan 26 09:06:09 compute-1 nova_compute[183083]:       <nova:owner>
Jan 26 09:06:09 compute-1 nova_compute[183083]:         <nova:user uuid="f62d05840e2a48b2a2e3a2c53715fc82">tempest-OvnDvrTest-1505353326-project-member</nova:user>
Jan 26 09:06:09 compute-1 nova_compute[183083]:         <nova:project uuid="21d2dd4efd74429aab05a84f55aaa4f9">tempest-OvnDvrTest-1505353326</nova:project>
Jan 26 09:06:09 compute-1 nova_compute[183083]:       </nova:owner>
Jan 26 09:06:09 compute-1 nova_compute[183083]:       <nova:root type="image" uuid="13d1a20a-8003-4f19-aba7-ccbd9eff9b82"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:       <nova:ports>
Jan 26 09:06:09 compute-1 nova_compute[183083]:         <nova:port uuid="cea8b6c2-596d-48d7-8c77-8fcc98c0b60b">
Jan 26 09:06:09 compute-1 nova_compute[183083]:           <nova:ip type="fixed" address="10.100.0.45" ipVersion="4"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:         </nova:port>
Jan 26 09:06:09 compute-1 nova_compute[183083]:       </nova:ports>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     </nova:instance>
Jan 26 09:06:09 compute-1 nova_compute[183083]:   </metadata>
Jan 26 09:06:09 compute-1 nova_compute[183083]:   <sysinfo type="smbios">
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <system>
Jan 26 09:06:09 compute-1 nova_compute[183083]:       <entry name="manufacturer">RDO</entry>
Jan 26 09:06:09 compute-1 nova_compute[183083]:       <entry name="product">OpenStack Compute</entry>
Jan 26 09:06:09 compute-1 nova_compute[183083]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 09:06:09 compute-1 nova_compute[183083]:       <entry name="serial">6602f0c7-96fd-4c40-a71d-8909ba310a73</entry>
Jan 26 09:06:09 compute-1 nova_compute[183083]:       <entry name="uuid">6602f0c7-96fd-4c40-a71d-8909ba310a73</entry>
Jan 26 09:06:09 compute-1 nova_compute[183083]:       <entry name="family">Virtual Machine</entry>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     </system>
Jan 26 09:06:09 compute-1 nova_compute[183083]:   </sysinfo>
Jan 26 09:06:09 compute-1 nova_compute[183083]:   <os>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <boot dev="hd"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <smbios mode="sysinfo"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:   </os>
Jan 26 09:06:09 compute-1 nova_compute[183083]:   <features>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <acpi/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <apic/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <vmcoreinfo/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:   </features>
Jan 26 09:06:09 compute-1 nova_compute[183083]:   <clock offset="utc">
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <timer name="hpet" present="no"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:   </clock>
Jan 26 09:06:09 compute-1 nova_compute[183083]:   <cpu mode="host-model" match="exact">
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:   </cpu>
Jan 26 09:06:09 compute-1 nova_compute[183083]:   <devices>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <disk type="file" device="disk">
Jan 26 09:06:09 compute-1 nova_compute[183083]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/6602f0c7-96fd-4c40-a71d-8909ba310a73/disk"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:       <target dev="vda" bus="virtio"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     </disk>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <disk type="file" device="cdrom">
Jan 26 09:06:09 compute-1 nova_compute[183083]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/6602f0c7-96fd-4c40-a71d-8909ba310a73/disk.config"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:       <target dev="sda" bus="sata"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     </disk>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <interface type="ethernet">
Jan 26 09:06:09 compute-1 nova_compute[183083]:       <mac address="fa:16:3e:fd:4a:3f"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:       <mtu size="1342"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:       <target dev="tapcea8b6c2-59"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     </interface>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <serial type="pty">
Jan 26 09:06:09 compute-1 nova_compute[183083]:       <log file="/var/lib/nova/instances/6602f0c7-96fd-4c40-a71d-8909ba310a73/console.log" append="off"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     </serial>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <video>
Jan 26 09:06:09 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     </video>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <input type="tablet" bus="usb"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <rng model="virtio">
Jan 26 09:06:09 compute-1 nova_compute[183083]:       <backend model="random">/dev/urandom</backend>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     </rng>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <controller type="usb" index="0"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     <memballoon model="virtio">
Jan 26 09:06:09 compute-1 nova_compute[183083]:       <stats period="10"/>
Jan 26 09:06:09 compute-1 nova_compute[183083]:     </memballoon>
Jan 26 09:06:09 compute-1 nova_compute[183083]:   </devices>
Jan 26 09:06:09 compute-1 nova_compute[183083]: </domain>
Jan 26 09:06:09 compute-1 nova_compute[183083]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.424 183087 DEBUG nova.compute.manager [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Preparing to wait for external event network-vif-plugged-cea8b6c2-596d-48d7-8c77-8fcc98c0b60b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.424 183087 DEBUG oslo_concurrency.lockutils [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Acquiring lock "6602f0c7-96fd-4c40-a71d-8909ba310a73-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.424 183087 DEBUG oslo_concurrency.lockutils [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Lock "6602f0c7-96fd-4c40-a71d-8909ba310a73-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.425 183087 DEBUG oslo_concurrency.lockutils [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Lock "6602f0c7-96fd-4c40-a71d-8909ba310a73-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.425 183087 DEBUG nova.virt.libvirt.vif [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T09:06:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-server-test-1740506536',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1740506536',id=49,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIFn3fPhxboXbgrEFzlTwh4C8Ll+n3HIgPrVu0g3HSyFNAN7PcvrywcLTWeUOpJlGU28isdDSO43TrUU7yocpXOGs6my4rqjK3p4DmJix6rCv9t8FEPa+QZ5iu6nhBpQaw==',key_name='tempest-keypair-test-421938435',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='21d2dd4efd74429aab05a84f55aaa4f9',ramdisk_id='',reservation_id='r-j426xds6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-OvnDvrTest-1505353326',owner_user_name='tempest-OvnDvrTest-1505353326-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T09:06:06Z,user_data=None,user_id='f62d05840e2a48b2a2e3a2c53715fc82',uuid=6602f0c7-96fd-4c40-a71d-8909ba310a73,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cea8b6c2-596d-48d7-8c77-8fcc98c0b60b", "address": "fa:16:3e:fd:4a:3f", "network": {"id": "a8b7bcc5-afbc-4c9d-9487-2698bb9a045f", "bridge": "br-int", "label": "tempest-test-network--1324066625", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcea8b6c2-59", "ovs_interfaceid": "cea8b6c2-596d-48d7-8c77-8fcc98c0b60b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.426 183087 DEBUG nova.network.os_vif_util [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Converting VIF {"id": "cea8b6c2-596d-48d7-8c77-8fcc98c0b60b", "address": "fa:16:3e:fd:4a:3f", "network": {"id": "a8b7bcc5-afbc-4c9d-9487-2698bb9a045f", "bridge": "br-int", "label": "tempest-test-network--1324066625", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcea8b6c2-59", "ovs_interfaceid": "cea8b6c2-596d-48d7-8c77-8fcc98c0b60b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.426 183087 DEBUG nova.network.os_vif_util [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:4a:3f,bridge_name='br-int',has_traffic_filtering=True,id=cea8b6c2-596d-48d7-8c77-8fcc98c0b60b,network=Network(a8b7bcc5-afbc-4c9d-9487-2698bb9a045f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcea8b6c2-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.426 183087 DEBUG os_vif [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:4a:3f,bridge_name='br-int',has_traffic_filtering=True,id=cea8b6c2-596d-48d7-8c77-8fcc98c0b60b,network=Network(a8b7bcc5-afbc-4c9d-9487-2698bb9a045f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcea8b6c2-59') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.427 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.427 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.428 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.431 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.432 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcea8b6c2-59, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.432 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcea8b6c2-59, col_values=(('external_ids', {'iface-id': 'cea8b6c2-596d-48d7-8c77-8fcc98c0b60b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fd:4a:3f', 'vm-uuid': '6602f0c7-96fd-4c40-a71d-8909ba310a73'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.434 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:09 compute-1 NetworkManager[55451]: <info>  [1769418369.4359] manager: (tapcea8b6c2-59): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/100)
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.437 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.444 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.446 183087 INFO os_vif [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:4a:3f,bridge_name='br-int',has_traffic_filtering=True,id=cea8b6c2-596d-48d7-8c77-8fcc98c0b60b,network=Network(a8b7bcc5-afbc-4c9d-9487-2698bb9a045f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcea8b6c2-59')
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.504 183087 DEBUG nova.virt.libvirt.driver [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.506 183087 DEBUG nova.virt.libvirt.driver [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.507 183087 DEBUG nova.virt.libvirt.driver [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] No VIF found with MAC fa:16:3e:fd:4a:3f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 26 09:06:09 compute-1 nova_compute[183083]: 2026-01-26 09:06:09.507 183087 INFO nova.virt.libvirt.driver [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Using config drive
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.001 183087 INFO nova.virt.libvirt.driver [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Creating config drive at /var/lib/nova/instances/6602f0c7-96fd-4c40-a71d-8909ba310a73/disk.config
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.011 183087 DEBUG oslo_concurrency.processutils [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6602f0c7-96fd-4c40-a71d-8909ba310a73/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwnqfg7aq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.154 183087 DEBUG oslo_concurrency.processutils [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6602f0c7-96fd-4c40-a71d-8909ba310a73/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwnqfg7aq" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:06:10 compute-1 kernel: tapcea8b6c2-59: entered promiscuous mode
Jan 26 09:06:10 compute-1 NetworkManager[55451]: <info>  [1769418370.2466] manager: (tapcea8b6c2-59): new Tun device (/org/freedesktop/NetworkManager/Devices/101)
Jan 26 09:06:10 compute-1 ovn_controller[95352]: 2026-01-26T09:06:10Z|00290|binding|INFO|Claiming lport cea8b6c2-596d-48d7-8c77-8fcc98c0b60b for this chassis.
Jan 26 09:06:10 compute-1 ovn_controller[95352]: 2026-01-26T09:06:10Z|00291|binding|INFO|cea8b6c2-596d-48d7-8c77-8fcc98c0b60b: Claiming fa:16:3e:fd:4a:3f 10.100.0.45
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.293 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:06:10.305 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:4a:3f 10.100.0.45'], port_security=['fa:16:3e:fd:4a:3f 10.100.0.45'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.45/28', 'neutron:device_id': '6602f0c7-96fd-4c40-a71d-8909ba310a73', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a8b7bcc5-afbc-4c9d-9487-2698bb9a045f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '21d2dd4efd74429aab05a84f55aaa4f9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c736fd14-de19-447f-bed9-105e0da8c605', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=79225713-54eb-4122-8bc5-04325a60d249, chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=cea8b6c2-596d-48d7-8c77-8fcc98c0b60b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:06:10.306 104632 INFO neutron.agent.ovn.metadata.agent [-] Port cea8b6c2-596d-48d7-8c77-8fcc98c0b60b in datapath a8b7bcc5-afbc-4c9d-9487-2698bb9a045f bound to our chassis
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:06:10.307 104632 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a8b7bcc5-afbc-4c9d-9487-2698bb9a045f
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:06:10.319 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[a88456df-e8c5-477c-b52b-f1dc995a1b82]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:06:10.320 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa8b7bcc5-a1 in ovnmeta-a8b7bcc5-afbc-4c9d-9487-2698bb9a045f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:06:10.322 212483 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa8b7bcc5-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:06:10.322 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[c4c93124-fe55-405e-b99b-b113d996a555]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:06:10 compute-1 ovn_controller[95352]: 2026-01-26T09:06:10Z|00292|binding|INFO|Setting lport cea8b6c2-596d-48d7-8c77-8fcc98c0b60b ovn-installed in OVS
Jan 26 09:06:10 compute-1 ovn_controller[95352]: 2026-01-26T09:06:10Z|00293|binding|INFO|Setting lport cea8b6c2-596d-48d7-8c77-8fcc98c0b60b up in Southbound
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:06:10.323 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[bca45948-af51-4325-bd4d-974b1f513ffa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.325 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:10 compute-1 systemd-udevd[223426]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:06:10.339 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[153cbf65-887f-4aaf-9ecc-c5301ad008e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:06:10 compute-1 systemd-machined[154360]: New machine qemu-18-instance-00000031.
Jan 26 09:06:10 compute-1 NetworkManager[55451]: <info>  [1769418370.3456] device (tapcea8b6c2-59): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 09:06:10 compute-1 NetworkManager[55451]: <info>  [1769418370.3462] device (tapcea8b6c2-59): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 09:06:10 compute-1 systemd[1]: Started Virtual Machine qemu-18-instance-00000031.
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:06:10.356 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[e0631f36-aafa-4cef-89e2-55d7878d7296]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:06:10.399 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[79b03e51-d1e7-4074-bf52-68ea8586f0dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:06:10 compute-1 NetworkManager[55451]: <info>  [1769418370.4082] manager: (tapa8b7bcc5-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/102)
Jan 26 09:06:10 compute-1 systemd-udevd[223430]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:06:10.409 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[edd7cf3a-5f96-48b5-8ffd-95ec780c2f24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:06:10.454 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[73d25be6-132a-44f9-b12a-ad25a0090bb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:06:10.459 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[85167eff-4731-4dab-87cb-c067d88c9ff9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:06:10 compute-1 NetworkManager[55451]: <info>  [1769418370.4905] device (tapa8b7bcc5-a0): carrier: link connected
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:06:10.497 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[e76b9c2e-d703-45a4-b477-b245e8a78c9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:06:10.522 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[72722971-81b1-4f1c-9c61-d99e3fecd0d8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa8b7bcc5-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8d:6e:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 64], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 471109, 'reachable_time': 42751, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223459, 'error': None, 'target': 'ovnmeta-a8b7bcc5-afbc-4c9d-9487-2698bb9a045f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:06:10.542 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[f7109f35-89fa-487a-b8b7-0a10be05850c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8d:6e12'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 471109, 'tstamp': 471109}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223460, 'error': None, 'target': 'ovnmeta-a8b7bcc5-afbc-4c9d-9487-2698bb9a045f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:06:10.561 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[c797806f-b566-414b-a3c6-49316f032283]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa8b7bcc5-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8d:6e:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 64], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 471109, 'reachable_time': 42751, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223461, 'error': None, 'target': 'ovnmeta-a8b7bcc5-afbc-4c9d-9487-2698bb9a045f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:06:10.602 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[5a3ce2ed-b365-4976-8522-f023e4dd7883]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:06:10.687 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[1fce97e5-0172-424a-ab89-a13a6dcc2dd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:06:10.689 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa8b7bcc5-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:06:10.689 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:06:10.690 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa8b7bcc5-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:06:10 compute-1 kernel: tapa8b7bcc5-a0: entered promiscuous mode
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.692 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:10 compute-1 NetworkManager[55451]: <info>  [1769418370.6937] manager: (tapa8b7bcc5-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/103)
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.695 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:06:10.698 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa8b7bcc5-a0, col_values=(('external_ids', {'iface-id': '2c310ca3-b600-4c8b-9cab-ad29b2f630de'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.699 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:10 compute-1 ovn_controller[95352]: 2026-01-26T09:06:10Z|00294|binding|INFO|Releasing lport 2c310ca3-b600-4c8b-9cab-ad29b2f630de from this chassis (sb_readonly=0)
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:06:10.701 104632 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a8b7bcc5-afbc-4c9d-9487-2698bb9a045f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a8b7bcc5-afbc-4c9d-9487-2698bb9a045f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:06:10.702 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[2ea19695-88c5-44bd-893e-9a3e10c15f7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:06:10.703 104632 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]: global
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]:     log         /dev/log local0 debug
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]:     log-tag     haproxy-metadata-proxy-a8b7bcc5-afbc-4c9d-9487-2698bb9a045f
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]:     user        root
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]:     group       root
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]:     maxconn     1024
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]:     pidfile     /var/lib/neutron/external/pids/a8b7bcc5-afbc-4c9d-9487-2698bb9a045f.pid.haproxy
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]:     daemon
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]: 
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]: defaults
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]:     log global
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]:     mode http
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]:     option httplog
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]:     option dontlognull
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]:     option http-server-close
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]:     option forwardfor
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]:     retries                 3
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]:     timeout http-request    30s
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]:     timeout connect         30s
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]:     timeout client          32s
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]:     timeout server          32s
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]:     timeout http-keep-alive 30s
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]: 
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]: 
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]: listen listener
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]:     bind 169.254.169.254:80
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]:     http-request add-header X-OVN-Network-ID a8b7bcc5-afbc-4c9d-9487-2698bb9a045f
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 26 09:06:10 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:06:10.704 104632 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a8b7bcc5-afbc-4c9d-9487-2698bb9a045f', 'env', 'PROCESS_TAG=haproxy-a8b7bcc5-afbc-4c9d-9487-2698bb9a045f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a8b7bcc5-afbc-4c9d-9487-2698bb9a045f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.713 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.718 183087 DEBUG nova.compute.manager [req-f5b02944-1970-487c-ab17-7e1791a17fec req-4f08bf20-703b-4bdb-83a0-cf3ebacaad32 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Received event network-vif-plugged-cea8b6c2-596d-48d7-8c77-8fcc98c0b60b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.718 183087 DEBUG oslo_concurrency.lockutils [req-f5b02944-1970-487c-ab17-7e1791a17fec req-4f08bf20-703b-4bdb-83a0-cf3ebacaad32 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "6602f0c7-96fd-4c40-a71d-8909ba310a73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.719 183087 DEBUG oslo_concurrency.lockutils [req-f5b02944-1970-487c-ab17-7e1791a17fec req-4f08bf20-703b-4bdb-83a0-cf3ebacaad32 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "6602f0c7-96fd-4c40-a71d-8909ba310a73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.719 183087 DEBUG oslo_concurrency.lockutils [req-f5b02944-1970-487c-ab17-7e1791a17fec req-4f08bf20-703b-4bdb-83a0-cf3ebacaad32 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "6602f0c7-96fd-4c40-a71d-8909ba310a73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.719 183087 DEBUG nova.compute.manager [req-f5b02944-1970-487c-ab17-7e1791a17fec req-4f08bf20-703b-4bdb-83a0-cf3ebacaad32 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Processing event network-vif-plugged-cea8b6c2-596d-48d7-8c77-8fcc98c0b60b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.789 183087 DEBUG nova.compute.manager [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.790 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769418370.7901907, 6602f0c7-96fd-4c40-a71d-8909ba310a73 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.790 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] VM Started (Lifecycle Event)
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.793 183087 DEBUG nova.virt.libvirt.driver [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.796 183087 INFO nova.virt.libvirt.driver [-] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Instance spawned successfully.
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.796 183087 DEBUG nova.virt.libvirt.driver [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.874 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.881 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.894 183087 DEBUG nova.virt.libvirt.driver [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.894 183087 DEBUG nova.virt.libvirt.driver [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.894 183087 DEBUG nova.virt.libvirt.driver [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.894 183087 DEBUG nova.virt.libvirt.driver [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.895 183087 DEBUG nova.virt.libvirt.driver [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.895 183087 DEBUG nova.virt.libvirt.driver [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.920 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.920 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769418370.7919629, 6602f0c7-96fd-4c40-a71d-8909ba310a73 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.920 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] VM Paused (Lifecycle Event)
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.951 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.955 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769418370.7922993, 6602f0c7-96fd-4c40-a71d-8909ba310a73 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.955 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] VM Resumed (Lifecycle Event)
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.961 183087 INFO nova.compute.manager [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Took 4.17 seconds to spawn the instance on the hypervisor.
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.962 183087 DEBUG nova.compute.manager [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.971 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.974 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 09:06:10 compute-1 nova_compute[183083]: 2026-01-26 09:06:10.999 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 09:06:11 compute-1 nova_compute[183083]: 2026-01-26 09:06:11.017 183087 INFO nova.compute.manager [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Took 4.62 seconds to build instance.
Jan 26 09:06:11 compute-1 nova_compute[183083]: 2026-01-26 09:06:11.034 183087 DEBUG oslo_concurrency.lockutils [None req-043ce705-2f60-43d0-9f75-8a59d3543410 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Lock "6602f0c7-96fd-4c40-a71d-8909ba310a73" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.726s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:06:11 compute-1 podman[223500]: 2026-01-26 09:06:11.062986553 +0000 UTC m=+0.066980452 container create f8589881e79ac2638ea762377ad9e3ccdd9fd7e501202de7652c3c55ab07b9d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a8b7bcc5-afbc-4c9d-9487-2698bb9a045f, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 26 09:06:11 compute-1 nova_compute[183083]: 2026-01-26 09:06:11.074 183087 DEBUG nova.network.neutron [req-a0f08bd4-ed4e-4b75-823b-6cd20384b400 req-c69dece7-3382-41fe-a4f0-6acc220800a2 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Updated VIF entry in instance network info cache for port cea8b6c2-596d-48d7-8c77-8fcc98c0b60b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 09:06:11 compute-1 nova_compute[183083]: 2026-01-26 09:06:11.075 183087 DEBUG nova.network.neutron [req-a0f08bd4-ed4e-4b75-823b-6cd20384b400 req-c69dece7-3382-41fe-a4f0-6acc220800a2 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Updating instance_info_cache with network_info: [{"id": "cea8b6c2-596d-48d7-8c77-8fcc98c0b60b", "address": "fa:16:3e:fd:4a:3f", "network": {"id": "a8b7bcc5-afbc-4c9d-9487-2698bb9a045f", "bridge": "br-int", "label": "tempest-test-network--1324066625", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcea8b6c2-59", "ovs_interfaceid": "cea8b6c2-596d-48d7-8c77-8fcc98c0b60b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:06:11 compute-1 nova_compute[183083]: 2026-01-26 09:06:11.096 183087 DEBUG oslo_concurrency.lockutils [req-a0f08bd4-ed4e-4b75-823b-6cd20384b400 req-c69dece7-3382-41fe-a4f0-6acc220800a2 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-6602f0c7-96fd-4c40-a71d-8909ba310a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:06:11 compute-1 systemd[1]: Started libpod-conmon-f8589881e79ac2638ea762377ad9e3ccdd9fd7e501202de7652c3c55ab07b9d5.scope.
Jan 26 09:06:11 compute-1 podman[223500]: 2026-01-26 09:06:11.021640258 +0000 UTC m=+0.025634167 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 09:06:11 compute-1 systemd[1]: Started libcrun container.
Jan 26 09:06:11 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8920cdb179c2717e5becd652e582fdd6cef4af4bcf85467454965ccbb5636459/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 09:06:11 compute-1 podman[223500]: 2026-01-26 09:06:11.155907868 +0000 UTC m=+0.159901877 container init f8589881e79ac2638ea762377ad9e3ccdd9fd7e501202de7652c3c55ab07b9d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a8b7bcc5-afbc-4c9d-9487-2698bb9a045f, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 26 09:06:11 compute-1 podman[223500]: 2026-01-26 09:06:11.163548401 +0000 UTC m=+0.167542310 container start f8589881e79ac2638ea762377ad9e3ccdd9fd7e501202de7652c3c55ab07b9d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a8b7bcc5-afbc-4c9d-9487-2698bb9a045f, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 26 09:06:11 compute-1 neutron-haproxy-ovnmeta-a8b7bcc5-afbc-4c9d-9487-2698bb9a045f[223515]: [NOTICE]   (223519) : New worker (223521) forked
Jan 26 09:06:11 compute-1 neutron-haproxy-ovnmeta-a8b7bcc5-afbc-4c9d-9487-2698bb9a045f[223515]: [NOTICE]   (223519) : Loading success.
Jan 26 09:06:12 compute-1 nova_compute[183083]: 2026-01-26 09:06:12.817 183087 DEBUG nova.compute.manager [req-f660e4f3-6d7f-4a84-a201-084941a4c885 req-5be82603-a711-4b61-8f73-78bec949e96c 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Received event network-vif-plugged-cea8b6c2-596d-48d7-8c77-8fcc98c0b60b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:06:12 compute-1 nova_compute[183083]: 2026-01-26 09:06:12.817 183087 DEBUG oslo_concurrency.lockutils [req-f660e4f3-6d7f-4a84-a201-084941a4c885 req-5be82603-a711-4b61-8f73-78bec949e96c 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "6602f0c7-96fd-4c40-a71d-8909ba310a73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:06:12 compute-1 nova_compute[183083]: 2026-01-26 09:06:12.818 183087 DEBUG oslo_concurrency.lockutils [req-f660e4f3-6d7f-4a84-a201-084941a4c885 req-5be82603-a711-4b61-8f73-78bec949e96c 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "6602f0c7-96fd-4c40-a71d-8909ba310a73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:06:12 compute-1 nova_compute[183083]: 2026-01-26 09:06:12.818 183087 DEBUG oslo_concurrency.lockutils [req-f660e4f3-6d7f-4a84-a201-084941a4c885 req-5be82603-a711-4b61-8f73-78bec949e96c 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "6602f0c7-96fd-4c40-a71d-8909ba310a73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:06:12 compute-1 nova_compute[183083]: 2026-01-26 09:06:12.818 183087 DEBUG nova.compute.manager [req-f660e4f3-6d7f-4a84-a201-084941a4c885 req-5be82603-a711-4b61-8f73-78bec949e96c 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] No waiting events found dispatching network-vif-plugged-cea8b6c2-596d-48d7-8c77-8fcc98c0b60b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 09:06:12 compute-1 nova_compute[183083]: 2026-01-26 09:06:12.819 183087 WARNING nova.compute.manager [req-f660e4f3-6d7f-4a84-a201-084941a4c885 req-5be82603-a711-4b61-8f73-78bec949e96c 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Received unexpected event network-vif-plugged-cea8b6c2-596d-48d7-8c77-8fcc98c0b60b for instance with vm_state active and task_state None.
Jan 26 09:06:13 compute-1 nova_compute[183083]: 2026-01-26 09:06:13.059 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:14 compute-1 nova_compute[183083]: 2026-01-26 09:06:14.436 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:15 compute-1 nova_compute[183083]: 2026-01-26 09:06:15.624 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:06:15 compute-1 nova_compute[183083]: 2026-01-26 09:06:15.625 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:06:15 compute-1 nova_compute[183083]: 2026-01-26 09:06:15.625 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:06:16 compute-1 nova_compute[183083]: 2026-01-26 09:06:16.018 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "refresh_cache-6602f0c7-96fd-4c40-a71d-8909ba310a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:06:16 compute-1 nova_compute[183083]: 2026-01-26 09:06:16.019 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquired lock "refresh_cache-6602f0c7-96fd-4c40-a71d-8909ba310a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:06:16 compute-1 nova_compute[183083]: 2026-01-26 09:06:16.020 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 26 09:06:16 compute-1 nova_compute[183083]: 2026-01-26 09:06:16.020 183087 DEBUG nova.objects.instance [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6602f0c7-96fd-4c40-a71d-8909ba310a73 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 09:06:17 compute-1 nova_compute[183083]: 2026-01-26 09:06:17.066 183087 DEBUG nova.compute.manager [req-759a9ee7-8387-4425-9359-a99362056e65 req-88bef2c9-e39b-4538-8c7a-3c17e31bb0c0 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Received event network-changed-cea8b6c2-596d-48d7-8c77-8fcc98c0b60b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:06:17 compute-1 nova_compute[183083]: 2026-01-26 09:06:17.067 183087 DEBUG nova.compute.manager [req-759a9ee7-8387-4425-9359-a99362056e65 req-88bef2c9-e39b-4538-8c7a-3c17e31bb0c0 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Refreshing instance network info cache due to event network-changed-cea8b6c2-596d-48d7-8c77-8fcc98c0b60b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 09:06:17 compute-1 nova_compute[183083]: 2026-01-26 09:06:17.068 183087 DEBUG oslo_concurrency.lockutils [req-759a9ee7-8387-4425-9359-a99362056e65 req-88bef2c9-e39b-4538-8c7a-3c17e31bb0c0 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-6602f0c7-96fd-4c40-a71d-8909ba310a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:06:18 compute-1 nova_compute[183083]: 2026-01-26 09:06:18.063 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:18 compute-1 nova_compute[183083]: 2026-01-26 09:06:18.181 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Updating instance_info_cache with network_info: [{"id": "cea8b6c2-596d-48d7-8c77-8fcc98c0b60b", "address": "fa:16:3e:fd:4a:3f", "network": {"id": "a8b7bcc5-afbc-4c9d-9487-2698bb9a045f", "bridge": "br-int", "label": "tempest-test-network--1324066625", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcea8b6c2-59", "ovs_interfaceid": "cea8b6c2-596d-48d7-8c77-8fcc98c0b60b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:06:18 compute-1 nova_compute[183083]: 2026-01-26 09:06:18.199 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Releasing lock "refresh_cache-6602f0c7-96fd-4c40-a71d-8909ba310a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:06:18 compute-1 nova_compute[183083]: 2026-01-26 09:06:18.199 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 26 09:06:18 compute-1 nova_compute[183083]: 2026-01-26 09:06:18.200 183087 DEBUG oslo_concurrency.lockutils [req-759a9ee7-8387-4425-9359-a99362056e65 req-88bef2c9-e39b-4538-8c7a-3c17e31bb0c0 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-6602f0c7-96fd-4c40-a71d-8909ba310a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:06:18 compute-1 nova_compute[183083]: 2026-01-26 09:06:18.201 183087 DEBUG nova.network.neutron [req-759a9ee7-8387-4425-9359-a99362056e65 req-88bef2c9-e39b-4538-8c7a-3c17e31bb0c0 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Refreshing network info cache for port cea8b6c2-596d-48d7-8c77-8fcc98c0b60b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 09:06:18 compute-1 nova_compute[183083]: 2026-01-26 09:06:18.203 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:06:18 compute-1 sshd-session[223530]: Accepted publickey for zuul from 38.102.83.66 port 50876 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:06:18 compute-1 systemd-logind[788]: New session 88 of user zuul.
Jan 26 09:06:18 compute-1 systemd[1]: Started Session 88 of User zuul.
Jan 26 09:06:18 compute-1 sshd-session[223530]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:06:18 compute-1 sshd-session[223534]: Accepted publickey for zuul from 38.102.83.66 port 50880 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:06:18 compute-1 systemd-logind[788]: New session 89 of user zuul.
Jan 26 09:06:18 compute-1 systemd[1]: Started Session 89 of User zuul.
Jan 26 09:06:18 compute-1 sshd-session[223534]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:06:18 compute-1 sudo[223538]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/mktemp
Jan 26 09:06:18 compute-1 sudo[223538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:06:18 compute-1 sudo[223538]: pam_unix(sudo:session): session closed for user root
Jan 26 09:06:18 compute-1 sudo[223563]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 tcpdump -s0 -Uni eth1 icmp and ether host fa:16:3e:1d:21:45 -w /tmp/tmp.qKcjuMXd7h
Jan 26 09:06:18 compute-1 sudo[223563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:06:18 compute-1 sshd-session[223537]: Connection closed by 38.102.83.66 port 50880
Jan 26 09:06:18 compute-1 sshd-session[223534]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:06:18 compute-1 systemd[1]: session-89.scope: Deactivated successfully.
Jan 26 09:06:18 compute-1 systemd-logind[788]: Session 89 logged out. Waiting for processes to exit.
Jan 26 09:06:18 compute-1 systemd-logind[788]: Removed session 89.
Jan 26 09:06:19 compute-1 nova_compute[183083]: 2026-01-26 09:06:19.439 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:19 compute-1 nova_compute[183083]: 2026-01-26 09:06:19.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:06:19 compute-1 nova_compute[183083]: 2026-01-26 09:06:19.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:06:20 compute-1 nova_compute[183083]: 2026-01-26 09:06:20.002 183087 DEBUG nova.network.neutron [req-759a9ee7-8387-4425-9359-a99362056e65 req-88bef2c9-e39b-4538-8c7a-3c17e31bb0c0 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Updated VIF entry in instance network info cache for port cea8b6c2-596d-48d7-8c77-8fcc98c0b60b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 09:06:20 compute-1 nova_compute[183083]: 2026-01-26 09:06:20.003 183087 DEBUG nova.network.neutron [req-759a9ee7-8387-4425-9359-a99362056e65 req-88bef2c9-e39b-4538-8c7a-3c17e31bb0c0 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Updating instance_info_cache with network_info: [{"id": "cea8b6c2-596d-48d7-8c77-8fcc98c0b60b", "address": "fa:16:3e:fd:4a:3f", "network": {"id": "a8b7bcc5-afbc-4c9d-9487-2698bb9a045f", "bridge": "br-int", "label": "tempest-test-network--1324066625", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcea8b6c2-59", "ovs_interfaceid": "cea8b6c2-596d-48d7-8c77-8fcc98c0b60b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:06:20 compute-1 nova_compute[183083]: 2026-01-26 09:06:20.017 183087 DEBUG oslo_concurrency.lockutils [req-759a9ee7-8387-4425-9359-a99362056e65 req-88bef2c9-e39b-4538-8c7a-3c17e31bb0c0 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-6602f0c7-96fd-4c40-a71d-8909ba310a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:06:20 compute-1 nova_compute[183083]: 2026-01-26 09:06:20.949 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:06:20 compute-1 nova_compute[183083]: 2026-01-26 09:06:20.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:06:21 compute-1 nova_compute[183083]: 2026-01-26 09:06:21.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:06:21 compute-1 nova_compute[183083]: 2026-01-26 09:06:21.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:06:21 compute-1 nova_compute[183083]: 2026-01-26 09:06:21.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:06:22 compute-1 ovn_controller[95352]: 2026-01-26T09:06:22Z|00045|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:fd:4a:3f 10.100.0.45
Jan 26 09:06:22 compute-1 ovn_controller[95352]: 2026-01-26T09:06:22Z|00046|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:fd:4a:3f 10.100.0.45
Jan 26 09:06:22 compute-1 podman[223598]: 2026-01-26 09:06:22.852731861 +0000 UTC m=+0.100818307 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, release=1755695350, name=ubi9-minimal, vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 26 09:06:22 compute-1 podman[223597]: 2026-01-26 09:06:22.885672601 +0000 UTC m=+0.135144005 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 26 09:06:23 compute-1 nova_compute[183083]: 2026-01-26 09:06:23.065 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:23 compute-1 nova_compute[183083]: 2026-01-26 09:06:23.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:06:23 compute-1 nova_compute[183083]: 2026-01-26 09:06:23.983 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:06:23 compute-1 nova_compute[183083]: 2026-01-26 09:06:23.984 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:06:23 compute-1 nova_compute[183083]: 2026-01-26 09:06:23.984 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:06:23 compute-1 nova_compute[183083]: 2026-01-26 09:06:23.984 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:06:24 compute-1 nova_compute[183083]: 2026-01-26 09:06:24.073 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6602f0c7-96fd-4c40-a71d-8909ba310a73/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:06:24 compute-1 podman[223640]: 2026-01-26 09:06:24.125734311 +0000 UTC m=+0.073272467 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 09:06:24 compute-1 podman[223639]: 2026-01-26 09:06:24.127147611 +0000 UTC m=+0.080695575 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent)
Jan 26 09:06:24 compute-1 nova_compute[183083]: 2026-01-26 09:06:24.158 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6602f0c7-96fd-4c40-a71d-8909ba310a73/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:06:24 compute-1 nova_compute[183083]: 2026-01-26 09:06:24.160 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6602f0c7-96fd-4c40-a71d-8909ba310a73/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:06:24 compute-1 podman[223638]: 2026-01-26 09:06:24.164809422 +0000 UTC m=+0.116440102 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:06:24 compute-1 nova_compute[183083]: 2026-01-26 09:06:24.249 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6602f0c7-96fd-4c40-a71d-8909ba310a73/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:06:24 compute-1 nova_compute[183083]: 2026-01-26 09:06:24.442 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:24 compute-1 nova_compute[183083]: 2026-01-26 09:06:24.448 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:06:24 compute-1 nova_compute[183083]: 2026-01-26 09:06:24.449 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13475MB free_disk=113.06489562988281GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:06:24 compute-1 nova_compute[183083]: 2026-01-26 09:06:24.449 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:06:24 compute-1 nova_compute[183083]: 2026-01-26 09:06:24.450 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:06:24 compute-1 nova_compute[183083]: 2026-01-26 09:06:24.516 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Instance 6602f0c7-96fd-4c40-a71d-8909ba310a73 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 09:06:24 compute-1 nova_compute[183083]: 2026-01-26 09:06:24.517 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:06:24 compute-1 nova_compute[183083]: 2026-01-26 09:06:24.517 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=640MB phys_disk=119GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:06:24 compute-1 nova_compute[183083]: 2026-01-26 09:06:24.560 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:06:24 compute-1 nova_compute[183083]: 2026-01-26 09:06:24.575 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:06:24 compute-1 nova_compute[183083]: 2026-01-26 09:06:24.596 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:06:24 compute-1 nova_compute[183083]: 2026-01-26 09:06:24.596 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:06:28 compute-1 nova_compute[183083]: 2026-01-26 09:06:28.068 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:29 compute-1 nova_compute[183083]: 2026-01-26 09:06:29.445 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:31 compute-1 sudo[222460]: pam_unix(sudo:session): session closed for user root
Jan 26 09:06:33 compute-1 nova_compute[183083]: 2026-01-26 09:06:33.117 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:34 compute-1 nova_compute[183083]: 2026-01-26 09:06:34.448 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:35 compute-1 sshd-session[223712]: Accepted publickey for zuul from 38.102.83.66 port 52818 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:06:35 compute-1 systemd-logind[788]: New session 90 of user zuul.
Jan 26 09:06:35 compute-1 systemd[1]: Started Session 90 of User zuul.
Jan 26 09:06:35 compute-1 sshd-session[223712]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:06:35 compute-1 sudo[223716]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 cat /tmp/tmp.qKcjuMXd7h
Jan 26 09:06:35 compute-1 sudo[223716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:06:35 compute-1 sudo[223716]: pam_unix(sudo:session): session closed for user root
Jan 26 09:06:38 compute-1 nova_compute[183083]: 2026-01-26 09:06:38.122 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:38 compute-1 podman[223742]: 2026-01-26 09:06:38.827565812 +0000 UTC m=+0.085268942 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 26 09:06:39 compute-1 nova_compute[183083]: 2026-01-26 09:06:39.174 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:39 compute-1 nova_compute[183083]: 2026-01-26 09:06:39.450 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:41 compute-1 sudo[222574]: pam_unix(sudo:session): session closed for user root
Jan 26 09:06:43 compute-1 nova_compute[183083]: 2026-01-26 09:06:43.124 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:44 compute-1 nova_compute[183083]: 2026-01-26 09:06:44.453 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:48 compute-1 nova_compute[183083]: 2026-01-26 09:06:48.171 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:49 compute-1 nova_compute[183083]: 2026-01-26 09:06:49.456 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:49 compute-1 nova_compute[183083]: 2026-01-26 09:06:49.755 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:49 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:06:49.755 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:06:49 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:06:49.758 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:06:51 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:06:51.760 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:06:52 compute-1 sudo[222772]: pam_unix(sudo:session): session closed for user root
Jan 26 09:06:53 compute-1 nova_compute[183083]: 2026-01-26 09:06:53.172 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:53 compute-1 podman[223766]: 2026-01-26 09:06:53.831892781 +0000 UTC m=+0.084673375 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, 
container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 26 09:06:53 compute-1 podman[223767]: 2026-01-26 09:06:53.837339163 +0000 UTC m=+0.085781146 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc.)
Jan 26 09:06:54 compute-1 nova_compute[183083]: 2026-01-26 09:06:54.461 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:54 compute-1 sshd-session[223807]: Accepted publickey for zuul from 38.102.83.66 port 43424 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:06:54 compute-1 systemd-logind[788]: New session 91 of user zuul.
Jan 26 09:06:54 compute-1 systemd[1]: Started Session 91 of User zuul.
Jan 26 09:06:54 compute-1 sshd-session[223807]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:06:54 compute-1 podman[223811]: 2026-01-26 09:06:54.627398887 +0000 UTC m=+0.106350481 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:06:54 compute-1 podman[223812]: 2026-01-26 09:06:54.634134705 +0000 UTC m=+0.105897478 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 09:06:54 compute-1 podman[223809]: 2026-01-26 09:06:54.654924716 +0000 UTC m=+0.146339148 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 26 09:06:54 compute-1 sshd-session[223863]: Accepted publickey for zuul from 38.102.83.66 port 43428 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:06:54 compute-1 systemd-logind[788]: New session 92 of user zuul.
Jan 26 09:06:54 compute-1 systemd[1]: Started Session 92 of User zuul.
Jan 26 09:06:54 compute-1 sshd-session[223863]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:06:54 compute-1 sudo[223882]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/mktemp
Jan 26 09:06:54 compute-1 sudo[223882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:06:54 compute-1 sudo[223882]: pam_unix(sudo:session): session closed for user root
Jan 26 09:06:54 compute-1 sudo[223907]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 tcpdump -s0 -Uni eth1 icmp and ether host fa:16:3e:45:d7:2f -w /tmp/tmp.QaGQB8Usyq
Jan 26 09:06:54 compute-1 sudo[223907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:06:54 compute-1 sshd-session[223881]: Connection closed by 38.102.83.66 port 43428
Jan 26 09:06:54 compute-1 sshd-session[223863]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:06:54 compute-1 systemd[1]: session-92.scope: Deactivated successfully.
Jan 26 09:06:54 compute-1 systemd-logind[788]: Session 92 logged out. Waiting for processes to exit.
Jan 26 09:06:54 compute-1 systemd-logind[788]: Removed session 92.
Jan 26 09:06:58 compute-1 nova_compute[183083]: 2026-01-26 09:06:58.174 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:06:58 compute-1 nova_compute[183083]: 2026-01-26 09:06:58.558 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:06:59 compute-1 nova_compute[183083]: 2026-01-26 09:06:59.464 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:07:03 compute-1 nova_compute[183083]: 2026-01-26 09:07:03.176 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:07:04 compute-1 nova_compute[183083]: 2026-01-26 09:07:04.467 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:07:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:07:05.321 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:07:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:07:05.322 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:07:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:07:05.323 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:07:05 compute-1 nova_compute[183083]: 2026-01-26 09:07:05.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:07:05 compute-1 nova_compute[183083]: 2026-01-26 09:07:05.953 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 26 09:07:05 compute-1 nova_compute[183083]: 2026-01-26 09:07:05.973 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 26 09:07:08 compute-1 nova_compute[183083]: 2026-01-26 09:07:08.177 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:07:09 compute-1 nova_compute[183083]: 2026-01-26 09:07:09.469 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:07:09 compute-1 podman[223940]: 2026-01-26 09:07:09.847372697 +0000 UTC m=+0.101681560 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 26 09:07:13 compute-1 nova_compute[183083]: 2026-01-26 09:07:13.188 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:07:14 compute-1 nova_compute[183083]: 2026-01-26 09:07:14.472 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:07:14 compute-1 nova_compute[183083]: 2026-01-26 09:07:14.548 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:07:14 compute-1 nova_compute[183083]: 2026-01-26 09:07:14.571 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Triggering sync for uuid 6602f0c7-96fd-4c40-a71d-8909ba310a73 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 26 09:07:14 compute-1 nova_compute[183083]: 2026-01-26 09:07:14.571 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "6602f0c7-96fd-4c40-a71d-8909ba310a73" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:07:14 compute-1 nova_compute[183083]: 2026-01-26 09:07:14.571 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "6602f0c7-96fd-4c40-a71d-8909ba310a73" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:07:14 compute-1 nova_compute[183083]: 2026-01-26 09:07:14.596 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "6602f0c7-96fd-4c40-a71d-8909ba310a73" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.025s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:07:15 compute-1 nova_compute[183083]: 2026-01-26 09:07:15.974 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:07:15 compute-1 nova_compute[183083]: 2026-01-26 09:07:15.974 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:07:15 compute-1 nova_compute[183083]: 2026-01-26 09:07:15.975 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:07:16 compute-1 nova_compute[183083]: 2026-01-26 09:07:16.146 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "refresh_cache-6602f0c7-96fd-4c40-a71d-8909ba310a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:07:16 compute-1 nova_compute[183083]: 2026-01-26 09:07:16.146 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquired lock "refresh_cache-6602f0c7-96fd-4c40-a71d-8909ba310a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:07:16 compute-1 nova_compute[183083]: 2026-01-26 09:07:16.146 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 26 09:07:16 compute-1 nova_compute[183083]: 2026-01-26 09:07:16.147 183087 DEBUG nova.objects.instance [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6602f0c7-96fd-4c40-a71d-8909ba310a73 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 09:07:18 compute-1 nova_compute[183083]: 2026-01-26 09:07:18.189 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:07:18 compute-1 nova_compute[183083]: 2026-01-26 09:07:18.284 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Updating instance_info_cache with network_info: [{"id": "cea8b6c2-596d-48d7-8c77-8fcc98c0b60b", "address": "fa:16:3e:fd:4a:3f", "network": {"id": "a8b7bcc5-afbc-4c9d-9487-2698bb9a045f", "bridge": "br-int", "label": "tempest-test-network--1324066625", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcea8b6c2-59", "ovs_interfaceid": "cea8b6c2-596d-48d7-8c77-8fcc98c0b60b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:07:18 compute-1 nova_compute[183083]: 2026-01-26 09:07:18.296 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Releasing lock "refresh_cache-6602f0c7-96fd-4c40-a71d-8909ba310a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:07:18 compute-1 nova_compute[183083]: 2026-01-26 09:07:18.297 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 26 09:07:18 compute-1 nova_compute[183083]: 2026-01-26 09:07:18.297 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:07:19 compute-1 nova_compute[183083]: 2026-01-26 09:07:19.475 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:07:20 compute-1 sshd-session[223964]: Accepted publickey for zuul from 38.102.83.66 port 43704 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:07:20 compute-1 systemd-logind[788]: New session 93 of user zuul.
Jan 26 09:07:20 compute-1 systemd[1]: Started Session 93 of User zuul.
Jan 26 09:07:20 compute-1 sshd-session[223964]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:07:20 compute-1 sudo[223968]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 cat /tmp/tmp.QaGQB8Usyq
Jan 26 09:07:20 compute-1 sudo[223968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:07:20 compute-1 sudo[223968]: pam_unix(sudo:session): session closed for user root
Jan 26 09:07:21 compute-1 nova_compute[183083]: 2026-01-26 09:07:21.270 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:07:21 compute-1 sshd-session[223994]: Accepted publickey for zuul from 38.102.83.66 port 43708 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:07:21 compute-1 systemd-logind[788]: New session 94 of user zuul.
Jan 26 09:07:21 compute-1 systemd[1]: Started Session 94 of User zuul.
Jan 26 09:07:21 compute-1 sshd-session[223994]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:07:21 compute-1 sshd-session[223998]: Accepted publickey for zuul from 38.102.83.66 port 43714 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:07:21 compute-1 systemd-logind[788]: New session 95 of user zuul.
Jan 26 09:07:21 compute-1 systemd[1]: Started Session 95 of User zuul.
Jan 26 09:07:21 compute-1 sshd-session[223998]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:07:21 compute-1 nova_compute[183083]: 2026-01-26 09:07:21.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:07:21 compute-1 nova_compute[183083]: 2026-01-26 09:07:21.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:07:21 compute-1 nova_compute[183083]: 2026-01-26 09:07:21.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:07:21 compute-1 nova_compute[183083]: 2026-01-26 09:07:21.953 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:07:21 compute-1 sudo[224002]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/mktemp
Jan 26 09:07:21 compute-1 sudo[224002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:07:22 compute-1 sudo[224002]: pam_unix(sudo:session): session closed for user root
Jan 26 09:07:22 compute-1 sudo[224027]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 tcpdump -s0 -Uni genev_sys_6081 'icmp and ((ether host fa:16:3e:fd:4a:3f and ether host fa:16:3e:9a:ee:88) or (ether host fa:16:3e:d6:3f:ea and ether host fa:16:3e:b7:58:81))' -w /tmp/tmp.vGFvS4uNZ5
Jan 26 09:07:22 compute-1 sudo[224027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:07:22 compute-1 sshd-session[224001]: Connection closed by 38.102.83.66 port 43714
Jan 26 09:07:22 compute-1 sshd-session[223998]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:07:22 compute-1 systemd[1]: session-95.scope: Deactivated successfully.
Jan 26 09:07:22 compute-1 systemd-logind[788]: Session 95 logged out. Waiting for processes to exit.
Jan 26 09:07:22 compute-1 systemd-logind[788]: Removed session 95.
Jan 26 09:07:22 compute-1 ovn_controller[95352]: 2026-01-26T09:07:22Z|00295|pinctrl|WARN|Dropped 785 log messages in last 75 seconds (most recently, 24 seconds ago) due to excessive rate
Jan 26 09:07:22 compute-1 ovn_controller[95352]: 2026-01-26T09:07:22Z|00296|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:07:22 compute-1 ovn_controller[95352]: 2026-01-26T09:07:22Z|00297|memory_trim|INFO|Detected inactivity (last active 30026 ms ago): trimming memory
Jan 26 09:07:22 compute-1 nova_compute[183083]: 2026-01-26 09:07:22.947 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:07:22 compute-1 nova_compute[183083]: 2026-01-26 09:07:22.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:07:22 compute-1 nova_compute[183083]: 2026-01-26 09:07:22.951 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 26 09:07:23 compute-1 nova_compute[183083]: 2026-01-26 09:07:23.192 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:07:23 compute-1 nova_compute[183083]: 2026-01-26 09:07:23.968 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:07:23 compute-1 nova_compute[183083]: 2026-01-26 09:07:23.969 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:07:24 compute-1 nova_compute[183083]: 2026-01-26 09:07:24.477 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:07:24 compute-1 podman[224063]: 2026-01-26 09:07:24.854017826 +0000 UTC m=+0.074789580 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 09:07:24 compute-1 podman[224054]: 2026-01-26 09:07:24.858006537 +0000 UTC m=+0.095544319 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 09:07:24 compute-1 podman[224056]: 2026-01-26 09:07:24.858188982 +0000 UTC m=+0.085513419 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible)
Jan 26 09:07:24 compute-1 podman[224053]: 2026-01-26 09:07:24.889091575 +0000 UTC m=+0.132561413 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_managed=true, container_name=ovn_controller)
Jan 26 09:07:24 compute-1 podman[224055]: 2026-01-26 09:07:24.889153127 +0000 UTC m=+0.121330989 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 09:07:25 compute-1 nova_compute[183083]: 2026-01-26 09:07:25.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:07:25 compute-1 nova_compute[183083]: 2026-01-26 09:07:25.982 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:07:25 compute-1 nova_compute[183083]: 2026-01-26 09:07:25.983 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:07:25 compute-1 nova_compute[183083]: 2026-01-26 09:07:25.984 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:07:25 compute-1 nova_compute[183083]: 2026-01-26 09:07:25.984 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:07:26 compute-1 nova_compute[183083]: 2026-01-26 09:07:26.086 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6602f0c7-96fd-4c40-a71d-8909ba310a73/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:07:26 compute-1 nova_compute[183083]: 2026-01-26 09:07:26.174 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6602f0c7-96fd-4c40-a71d-8909ba310a73/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:07:26 compute-1 nova_compute[183083]: 2026-01-26 09:07:26.176 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6602f0c7-96fd-4c40-a71d-8909ba310a73/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:07:26 compute-1 nova_compute[183083]: 2026-01-26 09:07:26.263 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6602f0c7-96fd-4c40-a71d-8909ba310a73/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:07:26 compute-1 nova_compute[183083]: 2026-01-26 09:07:26.454 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:07:26 compute-1 nova_compute[183083]: 2026-01-26 09:07:26.455 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13481MB free_disk=113.06398391723633GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:07:26 compute-1 nova_compute[183083]: 2026-01-26 09:07:26.456 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:07:26 compute-1 nova_compute[183083]: 2026-01-26 09:07:26.456 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:07:26 compute-1 nova_compute[183083]: 2026-01-26 09:07:26.578 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Instance 6602f0c7-96fd-4c40-a71d-8909ba310a73 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 09:07:26 compute-1 nova_compute[183083]: 2026-01-26 09:07:26.579 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:07:26 compute-1 nova_compute[183083]: 2026-01-26 09:07:26.579 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=640MB phys_disk=119GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:07:26 compute-1 nova_compute[183083]: 2026-01-26 09:07:26.687 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:07:26 compute-1 nova_compute[183083]: 2026-01-26 09:07:26.703 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:07:26 compute-1 nova_compute[183083]: 2026-01-26 09:07:26.705 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:07:26 compute-1 nova_compute[183083]: 2026-01-26 09:07:26.706 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.250s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:07:27 compute-1 nova_compute[183083]: 2026-01-26 09:07:27.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:07:28 compute-1 nova_compute[183083]: 2026-01-26 09:07:28.196 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:07:29 compute-1 nova_compute[183083]: 2026-01-26 09:07:29.520 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:07:30 compute-1 sshd-session[224161]: Accepted publickey for zuul from 38.102.83.66 port 44610 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:07:30 compute-1 systemd-logind[788]: New session 96 of user zuul.
Jan 26 09:07:30 compute-1 systemd[1]: Started Session 96 of User zuul.
Jan 26 09:07:30 compute-1 sshd-session[224161]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:07:31 compute-1 sudo[224165]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 cat /tmp/tmp.vGFvS4uNZ5
Jan 26 09:07:31 compute-1 sudo[224165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:07:31 compute-1 sudo[224165]: pam_unix(sudo:session): session closed for user root
Jan 26 09:07:33 compute-1 nova_compute[183083]: 2026-01-26 09:07:33.198 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:07:34 compute-1 nova_compute[183083]: 2026-01-26 09:07:34.523 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:07:35 compute-1 nova_compute[183083]: 2026-01-26 09:07:35.467 183087 DEBUG nova.virt.libvirt.driver [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Check if temp file /var/lib/nova/instances/tmpqw30isvi exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Jan 26 09:07:35 compute-1 nova_compute[183083]: 2026-01-26 09:07:35.467 183087 DEBUG nova.compute.manager [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=114688,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpqw30isvi',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6602f0c7-96fd-4c40-a71d-8909ba310a73',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Jan 26 09:07:37 compute-1 nova_compute[183083]: 2026-01-26 09:07:37.157 183087 DEBUG oslo_concurrency.processutils [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6602f0c7-96fd-4c40-a71d-8909ba310a73/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:07:37 compute-1 nova_compute[183083]: 2026-01-26 09:07:37.251 183087 DEBUG oslo_concurrency.processutils [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6602f0c7-96fd-4c40-a71d-8909ba310a73/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:07:37 compute-1 nova_compute[183083]: 2026-01-26 09:07:37.252 183087 DEBUG oslo_concurrency.processutils [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6602f0c7-96fd-4c40-a71d-8909ba310a73/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:07:37 compute-1 nova_compute[183083]: 2026-01-26 09:07:37.342 183087 DEBUG oslo_concurrency.processutils [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6602f0c7-96fd-4c40-a71d-8909ba310a73/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:07:38 compute-1 nova_compute[183083]: 2026-01-26 09:07:38.199 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:07:39 compute-1 nova_compute[183083]: 2026-01-26 09:07:39.526 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:07:40 compute-1 sshd-session[224197]: Accepted publickey for nova from 192.168.122.100 port 36010 ssh2: ECDSA SHA256:Yk+gMnjvz6H0Gj+fLzXKoFN6HWaghuhSmd0pdmuIFmU
Jan 26 09:07:40 compute-1 systemd-logind[788]: New session 97 of user nova.
Jan 26 09:07:40 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Jan 26 09:07:40 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 26 09:07:40 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 26 09:07:40 compute-1 systemd[1]: Starting User Manager for UID 42436...
Jan 26 09:07:40 compute-1 systemd[224216]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 26 09:07:40 compute-1 podman[224199]: 2026-01-26 09:07:40.37574422 +0000 UTC m=+0.103454550 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 26 09:07:40 compute-1 systemd[224216]: Queued start job for default target Main User Target.
Jan 26 09:07:40 compute-1 systemd[224216]: Created slice User Application Slice.
Jan 26 09:07:40 compute-1 systemd[224216]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 26 09:07:40 compute-1 systemd[224216]: Started Daily Cleanup of User's Temporary Directories.
Jan 26 09:07:40 compute-1 systemd[224216]: Reached target Paths.
Jan 26 09:07:40 compute-1 systemd[224216]: Reached target Timers.
Jan 26 09:07:40 compute-1 systemd[224216]: Starting D-Bus User Message Bus Socket...
Jan 26 09:07:40 compute-1 systemd[224216]: Starting Create User's Volatile Files and Directories...
Jan 26 09:07:40 compute-1 systemd[224216]: Listening on D-Bus User Message Bus Socket.
Jan 26 09:07:40 compute-1 systemd[224216]: Reached target Sockets.
Jan 26 09:07:40 compute-1 systemd[224216]: Finished Create User's Volatile Files and Directories.
Jan 26 09:07:40 compute-1 systemd[224216]: Reached target Basic System.
Jan 26 09:07:40 compute-1 systemd[224216]: Reached target Main User Target.
Jan 26 09:07:40 compute-1 systemd[224216]: Startup finished in 146ms.
Jan 26 09:07:40 compute-1 systemd[1]: Started User Manager for UID 42436.
Jan 26 09:07:40 compute-1 systemd[1]: Started Session 97 of User nova.
Jan 26 09:07:40 compute-1 sshd-session[224197]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 26 09:07:40 compute-1 sshd-session[224240]: Received disconnect from 192.168.122.100 port 36010:11: disconnected by user
Jan 26 09:07:40 compute-1 sshd-session[224240]: Disconnected from user nova 192.168.122.100 port 36010
Jan 26 09:07:40 compute-1 sshd-session[224197]: pam_unix(sshd:session): session closed for user nova
Jan 26 09:07:40 compute-1 systemd[1]: session-97.scope: Deactivated successfully.
Jan 26 09:07:40 compute-1 systemd-logind[788]: Session 97 logged out. Waiting for processes to exit.
Jan 26 09:07:40 compute-1 systemd-logind[788]: Removed session 97.
Jan 26 09:07:41 compute-1 nova_compute[183083]: 2026-01-26 09:07:41.591 183087 DEBUG nova.compute.manager [req-bc6f2926-803a-4b00-8ee5-e3382a57f1aa req-15abf066-e88a-4ffa-8573-0f8672cc1577 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Received event network-vif-unplugged-cea8b6c2-596d-48d7-8c77-8fcc98c0b60b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:07:41 compute-1 nova_compute[183083]: 2026-01-26 09:07:41.592 183087 DEBUG oslo_concurrency.lockutils [req-bc6f2926-803a-4b00-8ee5-e3382a57f1aa req-15abf066-e88a-4ffa-8573-0f8672cc1577 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "6602f0c7-96fd-4c40-a71d-8909ba310a73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:07:41 compute-1 nova_compute[183083]: 2026-01-26 09:07:41.593 183087 DEBUG oslo_concurrency.lockutils [req-bc6f2926-803a-4b00-8ee5-e3382a57f1aa req-15abf066-e88a-4ffa-8573-0f8672cc1577 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "6602f0c7-96fd-4c40-a71d-8909ba310a73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:07:41 compute-1 nova_compute[183083]: 2026-01-26 09:07:41.593 183087 DEBUG oslo_concurrency.lockutils [req-bc6f2926-803a-4b00-8ee5-e3382a57f1aa req-15abf066-e88a-4ffa-8573-0f8672cc1577 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "6602f0c7-96fd-4c40-a71d-8909ba310a73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:07:41 compute-1 nova_compute[183083]: 2026-01-26 09:07:41.594 183087 DEBUG nova.compute.manager [req-bc6f2926-803a-4b00-8ee5-e3382a57f1aa req-15abf066-e88a-4ffa-8573-0f8672cc1577 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] No waiting events found dispatching network-vif-unplugged-cea8b6c2-596d-48d7-8c77-8fcc98c0b60b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 09:07:41 compute-1 nova_compute[183083]: 2026-01-26 09:07:41.595 183087 DEBUG nova.compute.manager [req-bc6f2926-803a-4b00-8ee5-e3382a57f1aa req-15abf066-e88a-4ffa-8573-0f8672cc1577 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Received event network-vif-unplugged-cea8b6c2-596d-48d7-8c77-8fcc98c0b60b for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 26 09:07:42 compute-1 nova_compute[183083]: 2026-01-26 09:07:42.426 183087 INFO nova.compute.manager [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Took 5.08 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.
Jan 26 09:07:42 compute-1 nova_compute[183083]: 2026-01-26 09:07:42.427 183087 DEBUG nova.compute.manager [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 26 09:07:42 compute-1 nova_compute[183083]: 2026-01-26 09:07:42.442 183087 DEBUG nova.compute.manager [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=114688,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpqw30isvi',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6602f0c7-96fd-4c40-a71d-8909ba310a73',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(169c411b-83a0-44b3-972f-3bbc4b44e558),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Jan 26 09:07:42 compute-1 nova_compute[183083]: 2026-01-26 09:07:42.469 183087 DEBUG nova.objects.instance [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lazy-loading 'migration_context' on Instance uuid 6602f0c7-96fd-4c40-a71d-8909ba310a73 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 09:07:42 compute-1 nova_compute[183083]: 2026-01-26 09:07:42.471 183087 DEBUG nova.virt.libvirt.driver [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Jan 26 09:07:42 compute-1 nova_compute[183083]: 2026-01-26 09:07:42.474 183087 DEBUG nova.virt.libvirt.driver [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Jan 26 09:07:42 compute-1 nova_compute[183083]: 2026-01-26 09:07:42.474 183087 DEBUG nova.virt.libvirt.driver [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Jan 26 09:07:42 compute-1 nova_compute[183083]: 2026-01-26 09:07:42.489 183087 DEBUG nova.virt.libvirt.vif [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T09:06:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-server-test-1740506536',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1740506536',id=49,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIFn3fPhxboXbgrEFzlTwh4C8Ll+n3HIgPrVu0g3HSyFNAN7PcvrywcLTWeUOpJlGU28isdDSO43TrUU7yocpXOGs6my4rqjK3p4DmJix6rCv9t8FEPa+QZ5iu6nhBpQaw==',key_name='tempest-keypair-test-421938435',keypairs=<?>,launch_index=0,launched_at=2026-01-26T09:06:10Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='21d2dd4efd74429aab05a84f55aaa4f9',ramdisk_id='',reservation_id='r-j426xds6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_
bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-OvnDvrTest-1505353326',owner_user_name='tempest-OvnDvrTest-1505353326-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T09:06:10Z,user_data=None,user_id='f62d05840e2a48b2a2e3a2c53715fc82',uuid=6602f0c7-96fd-4c40-a71d-8909ba310a73,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cea8b6c2-596d-48d7-8c77-8fcc98c0b60b", "address": "fa:16:3e:fd:4a:3f", "network": {"id": "a8b7bcc5-afbc-4c9d-9487-2698bb9a045f", "bridge": "br-int", "label": "tempest-test-network--1324066625", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapcea8b6c2-59", "ovs_interfaceid": "cea8b6c2-596d-48d7-8c77-8fcc98c0b60b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 26 09:07:42 compute-1 nova_compute[183083]: 2026-01-26 09:07:42.490 183087 DEBUG nova.network.os_vif_util [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Converting VIF {"id": "cea8b6c2-596d-48d7-8c77-8fcc98c0b60b", "address": "fa:16:3e:fd:4a:3f", "network": {"id": "a8b7bcc5-afbc-4c9d-9487-2698bb9a045f", "bridge": "br-int", "label": "tempest-test-network--1324066625", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapcea8b6c2-59", "ovs_interfaceid": "cea8b6c2-596d-48d7-8c77-8fcc98c0b60b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 09:07:42 compute-1 nova_compute[183083]: 2026-01-26 09:07:42.491 183087 DEBUG nova.network.os_vif_util [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fd:4a:3f,bridge_name='br-int',has_traffic_filtering=True,id=cea8b6c2-596d-48d7-8c77-8fcc98c0b60b,network=Network(a8b7bcc5-afbc-4c9d-9487-2698bb9a045f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcea8b6c2-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 09:07:42 compute-1 nova_compute[183083]: 2026-01-26 09:07:42.492 183087 DEBUG nova.virt.libvirt.migration [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Updating guest XML with vif config: <interface type="ethernet">
Jan 26 09:07:42 compute-1 nova_compute[183083]:   <mac address="fa:16:3e:fd:4a:3f"/>
Jan 26 09:07:42 compute-1 nova_compute[183083]:   <model type="virtio"/>
Jan 26 09:07:42 compute-1 nova_compute[183083]:   <driver name="vhost" rx_queue_size="512"/>
Jan 26 09:07:42 compute-1 nova_compute[183083]:   <mtu size="1342"/>
Jan 26 09:07:42 compute-1 nova_compute[183083]:   <target dev="tapcea8b6c2-59"/>
Jan 26 09:07:42 compute-1 nova_compute[183083]: </interface>
Jan 26 09:07:42 compute-1 nova_compute[183083]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Jan 26 09:07:42 compute-1 nova_compute[183083]: 2026-01-26 09:07:42.493 183087 DEBUG nova.virt.libvirt.driver [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Jan 26 09:07:42 compute-1 nova_compute[183083]: 2026-01-26 09:07:42.977 183087 DEBUG nova.virt.libvirt.migration [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 26 09:07:42 compute-1 nova_compute[183083]: 2026-01-26 09:07:42.978 183087 INFO nova.virt.libvirt.migration [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 26 09:07:43 compute-1 nova_compute[183083]: 2026-01-26 09:07:43.046 183087 INFO nova.virt.libvirt.driver [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 26 09:07:43 compute-1 nova_compute[183083]: 2026-01-26 09:07:43.201 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:07:43 compute-1 nova_compute[183083]: 2026-01-26 09:07:43.550 183087 DEBUG nova.virt.libvirt.migration [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 26 09:07:43 compute-1 nova_compute[183083]: 2026-01-26 09:07:43.551 183087 DEBUG nova.virt.libvirt.migration [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 26 09:07:43 compute-1 nova_compute[183083]: 2026-01-26 09:07:43.679 183087 DEBUG nova.compute.manager [req-97be3fc1-eb9b-490f-9125-47ae344f32db req-2020ab86-6349-4705-b791-4751f722a554 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Received event network-vif-plugged-cea8b6c2-596d-48d7-8c77-8fcc98c0b60b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:07:43 compute-1 nova_compute[183083]: 2026-01-26 09:07:43.679 183087 DEBUG oslo_concurrency.lockutils [req-97be3fc1-eb9b-490f-9125-47ae344f32db req-2020ab86-6349-4705-b791-4751f722a554 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "6602f0c7-96fd-4c40-a71d-8909ba310a73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:07:43 compute-1 nova_compute[183083]: 2026-01-26 09:07:43.680 183087 DEBUG oslo_concurrency.lockutils [req-97be3fc1-eb9b-490f-9125-47ae344f32db req-2020ab86-6349-4705-b791-4751f722a554 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "6602f0c7-96fd-4c40-a71d-8909ba310a73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:07:43 compute-1 nova_compute[183083]: 2026-01-26 09:07:43.680 183087 DEBUG oslo_concurrency.lockutils [req-97be3fc1-eb9b-490f-9125-47ae344f32db req-2020ab86-6349-4705-b791-4751f722a554 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "6602f0c7-96fd-4c40-a71d-8909ba310a73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:07:43 compute-1 nova_compute[183083]: 2026-01-26 09:07:43.681 183087 DEBUG nova.compute.manager [req-97be3fc1-eb9b-490f-9125-47ae344f32db req-2020ab86-6349-4705-b791-4751f722a554 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] No waiting events found dispatching network-vif-plugged-cea8b6c2-596d-48d7-8c77-8fcc98c0b60b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 09:07:43 compute-1 nova_compute[183083]: 2026-01-26 09:07:43.681 183087 WARNING nova.compute.manager [req-97be3fc1-eb9b-490f-9125-47ae344f32db req-2020ab86-6349-4705-b791-4751f722a554 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Received unexpected event network-vif-plugged-cea8b6c2-596d-48d7-8c77-8fcc98c0b60b for instance with vm_state active and task_state migrating.
Jan 26 09:07:43 compute-1 nova_compute[183083]: 2026-01-26 09:07:43.681 183087 DEBUG nova.compute.manager [req-97be3fc1-eb9b-490f-9125-47ae344f32db req-2020ab86-6349-4705-b791-4751f722a554 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Received event network-changed-cea8b6c2-596d-48d7-8c77-8fcc98c0b60b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:07:43 compute-1 nova_compute[183083]: 2026-01-26 09:07:43.682 183087 DEBUG nova.compute.manager [req-97be3fc1-eb9b-490f-9125-47ae344f32db req-2020ab86-6349-4705-b791-4751f722a554 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Refreshing instance network info cache due to event network-changed-cea8b6c2-596d-48d7-8c77-8fcc98c0b60b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 09:07:43 compute-1 nova_compute[183083]: 2026-01-26 09:07:43.682 183087 DEBUG oslo_concurrency.lockutils [req-97be3fc1-eb9b-490f-9125-47ae344f32db req-2020ab86-6349-4705-b791-4751f722a554 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-6602f0c7-96fd-4c40-a71d-8909ba310a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:07:43 compute-1 nova_compute[183083]: 2026-01-26 09:07:43.683 183087 DEBUG oslo_concurrency.lockutils [req-97be3fc1-eb9b-490f-9125-47ae344f32db req-2020ab86-6349-4705-b791-4751f722a554 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-6602f0c7-96fd-4c40-a71d-8909ba310a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:07:43 compute-1 nova_compute[183083]: 2026-01-26 09:07:43.683 183087 DEBUG nova.network.neutron [req-97be3fc1-eb9b-490f-9125-47ae344f32db req-2020ab86-6349-4705-b791-4751f722a554 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Refreshing network info cache for port cea8b6c2-596d-48d7-8c77-8fcc98c0b60b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 09:07:44 compute-1 nova_compute[183083]: 2026-01-26 09:07:44.055 183087 DEBUG nova.virt.libvirt.migration [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 26 09:07:44 compute-1 nova_compute[183083]: 2026-01-26 09:07:44.055 183087 DEBUG nova.virt.libvirt.migration [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 26 09:07:44 compute-1 nova_compute[183083]: 2026-01-26 09:07:44.324 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769418464.3234963, 6602f0c7-96fd-4c40-a71d-8909ba310a73 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 09:07:44 compute-1 nova_compute[183083]: 2026-01-26 09:07:44.324 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] VM Paused (Lifecycle Event)
Jan 26 09:07:44 compute-1 nova_compute[183083]: 2026-01-26 09:07:44.408 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:07:44 compute-1 nova_compute[183083]: 2026-01-26 09:07:44.414 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 09:07:44 compute-1 nova_compute[183083]: 2026-01-26 09:07:44.441 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] During sync_power_state the instance has a pending task (migrating). Skip.
Jan 26 09:07:44 compute-1 kernel: tapcea8b6c2-59 (unregistering): left promiscuous mode
Jan 26 09:07:44 compute-1 NetworkManager[55451]: <info>  [1769418464.4955] device (tapcea8b6c2-59): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 09:07:44 compute-1 ovn_controller[95352]: 2026-01-26T09:07:44Z|00298|binding|INFO|Releasing lport cea8b6c2-596d-48d7-8c77-8fcc98c0b60b from this chassis (sb_readonly=0)
Jan 26 09:07:44 compute-1 ovn_controller[95352]: 2026-01-26T09:07:44Z|00299|binding|INFO|Setting lport cea8b6c2-596d-48d7-8c77-8fcc98c0b60b down in Southbound
Jan 26 09:07:44 compute-1 nova_compute[183083]: 2026-01-26 09:07:44.502 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:07:44 compute-1 ovn_controller[95352]: 2026-01-26T09:07:44Z|00300|binding|INFO|Removing iface tapcea8b6c2-59 ovn-installed in OVS
Jan 26 09:07:44 compute-1 nova_compute[183083]: 2026-01-26 09:07:44.504 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:07:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:07:44.513 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:4a:3f 10.100.0.45'], port_security=['fa:16:3e:fd:4a:3f 10.100.0.45'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'b62a0d99-b340-4a52-961e-b6a31b1ea8c8'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.45/28', 'neutron:device_id': '6602f0c7-96fd-4c40-a71d-8909ba310a73', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a8b7bcc5-afbc-4c9d-9487-2698bb9a045f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '21d2dd4efd74429aab05a84f55aaa4f9', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'c736fd14-de19-447f-bed9-105e0da8c605', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.186'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=79225713-54eb-4122-8bc5-04325a60d249, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=cea8b6c2-596d-48d7-8c77-8fcc98c0b60b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:07:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:07:44.516 104632 INFO neutron.agent.ovn.metadata.agent [-] Port cea8b6c2-596d-48d7-8c77-8fcc98c0b60b in datapath a8b7bcc5-afbc-4c9d-9487-2698bb9a045f unbound from our chassis
Jan 26 09:07:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:07:44.519 104632 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a8b7bcc5-afbc-4c9d-9487-2698bb9a045f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 26 09:07:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:07:44.520 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[dfabc959-ff86-4f9b-a783-9ddbf57c6ed8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:07:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:07:44.521 104632 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a8b7bcc5-afbc-4c9d-9487-2698bb9a045f namespace which is not needed anymore
Jan 26 09:07:44 compute-1 nova_compute[183083]: 2026-01-26 09:07:44.528 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:07:44 compute-1 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000031.scope: Deactivated successfully.
Jan 26 09:07:44 compute-1 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000031.scope: Consumed 16.426s CPU time.
Jan 26 09:07:44 compute-1 systemd-machined[154360]: Machine qemu-18-instance-00000031 terminated.
Jan 26 09:07:44 compute-1 neutron-haproxy-ovnmeta-a8b7bcc5-afbc-4c9d-9487-2698bb9a045f[223515]: [NOTICE]   (223519) : haproxy version is 2.8.14-c23fe91
Jan 26 09:07:44 compute-1 neutron-haproxy-ovnmeta-a8b7bcc5-afbc-4c9d-9487-2698bb9a045f[223515]: [NOTICE]   (223519) : path to executable is /usr/sbin/haproxy
Jan 26 09:07:44 compute-1 neutron-haproxy-ovnmeta-a8b7bcc5-afbc-4c9d-9487-2698bb9a045f[223515]: [WARNING]  (223519) : Exiting Master process...
Jan 26 09:07:44 compute-1 neutron-haproxy-ovnmeta-a8b7bcc5-afbc-4c9d-9487-2698bb9a045f[223515]: [WARNING]  (223519) : Exiting Master process...
Jan 26 09:07:44 compute-1 neutron-haproxy-ovnmeta-a8b7bcc5-afbc-4c9d-9487-2698bb9a045f[223515]: [ALERT]    (223519) : Current worker (223521) exited with code 143 (Terminated)
Jan 26 09:07:44 compute-1 neutron-haproxy-ovnmeta-a8b7bcc5-afbc-4c9d-9487-2698bb9a045f[223515]: [WARNING]  (223519) : All workers exited. Exiting... (0)
Jan 26 09:07:44 compute-1 systemd[1]: libpod-f8589881e79ac2638ea762377ad9e3ccdd9fd7e501202de7652c3c55ab07b9d5.scope: Deactivated successfully.
Jan 26 09:07:44 compute-1 podman[224286]: 2026-01-26 09:07:44.682109953 +0000 UTC m=+0.053413343 container died f8589881e79ac2638ea762377ad9e3ccdd9fd7e501202de7652c3c55ab07b9d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a8b7bcc5-afbc-4c9d-9487-2698bb9a045f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 26 09:07:44 compute-1 kernel: tapcea8b6c2-59: entered promiscuous mode
Jan 26 09:07:44 compute-1 NetworkManager[55451]: <info>  [1769418464.6972] manager: (tapcea8b6c2-59): new Tun device (/org/freedesktop/NetworkManager/Devices/104)
Jan 26 09:07:44 compute-1 kernel: tapcea8b6c2-59 (unregistering): left promiscuous mode
Jan 26 09:07:44 compute-1 ovn_controller[95352]: 2026-01-26T09:07:44Z|00301|binding|INFO|Claiming lport cea8b6c2-596d-48d7-8c77-8fcc98c0b60b for this chassis.
Jan 26 09:07:44 compute-1 ovn_controller[95352]: 2026-01-26T09:07:44Z|00302|binding|INFO|cea8b6c2-596d-48d7-8c77-8fcc98c0b60b: Claiming fa:16:3e:fd:4a:3f 10.100.0.45
Jan 26 09:07:44 compute-1 nova_compute[183083]: 2026-01-26 09:07:44.703 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:07:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:07:44.713 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:4a:3f 10.100.0.45'], port_security=['fa:16:3e:fd:4a:3f 10.100.0.45'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'b62a0d99-b340-4a52-961e-b6a31b1ea8c8'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.45/28', 'neutron:device_id': '6602f0c7-96fd-4c40-a71d-8909ba310a73', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a8b7bcc5-afbc-4c9d-9487-2698bb9a045f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '21d2dd4efd74429aab05a84f55aaa4f9', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'c736fd14-de19-447f-bed9-105e0da8c605', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.186'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=79225713-54eb-4122-8bc5-04325a60d249, chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=cea8b6c2-596d-48d7-8c77-8fcc98c0b60b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:07:44 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f8589881e79ac2638ea762377ad9e3ccdd9fd7e501202de7652c3c55ab07b9d5-userdata-shm.mount: Deactivated successfully.
Jan 26 09:07:44 compute-1 nova_compute[183083]: 2026-01-26 09:07:44.741 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:07:44 compute-1 ovn_controller[95352]: 2026-01-26T09:07:44Z|00303|binding|INFO|Releasing lport cea8b6c2-596d-48d7-8c77-8fcc98c0b60b from this chassis (sb_readonly=0)
Jan 26 09:07:44 compute-1 systemd[1]: var-lib-containers-storage-overlay-8920cdb179c2717e5becd652e582fdd6cef4af4bcf85467454965ccbb5636459-merged.mount: Deactivated successfully.
Jan 26 09:07:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:07:44.760 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:4a:3f 10.100.0.45'], port_security=['fa:16:3e:fd:4a:3f 10.100.0.45'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'b62a0d99-b340-4a52-961e-b6a31b1ea8c8'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.45/28', 'neutron:device_id': '6602f0c7-96fd-4c40-a71d-8909ba310a73', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a8b7bcc5-afbc-4c9d-9487-2698bb9a045f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '21d2dd4efd74429aab05a84f55aaa4f9', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'c736fd14-de19-447f-bed9-105e0da8c605', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.186'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=79225713-54eb-4122-8bc5-04325a60d249, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=cea8b6c2-596d-48d7-8c77-8fcc98c0b60b) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:07:44 compute-1 podman[224286]: 2026-01-26 09:07:44.770192353 +0000 UTC m=+0.141495743 container cleanup f8589881e79ac2638ea762377ad9e3ccdd9fd7e501202de7652c3c55ab07b9d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a8b7bcc5-afbc-4c9d-9487-2698bb9a045f, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 26 09:07:44 compute-1 systemd[1]: libpod-conmon-f8589881e79ac2638ea762377ad9e3ccdd9fd7e501202de7652c3c55ab07b9d5.scope: Deactivated successfully.
Jan 26 09:07:44 compute-1 nova_compute[183083]: 2026-01-26 09:07:44.781 183087 DEBUG nova.virt.libvirt.guest [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Jan 26 09:07:44 compute-1 nova_compute[183083]: 2026-01-26 09:07:44.782 183087 INFO nova.virt.libvirt.driver [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Migration operation has completed
Jan 26 09:07:44 compute-1 nova_compute[183083]: 2026-01-26 09:07:44.782 183087 INFO nova.compute.manager [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] _post_live_migration() is started..
Jan 26 09:07:44 compute-1 nova_compute[183083]: 2026-01-26 09:07:44.787 183087 DEBUG nova.virt.libvirt.driver [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Jan 26 09:07:44 compute-1 nova_compute[183083]: 2026-01-26 09:07:44.787 183087 DEBUG nova.virt.libvirt.driver [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Jan 26 09:07:44 compute-1 nova_compute[183083]: 2026-01-26 09:07:44.787 183087 DEBUG nova.virt.libvirt.driver [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Jan 26 09:07:44 compute-1 podman[224333]: 2026-01-26 09:07:44.844455917 +0000 UTC m=+0.045745449 container remove f8589881e79ac2638ea762377ad9e3ccdd9fd7e501202de7652c3c55ab07b9d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a8b7bcc5-afbc-4c9d-9487-2698bb9a045f, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 26 09:07:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:07:44.850 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[85c458de-7cb6-4212-a769-40f8443317b2]: (4, ('Mon Jan 26 09:07:44 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a8b7bcc5-afbc-4c9d-9487-2698bb9a045f (f8589881e79ac2638ea762377ad9e3ccdd9fd7e501202de7652c3c55ab07b9d5)\nf8589881e79ac2638ea762377ad9e3ccdd9fd7e501202de7652c3c55ab07b9d5\nMon Jan 26 09:07:44 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a8b7bcc5-afbc-4c9d-9487-2698bb9a045f (f8589881e79ac2638ea762377ad9e3ccdd9fd7e501202de7652c3c55ab07b9d5)\nf8589881e79ac2638ea762377ad9e3ccdd9fd7e501202de7652c3c55ab07b9d5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:07:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:07:44.852 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[e7d5ef22-a392-432c-a760-8a578c377308]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:07:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:07:44.854 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa8b7bcc5-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:07:44 compute-1 nova_compute[183083]: 2026-01-26 09:07:44.856 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:07:44 compute-1 kernel: tapa8b7bcc5-a0: left promiscuous mode
Jan 26 09:07:44 compute-1 nova_compute[183083]: 2026-01-26 09:07:44.871 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:07:44 compute-1 nova_compute[183083]: 2026-01-26 09:07:44.872 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:07:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:07:44.874 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[e84738be-c732-4aee-a890-d9e8b250def0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:07:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:07:44.893 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[f387cbae-5ec5-4c06-a885-0de0709249de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:07:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:07:44.894 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[036db341-9f92-497d-b723-11ae265ed73b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:07:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:07:44.917 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[d6ca60c1-98a2-4dc6-9efa-9200d019e868]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 471099, 'reachable_time': 40445, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224351, 'error': None, 'target': 'ovnmeta-a8b7bcc5-afbc-4c9d-9487-2698bb9a045f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:07:44 compute-1 systemd[1]: run-netns-ovnmeta\x2da8b7bcc5\x2dafbc\x2d4c9d\x2d9487\x2d2698bb9a045f.mount: Deactivated successfully.
Jan 26 09:07:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:07:44.922 105024 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a8b7bcc5-afbc-4c9d-9487-2698bb9a045f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 26 09:07:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:07:44.922 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[2da26ce7-b58f-4292-a2dc-70fbdb906aaa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:07:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:07:44.923 104632 INFO neutron.agent.ovn.metadata.agent [-] Port cea8b6c2-596d-48d7-8c77-8fcc98c0b60b in datapath a8b7bcc5-afbc-4c9d-9487-2698bb9a045f unbound from our chassis
Jan 26 09:07:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:07:44.924 104632 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a8b7bcc5-afbc-4c9d-9487-2698bb9a045f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 26 09:07:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:07:44.925 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[a8258820-0039-4abe-aae3-eb95f01c7c6d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:07:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:07:44.926 104632 INFO neutron.agent.ovn.metadata.agent [-] Port cea8b6c2-596d-48d7-8c77-8fcc98c0b60b in datapath a8b7bcc5-afbc-4c9d-9487-2698bb9a045f unbound from our chassis
Jan 26 09:07:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:07:44.927 104632 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a8b7bcc5-afbc-4c9d-9487-2698bb9a045f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 26 09:07:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:07:44.928 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[2d3189db-6f05-469c-8a28-d0c6f5e86f25]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:07:45 compute-1 nova_compute[183083]: 2026-01-26 09:07:45.790 183087 DEBUG nova.compute.manager [req-76ed74e0-16f3-448b-8bfc-69a754a4bdf7 req-7011212c-8c29-452e-b7ac-3dc12f3545ba 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Received event network-vif-unplugged-cea8b6c2-596d-48d7-8c77-8fcc98c0b60b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:07:45 compute-1 nova_compute[183083]: 2026-01-26 09:07:45.791 183087 DEBUG oslo_concurrency.lockutils [req-76ed74e0-16f3-448b-8bfc-69a754a4bdf7 req-7011212c-8c29-452e-b7ac-3dc12f3545ba 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "6602f0c7-96fd-4c40-a71d-8909ba310a73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:07:45 compute-1 nova_compute[183083]: 2026-01-26 09:07:45.791 183087 DEBUG oslo_concurrency.lockutils [req-76ed74e0-16f3-448b-8bfc-69a754a4bdf7 req-7011212c-8c29-452e-b7ac-3dc12f3545ba 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "6602f0c7-96fd-4c40-a71d-8909ba310a73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:07:45 compute-1 nova_compute[183083]: 2026-01-26 09:07:45.792 183087 DEBUG oslo_concurrency.lockutils [req-76ed74e0-16f3-448b-8bfc-69a754a4bdf7 req-7011212c-8c29-452e-b7ac-3dc12f3545ba 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "6602f0c7-96fd-4c40-a71d-8909ba310a73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:07:45 compute-1 nova_compute[183083]: 2026-01-26 09:07:45.792 183087 DEBUG nova.compute.manager [req-76ed74e0-16f3-448b-8bfc-69a754a4bdf7 req-7011212c-8c29-452e-b7ac-3dc12f3545ba 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] No waiting events found dispatching network-vif-unplugged-cea8b6c2-596d-48d7-8c77-8fcc98c0b60b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 09:07:45 compute-1 nova_compute[183083]: 2026-01-26 09:07:45.792 183087 DEBUG nova.compute.manager [req-76ed74e0-16f3-448b-8bfc-69a754a4bdf7 req-7011212c-8c29-452e-b7ac-3dc12f3545ba 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Received event network-vif-unplugged-cea8b6c2-596d-48d7-8c77-8fcc98c0b60b for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 26 09:07:45 compute-1 nova_compute[183083]: 2026-01-26 09:07:45.898 183087 DEBUG nova.network.neutron [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Activated binding for port cea8b6c2-596d-48d7-8c77-8fcc98c0b60b and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Jan 26 09:07:45 compute-1 nova_compute[183083]: 2026-01-26 09:07:45.899 183087 DEBUG nova.compute.manager [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "cea8b6c2-596d-48d7-8c77-8fcc98c0b60b", "address": "fa:16:3e:fd:4a:3f", "network": {"id": "a8b7bcc5-afbc-4c9d-9487-2698bb9a045f", "bridge": "br-int", "label": "tempest-test-network--1324066625", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcea8b6c2-59", "ovs_interfaceid": "cea8b6c2-596d-48d7-8c77-8fcc98c0b60b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Jan 26 09:07:45 compute-1 nova_compute[183083]: 2026-01-26 09:07:45.900 183087 DEBUG nova.virt.libvirt.vif [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T09:06:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-server-test-1740506536',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1740506536',id=49,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIFn3fPhxboXbgrEFzlTwh4C8Ll+n3HIgPrVu0g3HSyFNAN7PcvrywcLTWeUOpJlGU28isdDSO43TrUU7yocpXOGs6my4rqjK3p4DmJix6rCv9t8FEPa+QZ5iu6nhBpQaw==',key_name='tempest-keypair-test-421938435',keypairs=<?>,launch_index=0,launched_at=2026-01-26T09:06:10Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='21d2dd4efd74429aab05a84f55aaa4f9',ramdisk_id='',reservation_id='r-j426xds6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_
bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-OvnDvrTest-1505353326',owner_user_name='tempest-OvnDvrTest-1505353326-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T09:07:32Z,user_data=None,user_id='f62d05840e2a48b2a2e3a2c53715fc82',uuid=6602f0c7-96fd-4c40-a71d-8909ba310a73,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cea8b6c2-596d-48d7-8c77-8fcc98c0b60b", "address": "fa:16:3e:fd:4a:3f", "network": {"id": "a8b7bcc5-afbc-4c9d-9487-2698bb9a045f", "bridge": "br-int", "label": "tempest-test-network--1324066625", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcea8b6c2-59", "ovs_interfaceid": "cea8b6c2-596d-48d7-8c77-8fcc98c0b60b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 09:07:45 compute-1 nova_compute[183083]: 2026-01-26 09:07:45.901 183087 DEBUG nova.network.os_vif_util [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Converting VIF {"id": "cea8b6c2-596d-48d7-8c77-8fcc98c0b60b", "address": "fa:16:3e:fd:4a:3f", "network": {"id": "a8b7bcc5-afbc-4c9d-9487-2698bb9a045f", "bridge": "br-int", "label": "tempest-test-network--1324066625", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcea8b6c2-59", "ovs_interfaceid": "cea8b6c2-596d-48d7-8c77-8fcc98c0b60b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 09:07:45 compute-1 nova_compute[183083]: 2026-01-26 09:07:45.901 183087 DEBUG nova.network.os_vif_util [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fd:4a:3f,bridge_name='br-int',has_traffic_filtering=True,id=cea8b6c2-596d-48d7-8c77-8fcc98c0b60b,network=Network(a8b7bcc5-afbc-4c9d-9487-2698bb9a045f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcea8b6c2-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 09:07:45 compute-1 nova_compute[183083]: 2026-01-26 09:07:45.902 183087 DEBUG os_vif [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:fd:4a:3f,bridge_name='br-int',has_traffic_filtering=True,id=cea8b6c2-596d-48d7-8c77-8fcc98c0b60b,network=Network(a8b7bcc5-afbc-4c9d-9487-2698bb9a045f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcea8b6c2-59') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 09:07:45 compute-1 nova_compute[183083]: 2026-01-26 09:07:45.904 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:07:45 compute-1 nova_compute[183083]: 2026-01-26 09:07:45.904 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcea8b6c2-59, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:07:45 compute-1 nova_compute[183083]: 2026-01-26 09:07:45.906 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:07:45 compute-1 nova_compute[183083]: 2026-01-26 09:07:45.908 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:07:45 compute-1 nova_compute[183083]: 2026-01-26 09:07:45.910 183087 INFO os_vif [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fd:4a:3f,bridge_name='br-int',has_traffic_filtering=True,id=cea8b6c2-596d-48d7-8c77-8fcc98c0b60b,network=Network(a8b7bcc5-afbc-4c9d-9487-2698bb9a045f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcea8b6c2-59')
Jan 26 09:07:45 compute-1 nova_compute[183083]: 2026-01-26 09:07:45.911 183087 DEBUG oslo_concurrency.lockutils [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:07:45 compute-1 nova_compute[183083]: 2026-01-26 09:07:45.911 183087 DEBUG oslo_concurrency.lockutils [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:07:45 compute-1 nova_compute[183083]: 2026-01-26 09:07:45.912 183087 DEBUG oslo_concurrency.lockutils [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:07:45 compute-1 nova_compute[183083]: 2026-01-26 09:07:45.912 183087 DEBUG nova.compute.manager [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Jan 26 09:07:45 compute-1 nova_compute[183083]: 2026-01-26 09:07:45.913 183087 INFO nova.virt.libvirt.driver [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Deleting instance files /var/lib/nova/instances/6602f0c7-96fd-4c40-a71d-8909ba310a73_del
Jan 26 09:07:45 compute-1 nova_compute[183083]: 2026-01-26 09:07:45.914 183087 INFO nova.virt.libvirt.driver [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Deletion of /var/lib/nova/instances/6602f0c7-96fd-4c40-a71d-8909ba310a73_del complete
Jan 26 09:07:46 compute-1 nova_compute[183083]: 2026-01-26 09:07:46.254 183087 DEBUG nova.network.neutron [req-97be3fc1-eb9b-490f-9125-47ae344f32db req-2020ab86-6349-4705-b791-4751f722a554 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Updated VIF entry in instance network info cache for port cea8b6c2-596d-48d7-8c77-8fcc98c0b60b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 09:07:46 compute-1 nova_compute[183083]: 2026-01-26 09:07:46.255 183087 DEBUG nova.network.neutron [req-97be3fc1-eb9b-490f-9125-47ae344f32db req-2020ab86-6349-4705-b791-4751f722a554 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Updating instance_info_cache with network_info: [{"id": "cea8b6c2-596d-48d7-8c77-8fcc98c0b60b", "address": "fa:16:3e:fd:4a:3f", "network": {"id": "a8b7bcc5-afbc-4c9d-9487-2698bb9a045f", "bridge": "br-int", "label": "tempest-test-network--1324066625", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcea8b6c2-59", "ovs_interfaceid": "cea8b6c2-596d-48d7-8c77-8fcc98c0b60b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:07:46 compute-1 nova_compute[183083]: 2026-01-26 09:07:46.278 183087 DEBUG oslo_concurrency.lockutils [req-97be3fc1-eb9b-490f-9125-47ae344f32db req-2020ab86-6349-4705-b791-4751f722a554 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-6602f0c7-96fd-4c40-a71d-8909ba310a73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:07:47 compute-1 nova_compute[183083]: 2026-01-26 09:07:47.875 183087 DEBUG nova.compute.manager [req-ca0b0324-0399-4ef6-b03a-34495d9175da req-0f38a58b-82db-4218-aa15-85c555e17171 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Received event network-vif-plugged-cea8b6c2-596d-48d7-8c77-8fcc98c0b60b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:07:47 compute-1 nova_compute[183083]: 2026-01-26 09:07:47.875 183087 DEBUG oslo_concurrency.lockutils [req-ca0b0324-0399-4ef6-b03a-34495d9175da req-0f38a58b-82db-4218-aa15-85c555e17171 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "6602f0c7-96fd-4c40-a71d-8909ba310a73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:07:47 compute-1 nova_compute[183083]: 2026-01-26 09:07:47.876 183087 DEBUG oslo_concurrency.lockutils [req-ca0b0324-0399-4ef6-b03a-34495d9175da req-0f38a58b-82db-4218-aa15-85c555e17171 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "6602f0c7-96fd-4c40-a71d-8909ba310a73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:07:47 compute-1 nova_compute[183083]: 2026-01-26 09:07:47.877 183087 DEBUG oslo_concurrency.lockutils [req-ca0b0324-0399-4ef6-b03a-34495d9175da req-0f38a58b-82db-4218-aa15-85c555e17171 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "6602f0c7-96fd-4c40-a71d-8909ba310a73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:07:47 compute-1 nova_compute[183083]: 2026-01-26 09:07:47.877 183087 DEBUG nova.compute.manager [req-ca0b0324-0399-4ef6-b03a-34495d9175da req-0f38a58b-82db-4218-aa15-85c555e17171 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] No waiting events found dispatching network-vif-plugged-cea8b6c2-596d-48d7-8c77-8fcc98c0b60b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 09:07:47 compute-1 nova_compute[183083]: 2026-01-26 09:07:47.877 183087 WARNING nova.compute.manager [req-ca0b0324-0399-4ef6-b03a-34495d9175da req-0f38a58b-82db-4218-aa15-85c555e17171 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Received unexpected event network-vif-plugged-cea8b6c2-596d-48d7-8c77-8fcc98c0b60b for instance with vm_state active and task_state migrating.
Jan 26 09:07:47 compute-1 nova_compute[183083]: 2026-01-26 09:07:47.878 183087 DEBUG nova.compute.manager [req-ca0b0324-0399-4ef6-b03a-34495d9175da req-0f38a58b-82db-4218-aa15-85c555e17171 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Received event network-vif-plugged-cea8b6c2-596d-48d7-8c77-8fcc98c0b60b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:07:47 compute-1 nova_compute[183083]: 2026-01-26 09:07:47.878 183087 DEBUG oslo_concurrency.lockutils [req-ca0b0324-0399-4ef6-b03a-34495d9175da req-0f38a58b-82db-4218-aa15-85c555e17171 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "6602f0c7-96fd-4c40-a71d-8909ba310a73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:07:47 compute-1 nova_compute[183083]: 2026-01-26 09:07:47.879 183087 DEBUG oslo_concurrency.lockutils [req-ca0b0324-0399-4ef6-b03a-34495d9175da req-0f38a58b-82db-4218-aa15-85c555e17171 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "6602f0c7-96fd-4c40-a71d-8909ba310a73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:07:47 compute-1 nova_compute[183083]: 2026-01-26 09:07:47.879 183087 DEBUG oslo_concurrency.lockutils [req-ca0b0324-0399-4ef6-b03a-34495d9175da req-0f38a58b-82db-4218-aa15-85c555e17171 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "6602f0c7-96fd-4c40-a71d-8909ba310a73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:07:47 compute-1 nova_compute[183083]: 2026-01-26 09:07:47.879 183087 DEBUG nova.compute.manager [req-ca0b0324-0399-4ef6-b03a-34495d9175da req-0f38a58b-82db-4218-aa15-85c555e17171 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] No waiting events found dispatching network-vif-plugged-cea8b6c2-596d-48d7-8c77-8fcc98c0b60b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 09:07:47 compute-1 nova_compute[183083]: 2026-01-26 09:07:47.880 183087 WARNING nova.compute.manager [req-ca0b0324-0399-4ef6-b03a-34495d9175da req-0f38a58b-82db-4218-aa15-85c555e17171 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Received unexpected event network-vif-plugged-cea8b6c2-596d-48d7-8c77-8fcc98c0b60b for instance with vm_state active and task_state migrating.
Jan 26 09:07:48 compute-1 nova_compute[183083]: 2026-01-26 09:07:48.203 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:07:50 compute-1 nova_compute[183083]: 2026-01-26 09:07:50.173 183087 DEBUG nova.compute.manager [req-b6065be2-71d0-4dad-a112-3dae4a4d1bfb req-5b159376-bdca-4331-8900-13984847dbb7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Received event network-vif-plugged-cea8b6c2-596d-48d7-8c77-8fcc98c0b60b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:07:50 compute-1 nova_compute[183083]: 2026-01-26 09:07:50.173 183087 DEBUG oslo_concurrency.lockutils [req-b6065be2-71d0-4dad-a112-3dae4a4d1bfb req-5b159376-bdca-4331-8900-13984847dbb7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "6602f0c7-96fd-4c40-a71d-8909ba310a73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:07:50 compute-1 nova_compute[183083]: 2026-01-26 09:07:50.173 183087 DEBUG oslo_concurrency.lockutils [req-b6065be2-71d0-4dad-a112-3dae4a4d1bfb req-5b159376-bdca-4331-8900-13984847dbb7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "6602f0c7-96fd-4c40-a71d-8909ba310a73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:07:50 compute-1 nova_compute[183083]: 2026-01-26 09:07:50.174 183087 DEBUG oslo_concurrency.lockutils [req-b6065be2-71d0-4dad-a112-3dae4a4d1bfb req-5b159376-bdca-4331-8900-13984847dbb7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "6602f0c7-96fd-4c40-a71d-8909ba310a73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:07:50 compute-1 nova_compute[183083]: 2026-01-26 09:07:50.174 183087 DEBUG nova.compute.manager [req-b6065be2-71d0-4dad-a112-3dae4a4d1bfb req-5b159376-bdca-4331-8900-13984847dbb7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] No waiting events found dispatching network-vif-plugged-cea8b6c2-596d-48d7-8c77-8fcc98c0b60b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 09:07:50 compute-1 nova_compute[183083]: 2026-01-26 09:07:50.174 183087 WARNING nova.compute.manager [req-b6065be2-71d0-4dad-a112-3dae4a4d1bfb req-5b159376-bdca-4331-8900-13984847dbb7 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Received unexpected event network-vif-plugged-cea8b6c2-596d-48d7-8c77-8fcc98c0b60b for instance with vm_state active and task_state migrating.
Jan 26 09:07:50 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Jan 26 09:07:50 compute-1 systemd[224216]: Activating special unit Exit the Session...
Jan 26 09:07:50 compute-1 systemd[224216]: Stopped target Main User Target.
Jan 26 09:07:50 compute-1 systemd[224216]: Stopped target Basic System.
Jan 26 09:07:50 compute-1 systemd[224216]: Stopped target Paths.
Jan 26 09:07:50 compute-1 systemd[224216]: Stopped target Sockets.
Jan 26 09:07:50 compute-1 systemd[224216]: Stopped target Timers.
Jan 26 09:07:50 compute-1 systemd[224216]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 26 09:07:50 compute-1 systemd[224216]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 26 09:07:50 compute-1 systemd[224216]: Closed D-Bus User Message Bus Socket.
Jan 26 09:07:50 compute-1 systemd[224216]: Stopped Create User's Volatile Files and Directories.
Jan 26 09:07:50 compute-1 systemd[224216]: Removed slice User Application Slice.
Jan 26 09:07:50 compute-1 systemd[224216]: Reached target Shutdown.
Jan 26 09:07:50 compute-1 systemd[224216]: Finished Exit the Session.
Jan 26 09:07:50 compute-1 systemd[224216]: Reached target Exit the Session.
Jan 26 09:07:50 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Jan 26 09:07:50 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Jan 26 09:07:50 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 26 09:07:50 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 26 09:07:50 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 26 09:07:50 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 26 09:07:50 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Jan 26 09:07:50 compute-1 nova_compute[183083]: 2026-01-26 09:07:50.907 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:07:51 compute-1 nova_compute[183083]: 2026-01-26 09:07:51.639 183087 DEBUG oslo_concurrency.lockutils [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "6602f0c7-96fd-4c40-a71d-8909ba310a73-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:07:51 compute-1 nova_compute[183083]: 2026-01-26 09:07:51.639 183087 DEBUG oslo_concurrency.lockutils [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "6602f0c7-96fd-4c40-a71d-8909ba310a73-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:07:51 compute-1 nova_compute[183083]: 2026-01-26 09:07:51.640 183087 DEBUG oslo_concurrency.lockutils [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "6602f0c7-96fd-4c40-a71d-8909ba310a73-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:07:51 compute-1 nova_compute[183083]: 2026-01-26 09:07:51.664 183087 DEBUG oslo_concurrency.lockutils [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:07:51 compute-1 nova_compute[183083]: 2026-01-26 09:07:51.665 183087 DEBUG oslo_concurrency.lockutils [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:07:51 compute-1 nova_compute[183083]: 2026-01-26 09:07:51.665 183087 DEBUG oslo_concurrency.lockutils [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:07:51 compute-1 nova_compute[183083]: 2026-01-26 09:07:51.665 183087 DEBUG nova.compute.resource_tracker [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:07:51 compute-1 nova_compute[183083]: 2026-01-26 09:07:51.862 183087 WARNING nova.virt.libvirt.driver [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:07:51 compute-1 nova_compute[183083]: 2026-01-26 09:07:51.863 183087 DEBUG nova.compute.resource_tracker [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13639MB free_disk=113.09348678588867GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": 
"0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:07:51 compute-1 nova_compute[183083]: 2026-01-26 09:07:51.864 183087 DEBUG oslo_concurrency.lockutils [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:07:51 compute-1 nova_compute[183083]: 2026-01-26 09:07:51.864 183087 DEBUG oslo_concurrency.lockutils [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:07:51 compute-1 nova_compute[183083]: 2026-01-26 09:07:51.910 183087 DEBUG nova.compute.resource_tracker [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Migration for instance 6602f0c7-96fd-4c40-a71d-8909ba310a73 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 26 09:07:51 compute-1 nova_compute[183083]: 2026-01-26 09:07:51.932 183087 DEBUG nova.compute.resource_tracker [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Jan 26 09:07:51 compute-1 nova_compute[183083]: 2026-01-26 09:07:51.959 183087 DEBUG nova.compute.resource_tracker [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Migration 169c411b-83a0-44b3-972f-3bbc4b44e558 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 26 09:07:51 compute-1 nova_compute[183083]: 2026-01-26 09:07:51.959 183087 DEBUG nova.compute.resource_tracker [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:07:51 compute-1 nova_compute[183083]: 2026-01-26 09:07:51.959 183087 DEBUG nova.compute.resource_tracker [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:07:52 compute-1 nova_compute[183083]: 2026-01-26 09:07:52.002 183087 DEBUG nova.compute.provider_tree [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:07:52 compute-1 nova_compute[183083]: 2026-01-26 09:07:52.014 183087 DEBUG nova.scheduler.client.report [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:07:52 compute-1 nova_compute[183083]: 2026-01-26 09:07:52.034 183087 DEBUG nova.compute.resource_tracker [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:07:52 compute-1 nova_compute[183083]: 2026-01-26 09:07:52.035 183087 DEBUG oslo_concurrency.lockutils [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:07:52 compute-1 nova_compute[183083]: 2026-01-26 09:07:52.039 183087 INFO nova.compute.manager [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Jan 26 09:07:52 compute-1 nova_compute[183083]: 2026-01-26 09:07:52.129 183087 INFO nova.scheduler.client.report [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Deleted allocation for migration 169c411b-83a0-44b3-972f-3bbc4b44e558
Jan 26 09:07:52 compute-1 nova_compute[183083]: 2026-01-26 09:07:52.129 183087 DEBUG nova.virt.libvirt.driver [None req-9e873163-bb63-4901-93a9-ee16092bc2af 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Jan 26 09:07:53 compute-1 nova_compute[183083]: 2026-01-26 09:07:53.204 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:07:55 compute-1 sshd-session[224355]: Accepted publickey for zuul from 38.102.83.66 port 44992 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:07:55 compute-1 systemd-logind[788]: New session 99 of user zuul.
Jan 26 09:07:55 compute-1 systemd[1]: Started Session 99 of User zuul.
Jan 26 09:07:55 compute-1 sshd-session[224355]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:07:55 compute-1 podman[224359]: 2026-01-26 09:07:55.245237426 +0000 UTC m=+0.086266720 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Jan 26 09:07:55 compute-1 podman[224360]: 2026-01-26 09:07:55.292189337 +0000 UTC m=+0.115035093 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, distribution-scope=public, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, architecture=x86_64, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Jan 26 09:07:55 compute-1 podman[224366]: 2026-01-26 09:07:55.292848116 +0000 UTC m=+0.111221377 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 09:07:55 compute-1 podman[224372]: 2026-01-26 09:07:55.297253339 +0000 UTC m=+0.110887868 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 09:07:55 compute-1 podman[224357]: 2026-01-26 09:07:55.311898158 +0000 UTC m=+0.149263420 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 26 09:07:55 compute-1 sshd-session[224458]: Accepted publickey for zuul from 38.102.83.66 port 45000 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:07:55 compute-1 systemd-logind[788]: New session 100 of user zuul.
Jan 26 09:07:55 compute-1 systemd[1]: Started Session 100 of User zuul.
Jan 26 09:07:55 compute-1 sshd-session[224458]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:07:55 compute-1 sudo[224467]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/mktemp
Jan 26 09:07:55 compute-1 sudo[224467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:07:55 compute-1 sudo[224467]: pam_unix(sudo:session): session closed for user root
Jan 26 09:07:55 compute-1 sudo[224492]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 tcpdump -s0 -Uni eth1 icmp and ether host fa:16:3e:1d:21:45 -w /tmp/tmp.FDaQ0sRbgv
Jan 26 09:07:55 compute-1 sudo[224492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:07:55 compute-1 sshd-session[224466]: Connection closed by 38.102.83.66 port 45000
Jan 26 09:07:55 compute-1 sshd-session[224458]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:07:55 compute-1 systemd[1]: session-100.scope: Deactivated successfully.
Jan 26 09:07:55 compute-1 systemd-logind[788]: Session 100 logged out. Waiting for processes to exit.
Jan 26 09:07:55 compute-1 systemd-logind[788]: Removed session 100.
Jan 26 09:07:55 compute-1 nova_compute[183083]: 2026-01-26 09:07:55.909 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:07:58 compute-1 nova_compute[183083]: 2026-01-26 09:07:58.207 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:07:59 compute-1 nova_compute[183083]: 2026-01-26 09:07:59.787 183087 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769418464.781364, 6602f0c7-96fd-4c40-a71d-8909ba310a73 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 09:07:59 compute-1 nova_compute[183083]: 2026-01-26 09:07:59.787 183087 INFO nova.compute.manager [-] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] VM Stopped (Lifecycle Event)
Jan 26 09:07:59 compute-1 nova_compute[183083]: 2026-01-26 09:07:59.806 183087 DEBUG nova.compute.manager [None req-3fab9bc2-d3d8-45e7-b166-e6e983280343 - - - - - -] [instance: 6602f0c7-96fd-4c40-a71d-8909ba310a73] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:08:00 compute-1 nova_compute[183083]: 2026-01-26 09:08:00.912 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:08:03 compute-1 nova_compute[183083]: 2026-01-26 09:08:03.210 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:08:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:08:03.745 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:08:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:08:03.746 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:08:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:08:03.746 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:08:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:08:03.746 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:08:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:08:03.746 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:08:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:08:03.746 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:08:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:08:03.746 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:08:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:08:03.746 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:08:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:08:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:08:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:08:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:08:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:08:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:08:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:08:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:08:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:08:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:08:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:08:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:08:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:08:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:08:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:08:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:08:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:08:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:08:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:08:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:08:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:08:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:08:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:08:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:08:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:08:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:08:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:08:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:08:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:08:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:08:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:08:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:08:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:08:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:08:04 compute-1 sshd-session[224519]: Accepted publickey for zuul from 38.102.83.66 port 55022 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:08:04 compute-1 systemd-logind[788]: New session 101 of user zuul.
Jan 26 09:08:04 compute-1 systemd[1]: Started Session 101 of User zuul.
Jan 26 09:08:04 compute-1 sshd-session[224519]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:08:04 compute-1 sudo[224523]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 cat /tmp/tmp.FDaQ0sRbgv
Jan 26 09:08:04 compute-1 sudo[224523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:08:04 compute-1 sudo[224523]: pam_unix(sudo:session): session closed for user root
Jan 26 09:08:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:08:05.322 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:08:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:08:05.322 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:08:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:08:05.323 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:08:05 compute-1 nova_compute[183083]: 2026-01-26 09:08:05.913 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:08:08 compute-1 nova_compute[183083]: 2026-01-26 09:08:08.036 183087 DEBUG nova.virt.libvirt.driver [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Creating tmpfile /var/lib/nova/instances/tmp3sbx2jm0 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Jan 26 09:08:08 compute-1 nova_compute[183083]: 2026-01-26 09:08:08.037 183087 DEBUG nova.compute.manager [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=115712,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3sbx2jm0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Jan 26 09:08:08 compute-1 nova_compute[183083]: 2026-01-26 09:08:08.212 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:08:09 compute-1 nova_compute[183083]: 2026-01-26 09:08:09.729 183087 DEBUG nova.compute.manager [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=115712,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3sbx2jm0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3ac41ae0-4048-4fb5-a04e-674402b451d5',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Jan 26 09:08:09 compute-1 nova_compute[183083]: 2026-01-26 09:08:09.752 183087 DEBUG oslo_concurrency.lockutils [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "refresh_cache-3ac41ae0-4048-4fb5-a04e-674402b451d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:08:09 compute-1 nova_compute[183083]: 2026-01-26 09:08:09.753 183087 DEBUG oslo_concurrency.lockutils [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquired lock "refresh_cache-3ac41ae0-4048-4fb5-a04e-674402b451d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:08:09 compute-1 nova_compute[183083]: 2026-01-26 09:08:09.753 183087 DEBUG nova.network.neutron [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 09:08:10 compute-1 podman[224549]: 2026-01-26 09:08:10.826401278 +0000 UTC m=+0.085809688 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 26 09:08:10 compute-1 nova_compute[183083]: 2026-01-26 09:08:10.915 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:08:11 compute-1 nova_compute[183083]: 2026-01-26 09:08:11.755 183087 DEBUG nova.network.neutron [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Updating instance_info_cache with network_info: [{"id": "cfcc2226-ac1c-48bd-8fbd-3eb565f832eb", "address": "fa:16:3e:b7:58:81", "network": {"id": "4a27cf2d-8058-49d7-ace8-3648ca1834f7", "bridge": "br-int", "label": "tempest-test-network--707199579", "subnets": [{"cidr": "10.100.0.48/28", "dns": [], "gateway": {"address": "10.100.0.49", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.58", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfcc2226-ac", "ovs_interfaceid": "cfcc2226-ac1c-48bd-8fbd-3eb565f832eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:08:11 compute-1 nova_compute[183083]: 2026-01-26 09:08:11.774 183087 DEBUG oslo_concurrency.lockutils [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Releasing lock "refresh_cache-3ac41ae0-4048-4fb5-a04e-674402b451d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:08:11 compute-1 nova_compute[183083]: 2026-01-26 09:08:11.776 183087 DEBUG nova.virt.libvirt.driver [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=115712,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3sbx2jm0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3ac41ae0-4048-4fb5-a04e-674402b451d5',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Jan 26 09:08:11 compute-1 nova_compute[183083]: 2026-01-26 09:08:11.777 183087 DEBUG nova.virt.libvirt.driver [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Creating instance directory: /var/lib/nova/instances/3ac41ae0-4048-4fb5-a04e-674402b451d5 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Jan 26 09:08:11 compute-1 nova_compute[183083]: 2026-01-26 09:08:11.778 183087 DEBUG nova.virt.libvirt.driver [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Creating disk.info with the contents: {'/var/lib/nova/instances/3ac41ae0-4048-4fb5-a04e-674402b451d5/disk': 'qcow2', '/var/lib/nova/instances/3ac41ae0-4048-4fb5-a04e-674402b451d5/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Jan 26 09:08:11 compute-1 nova_compute[183083]: 2026-01-26 09:08:11.778 183087 DEBUG nova.virt.libvirt.driver [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Jan 26 09:08:11 compute-1 nova_compute[183083]: 2026-01-26 09:08:11.779 183087 DEBUG nova.objects.instance [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 3ac41ae0-4048-4fb5-a04e-674402b451d5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 09:08:11 compute-1 nova_compute[183083]: 2026-01-26 09:08:11.818 183087 DEBUG oslo_concurrency.processutils [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:08:11 compute-1 nova_compute[183083]: 2026-01-26 09:08:11.909 183087 DEBUG oslo_concurrency.processutils [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:08:11 compute-1 nova_compute[183083]: 2026-01-26 09:08:11.911 183087 DEBUG oslo_concurrency.lockutils [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:08:11 compute-1 nova_compute[183083]: 2026-01-26 09:08:11.912 183087 DEBUG oslo_concurrency.lockutils [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:08:11 compute-1 nova_compute[183083]: 2026-01-26 09:08:11.936 183087 DEBUG oslo_concurrency.processutils [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:08:12 compute-1 nova_compute[183083]: 2026-01-26 09:08:12.014 183087 DEBUG oslo_concurrency.processutils [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:08:12 compute-1 nova_compute[183083]: 2026-01-26 09:08:12.016 183087 DEBUG oslo_concurrency.processutils [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/3ac41ae0-4048-4fb5-a04e-674402b451d5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:08:12 compute-1 nova_compute[183083]: 2026-01-26 09:08:12.071 183087 DEBUG oslo_concurrency.processutils [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/3ac41ae0-4048-4fb5-a04e-674402b451d5/disk 1073741824" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:08:12 compute-1 nova_compute[183083]: 2026-01-26 09:08:12.072 183087 DEBUG oslo_concurrency.lockutils [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:08:12 compute-1 nova_compute[183083]: 2026-01-26 09:08:12.073 183087 DEBUG oslo_concurrency.processutils [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:08:12 compute-1 nova_compute[183083]: 2026-01-26 09:08:12.153 183087 DEBUG oslo_concurrency.processutils [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:08:12 compute-1 nova_compute[183083]: 2026-01-26 09:08:12.155 183087 DEBUG nova.virt.disk.api [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Checking if we can resize image /var/lib/nova/instances/3ac41ae0-4048-4fb5-a04e-674402b451d5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 26 09:08:12 compute-1 nova_compute[183083]: 2026-01-26 09:08:12.155 183087 DEBUG oslo_concurrency.processutils [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ac41ae0-4048-4fb5-a04e-674402b451d5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:08:12 compute-1 nova_compute[183083]: 2026-01-26 09:08:12.212 183087 DEBUG oslo_concurrency.processutils [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ac41ae0-4048-4fb5-a04e-674402b451d5/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:08:12 compute-1 nova_compute[183083]: 2026-01-26 09:08:12.214 183087 DEBUG nova.virt.disk.api [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Cannot resize image /var/lib/nova/instances/3ac41ae0-4048-4fb5-a04e-674402b451d5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 26 09:08:12 compute-1 nova_compute[183083]: 2026-01-26 09:08:12.215 183087 DEBUG nova.objects.instance [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lazy-loading 'migration_context' on Instance uuid 3ac41ae0-4048-4fb5-a04e-674402b451d5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 09:08:12 compute-1 nova_compute[183083]: 2026-01-26 09:08:12.230 183087 DEBUG oslo_concurrency.processutils [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/3ac41ae0-4048-4fb5-a04e-674402b451d5/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:08:12 compute-1 nova_compute[183083]: 2026-01-26 09:08:12.254 183087 DEBUG oslo_concurrency.processutils [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/3ac41ae0-4048-4fb5-a04e-674402b451d5/disk.config 485376" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:08:12 compute-1 nova_compute[183083]: 2026-01-26 09:08:12.255 183087 DEBUG nova.virt.libvirt.volume.remotefs [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/3ac41ae0-4048-4fb5-a04e-674402b451d5/disk.config to /var/lib/nova/instances/3ac41ae0-4048-4fb5-a04e-674402b451d5 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 26 09:08:12 compute-1 nova_compute[183083]: 2026-01-26 09:08:12.255 183087 DEBUG oslo_concurrency.processutils [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/3ac41ae0-4048-4fb5-a04e-674402b451d5/disk.config /var/lib/nova/instances/3ac41ae0-4048-4fb5-a04e-674402b451d5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:08:12 compute-1 nova_compute[183083]: 2026-01-26 09:08:12.846 183087 DEBUG oslo_concurrency.processutils [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/3ac41ae0-4048-4fb5-a04e-674402b451d5/disk.config /var/lib/nova/instances/3ac41ae0-4048-4fb5-a04e-674402b451d5" returned: 0 in 0.590s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:08:12 compute-1 nova_compute[183083]: 2026-01-26 09:08:12.847 183087 DEBUG nova.virt.libvirt.driver [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Jan 26 09:08:12 compute-1 nova_compute[183083]: 2026-01-26 09:08:12.850 183087 DEBUG nova.virt.libvirt.vif [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T09:06:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-server-test-357202870',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-357202870',id=50,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIFn3fPhxboXbgrEFzlTwh4C8Ll+n3HIgPrVu0g3HSyFNAN7PcvrywcLTWeUOpJlGU28isdDSO43TrUU7yocpXOGs6my4rqjK3p4DmJix6rCv9t8FEPa+QZ5iu6nhBpQaw==',key_name='tempest-keypair-test-421938435',keypairs=<?>,launch_index=0,launched_at=2026-01-26T09:06:50Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='21d2dd4efd74429aab05a84f55aaa4f9',ramdisk_id='',reservation_id='r-qms5o0n1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-OvnDvrTest-1505353326',owner_user_name='tempest-OvnDvrTest-1505353326-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T09:06:50Z,user_data=None,user_id='f62d05840e2a48b2a2e3a2c53715fc82',uuid=3ac41ae0-4048-4fb5-a04e-674402b451d5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cfcc2226-ac1c-48bd-8fbd-3eb565f832eb", "address": "fa:16:3e:b7:58:81", "network": {"id": "4a27cf2d-8058-49d7-ace8-3648ca1834f7", "bridge": "br-int", "label": "tempest-test-network--707199579", "subnets": [{"cidr": "10.100.0.48/28", "dns": [], "gateway": {"address": "10.100.0.49", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.58", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapcfcc2226-ac", "ovs_interfaceid": "cfcc2226-ac1c-48bd-8fbd-3eb565f832eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 26 09:08:12 compute-1 nova_compute[183083]: 2026-01-26 09:08:12.850 183087 DEBUG nova.network.os_vif_util [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Converting VIF {"id": "cfcc2226-ac1c-48bd-8fbd-3eb565f832eb", "address": "fa:16:3e:b7:58:81", "network": {"id": "4a27cf2d-8058-49d7-ace8-3648ca1834f7", "bridge": "br-int", "label": "tempest-test-network--707199579", "subnets": [{"cidr": "10.100.0.48/28", "dns": [], "gateway": {"address": "10.100.0.49", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.58", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapcfcc2226-ac", "ovs_interfaceid": "cfcc2226-ac1c-48bd-8fbd-3eb565f832eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 09:08:12 compute-1 nova_compute[183083]: 2026-01-26 09:08:12.852 183087 DEBUG nova.network.os_vif_util [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b7:58:81,bridge_name='br-int',has_traffic_filtering=True,id=cfcc2226-ac1c-48bd-8fbd-3eb565f832eb,network=Network(4a27cf2d-8058-49d7-ace8-3648ca1834f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcfcc2226-ac') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 09:08:12 compute-1 nova_compute[183083]: 2026-01-26 09:08:12.853 183087 DEBUG os_vif [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b7:58:81,bridge_name='br-int',has_traffic_filtering=True,id=cfcc2226-ac1c-48bd-8fbd-3eb565f832eb,network=Network(4a27cf2d-8058-49d7-ace8-3648ca1834f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcfcc2226-ac') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 26 09:08:12 compute-1 nova_compute[183083]: 2026-01-26 09:08:12.855 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:08:12 compute-1 nova_compute[183083]: 2026-01-26 09:08:12.856 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:08:12 compute-1 nova_compute[183083]: 2026-01-26 09:08:12.856 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 09:08:12 compute-1 nova_compute[183083]: 2026-01-26 09:08:12.860 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:08:12 compute-1 nova_compute[183083]: 2026-01-26 09:08:12.861 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcfcc2226-ac, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:08:12 compute-1 nova_compute[183083]: 2026-01-26 09:08:12.862 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcfcc2226-ac, col_values=(('external_ids', {'iface-id': 'cfcc2226-ac1c-48bd-8fbd-3eb565f832eb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b7:58:81', 'vm-uuid': '3ac41ae0-4048-4fb5-a04e-674402b451d5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:08:12 compute-1 nova_compute[183083]: 2026-01-26 09:08:12.864 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:08:12 compute-1 ovn_controller[95352]: 2026-01-26T09:08:12Z|00304|pinctrl|WARN|Dropped 371 log messages in last 50 seconds (most recently, 14 seconds ago) due to excessive rate
Jan 26 09:08:12 compute-1 NetworkManager[55451]: <info>  [1769418492.8663] manager: (tapcfcc2226-ac): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/105)
Jan 26 09:08:12 compute-1 ovn_controller[95352]: 2026-01-26T09:08:12Z|00305|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:08:12 compute-1 nova_compute[183083]: 2026-01-26 09:08:12.869 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:08:12 compute-1 nova_compute[183083]: 2026-01-26 09:08:12.872 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:08:12 compute-1 nova_compute[183083]: 2026-01-26 09:08:12.873 183087 INFO os_vif [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b7:58:81,bridge_name='br-int',has_traffic_filtering=True,id=cfcc2226-ac1c-48bd-8fbd-3eb565f832eb,network=Network(4a27cf2d-8058-49d7-ace8-3648ca1834f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcfcc2226-ac')
Jan 26 09:08:12 compute-1 nova_compute[183083]: 2026-01-26 09:08:12.874 183087 DEBUG nova.virt.libvirt.driver [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Jan 26 09:08:12 compute-1 nova_compute[183083]: 2026-01-26 09:08:12.874 183087 DEBUG nova.compute.manager [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=115712,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3sbx2jm0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3ac41ae0-4048-4fb5-a04e-674402b451d5',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Jan 26 09:08:13 compute-1 nova_compute[183083]: 2026-01-26 09:08:13.258 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:08:13 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:08:13.682 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:08:13 compute-1 nova_compute[183083]: 2026-01-26 09:08:13.682 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:08:13 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:08:13.683 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:08:14 compute-1 nova_compute[183083]: 2026-01-26 09:08:14.072 183087 DEBUG nova.network.neutron [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Port cfcc2226-ac1c-48bd-8fbd-3eb565f832eb updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Jan 26 09:08:14 compute-1 nova_compute[183083]: 2026-01-26 09:08:14.074 183087 DEBUG nova.compute.manager [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=115712,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3sbx2jm0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3ac41ae0-4048-4fb5-a04e-674402b451d5',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Jan 26 09:08:14 compute-1 systemd[1]: Starting libvirt proxy daemon...
Jan 26 09:08:14 compute-1 systemd[1]: Started libvirt proxy daemon.
Jan 26 09:08:14 compute-1 kernel: tapcfcc2226-ac: entered promiscuous mode
Jan 26 09:08:14 compute-1 ovn_controller[95352]: 2026-01-26T09:08:14Z|00306|binding|INFO|Claiming lport cfcc2226-ac1c-48bd-8fbd-3eb565f832eb for this additional chassis.
Jan 26 09:08:14 compute-1 ovn_controller[95352]: 2026-01-26T09:08:14Z|00307|binding|INFO|cfcc2226-ac1c-48bd-8fbd-3eb565f832eb: Claiming fa:16:3e:b7:58:81 10.100.0.58
Jan 26 09:08:14 compute-1 nova_compute[183083]: 2026-01-26 09:08:14.446 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:08:14 compute-1 NetworkManager[55451]: <info>  [1769418494.4484] manager: (tapcfcc2226-ac): new Tun device (/org/freedesktop/NetworkManager/Devices/106)
Jan 26 09:08:14 compute-1 ovn_controller[95352]: 2026-01-26T09:08:14Z|00308|binding|INFO|Setting lport cfcc2226-ac1c-48bd-8fbd-3eb565f832eb ovn-installed in OVS
Jan 26 09:08:14 compute-1 nova_compute[183083]: 2026-01-26 09:08:14.460 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:08:14 compute-1 nova_compute[183083]: 2026-01-26 09:08:14.463 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:08:14 compute-1 nova_compute[183083]: 2026-01-26 09:08:14.469 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:08:14 compute-1 systemd-udevd[224629]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 09:08:14 compute-1 systemd-machined[154360]: New machine qemu-19-instance-00000032.
Jan 26 09:08:14 compute-1 systemd[1]: Started Virtual Machine qemu-19-instance-00000032.
Jan 26 09:08:14 compute-1 NetworkManager[55451]: <info>  [1769418494.5124] device (tapcfcc2226-ac): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 09:08:14 compute-1 NetworkManager[55451]: <info>  [1769418494.5130] device (tapcfcc2226-ac): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 09:08:15 compute-1 nova_compute[183083]: 2026-01-26 09:08:15.244 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769418495.2443557, 3ac41ae0-4048-4fb5-a04e-674402b451d5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 09:08:15 compute-1 nova_compute[183083]: 2026-01-26 09:08:15.245 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] VM Started (Lifecycle Event)
Jan 26 09:08:15 compute-1 nova_compute[183083]: 2026-01-26 09:08:15.266 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:08:15 compute-1 nova_compute[183083]: 2026-01-26 09:08:15.973 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769418495.973018, 3ac41ae0-4048-4fb5-a04e-674402b451d5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 09:08:15 compute-1 nova_compute[183083]: 2026-01-26 09:08:15.973 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] VM Resumed (Lifecycle Event)
Jan 26 09:08:15 compute-1 nova_compute[183083]: 2026-01-26 09:08:15.996 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:08:16 compute-1 nova_compute[183083]: 2026-01-26 09:08:16.000 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 09:08:16 compute-1 nova_compute[183083]: 2026-01-26 09:08:16.019 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com
Jan 26 09:08:16 compute-1 nova_compute[183083]: 2026-01-26 09:08:16.994 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:08:16 compute-1 nova_compute[183083]: 2026-01-26 09:08:16.995 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:08:16 compute-1 nova_compute[183083]: 2026-01-26 09:08:16.995 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:08:17 compute-1 nova_compute[183083]: 2026-01-26 09:08:17.016 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 09:08:17 compute-1 ovn_controller[95352]: 2026-01-26T09:08:17Z|00309|binding|INFO|Claiming lport cfcc2226-ac1c-48bd-8fbd-3eb565f832eb for this chassis.
Jan 26 09:08:17 compute-1 ovn_controller[95352]: 2026-01-26T09:08:17Z|00310|binding|INFO|cfcc2226-ac1c-48bd-8fbd-3eb565f832eb: Claiming fa:16:3e:b7:58:81 10.100.0.58
Jan 26 09:08:17 compute-1 ovn_controller[95352]: 2026-01-26T09:08:17Z|00311|binding|INFO|Setting lport cfcc2226-ac1c-48bd-8fbd-3eb565f832eb up in Southbound
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:08:17.217 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b7:58:81 10.100.0.58'], port_security=['fa:16:3e:b7:58:81 10.100.0.58'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.58/28', 'neutron:device_id': '3ac41ae0-4048-4fb5-a04e-674402b451d5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a27cf2d-8058-49d7-ace8-3648ca1834f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '21d2dd4efd74429aab05a84f55aaa4f9', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'c736fd14-de19-447f-bed9-105e0da8c605', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.211'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d614070d-6dd1-410d-952d-f73e6c0428a2, chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=cfcc2226-ac1c-48bd-8fbd-3eb565f832eb) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:08:17.218 104632 INFO neutron.agent.ovn.metadata.agent [-] Port cfcc2226-ac1c-48bd-8fbd-3eb565f832eb in datapath 4a27cf2d-8058-49d7-ace8-3648ca1834f7 bound to our chassis
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:08:17.220 104632 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4a27cf2d-8058-49d7-ace8-3648ca1834f7
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:08:17.233 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[83ec716f-197c-4521-85a4-199a5e953fd4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:08:17.235 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4a27cf2d-81 in ovnmeta-4a27cf2d-8058-49d7-ace8-3648ca1834f7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:08:17.237 212483 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4a27cf2d-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:08:17.237 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[bed013a2-f4a8-4790-9ded-9c2a284fbbf8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:08:17.240 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[a2bde164-fbe2-44b3-aeec-374fc5b30d7a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:08:17.254 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[535f97ea-bf38-476e-bf27-aa8a4a34dbdd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:08:17.285 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[e08ec51d-67ee-436c-a1b2-e1a5a9916be9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:08:17.325 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[0e17ba8e-43d5-4e69-87ad-c6ab9ea7d5bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:08:17 compute-1 NetworkManager[55451]: <info>  [1769418497.3379] manager: (tap4a27cf2d-80): new Veth device (/org/freedesktop/NetworkManager/Devices/107)
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:08:17.337 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[5553740f-1d36-4b66-a5fc-aaafce155e85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:08:17.383 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[1994bdc4-0bc4-476e-a2d4-159bbdc55df2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:08:17.388 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[d7a9e2a1-0d24-4506-b1f2-96753de49edb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:08:17 compute-1 NetworkManager[55451]: <info>  [1769418497.4253] device (tap4a27cf2d-80): carrier: link connected
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:08:17.434 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[fc33c140-1060-4684-8eff-89f82e835484]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:08:17.455 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[545db963-a9d1-4d58-a31c-f72ce9e8fba6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4a27cf2d-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:73:9e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 483803, 'reachable_time': 39782, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224684, 'error': None, 'target': 'ovnmeta-4a27cf2d-8058-49d7-ace8-3648ca1834f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:08:17.479 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[105af843-5709-49fd-b149-acec88c36ae7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5c:739e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 483803, 'tstamp': 483803}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224685, 'error': None, 'target': 'ovnmeta-4a27cf2d-8058-49d7-ace8-3648ca1834f7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:08:17 compute-1 nova_compute[183083]: 2026-01-26 09:08:17.489 183087 INFO nova.compute.manager [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Post operation of migration started
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:08:17.504 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[33f250d8-7cfd-44d0-91c9-2c40d681e500]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4a27cf2d-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:73:9e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 483803, 'reachable_time': 39782, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224686, 'error': None, 'target': 'ovnmeta-4a27cf2d-8058-49d7-ace8-3648ca1834f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:08:17.548 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[c90d16e8-48a9-4b42-8769-e96ecb14d85b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:08:17.634 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[16e038a1-154e-428c-ac6c-a90ab5b18401]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:08:17.636 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a27cf2d-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:08:17.636 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:08:17.637 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4a27cf2d-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:08:17 compute-1 nova_compute[183083]: 2026-01-26 09:08:17.638 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:08:17 compute-1 NetworkManager[55451]: <info>  [1769418497.6399] manager: (tap4a27cf2d-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/108)
Jan 26 09:08:17 compute-1 kernel: tap4a27cf2d-80: entered promiscuous mode
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:08:17.641 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4a27cf2d-80, col_values=(('external_ids', {'iface-id': 'ebfdc9a9-e40f-4a26-b6eb-6ee151b5d1d4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:08:17 compute-1 ovn_controller[95352]: 2026-01-26T09:08:17Z|00312|binding|INFO|Releasing lport ebfdc9a9-e40f-4a26-b6eb-6ee151b5d1d4 from this chassis (sb_readonly=0)
Jan 26 09:08:17 compute-1 nova_compute[183083]: 2026-01-26 09:08:17.642 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:08:17.644 104632 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4a27cf2d-8058-49d7-ace8-3648ca1834f7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4a27cf2d-8058-49d7-ace8-3648ca1834f7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:08:17.644 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[a283fac7-f43c-470c-8bbb-2232a514b31c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:08:17.645 104632 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]: global
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]:     log         /dev/log local0 debug
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]:     log-tag     haproxy-metadata-proxy-4a27cf2d-8058-49d7-ace8-3648ca1834f7
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]:     user        root
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]:     group       root
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]:     maxconn     1024
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]:     pidfile     /var/lib/neutron/external/pids/4a27cf2d-8058-49d7-ace8-3648ca1834f7.pid.haproxy
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]:     daemon
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]: 
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]: defaults
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]:     log global
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]:     mode http
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]:     option httplog
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]:     option dontlognull
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]:     option http-server-close
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]:     option forwardfor
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]:     retries                 3
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]:     timeout http-request    30s
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]:     timeout connect         30s
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]:     timeout client          32s
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]:     timeout server          32s
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]:     timeout http-keep-alive 30s
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]: 
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]: 
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]: listen listener
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]:     bind 169.254.169.254:80
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]:     http-request add-header X-OVN-Network-ID 4a27cf2d-8058-49d7-ace8-3648ca1834f7
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 26 09:08:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:08:17.646 104632 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4a27cf2d-8058-49d7-ace8-3648ca1834f7', 'env', 'PROCESS_TAG=haproxy-4a27cf2d-8058-49d7-ace8-3648ca1834f7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4a27cf2d-8058-49d7-ace8-3648ca1834f7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 26 09:08:17 compute-1 nova_compute[183083]: 2026-01-26 09:08:17.655 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:08:17 compute-1 nova_compute[183083]: 2026-01-26 09:08:17.793 183087 DEBUG oslo_concurrency.lockutils [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "refresh_cache-3ac41ae0-4048-4fb5-a04e-674402b451d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:08:17 compute-1 nova_compute[183083]: 2026-01-26 09:08:17.793 183087 DEBUG oslo_concurrency.lockutils [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquired lock "refresh_cache-3ac41ae0-4048-4fb5-a04e-674402b451d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:08:17 compute-1 nova_compute[183083]: 2026-01-26 09:08:17.793 183087 DEBUG nova.network.neutron [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 09:08:17 compute-1 nova_compute[183083]: 2026-01-26 09:08:17.864 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:08:17 compute-1 nova_compute[183083]: 2026-01-26 09:08:17.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:08:18 compute-1 podman[224719]: 2026-01-26 09:08:18.01386506 +0000 UTC m=+0.053323761 container create f1cdb42ae2ad360248a9cfa7b7f156d81e799f112d82b1311b53b7b063fce273 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a27cf2d-8058-49d7-ace8-3648ca1834f7, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 26 09:08:18 compute-1 systemd[1]: Started libpod-conmon-f1cdb42ae2ad360248a9cfa7b7f156d81e799f112d82b1311b53b7b063fce273.scope.
Jan 26 09:08:18 compute-1 podman[224719]: 2026-01-26 09:08:17.987332919 +0000 UTC m=+0.026791660 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 09:08:18 compute-1 systemd[1]: Started libcrun container.
Jan 26 09:08:18 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c85808c4ee60c808a228eb29f769d15865dfbb9de9864b29c5a7e73158e86ed/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 09:08:18 compute-1 podman[224719]: 2026-01-26 09:08:18.129730105 +0000 UTC m=+0.169188826 container init f1cdb42ae2ad360248a9cfa7b7f156d81e799f112d82b1311b53b7b063fce273 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a27cf2d-8058-49d7-ace8-3648ca1834f7, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 26 09:08:18 compute-1 podman[224719]: 2026-01-26 09:08:18.139170239 +0000 UTC m=+0.178628940 container start f1cdb42ae2ad360248a9cfa7b7f156d81e799f112d82b1311b53b7b063fce273 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a27cf2d-8058-49d7-ace8-3648ca1834f7, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 26 09:08:18 compute-1 neutron-haproxy-ovnmeta-4a27cf2d-8058-49d7-ace8-3648ca1834f7[224734]: [NOTICE]   (224738) : New worker (224740) forked
Jan 26 09:08:18 compute-1 neutron-haproxy-ovnmeta-4a27cf2d-8058-49d7-ace8-3648ca1834f7[224734]: [NOTICE]   (224738) : Loading success.
Jan 26 09:08:18 compute-1 nova_compute[183083]: 2026-01-26 09:08:18.309 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:08:18 compute-1 nova_compute[183083]: 2026-01-26 09:08:18.850 183087 DEBUG nova.network.neutron [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Updating instance_info_cache with network_info: [{"id": "cfcc2226-ac1c-48bd-8fbd-3eb565f832eb", "address": "fa:16:3e:b7:58:81", "network": {"id": "4a27cf2d-8058-49d7-ace8-3648ca1834f7", "bridge": "br-int", "label": "tempest-test-network--707199579", "subnets": [{"cidr": "10.100.0.48/28", "dns": [], "gateway": {"address": "10.100.0.49", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.58", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfcc2226-ac", "ovs_interfaceid": "cfcc2226-ac1c-48bd-8fbd-3eb565f832eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:08:18 compute-1 nova_compute[183083]: 2026-01-26 09:08:18.868 183087 DEBUG oslo_concurrency.lockutils [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Releasing lock "refresh_cache-3ac41ae0-4048-4fb5-a04e-674402b451d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:08:18 compute-1 nova_compute[183083]: 2026-01-26 09:08:18.882 183087 DEBUG oslo_concurrency.lockutils [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:08:18 compute-1 nova_compute[183083]: 2026-01-26 09:08:18.882 183087 DEBUG oslo_concurrency.lockutils [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:08:18 compute-1 nova_compute[183083]: 2026-01-26 09:08:18.883 183087 DEBUG oslo_concurrency.lockutils [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:08:18 compute-1 nova_compute[183083]: 2026-01-26 09:08:18.889 183087 INFO nova.virt.libvirt.driver [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Jan 26 09:08:18 compute-1 virtqemud[182752]: Domain id=19 name='instance-00000032' uuid=3ac41ae0-4048-4fb5-a04e-674402b451d5 is tainted: custom-monitor
Jan 26 09:08:18 compute-1 sudo[223563]: pam_unix(sudo:session): session closed for user root
Jan 26 09:08:19 compute-1 sshd-session[224749]: Invalid user node from 2.57.122.238 port 49360
Jan 26 09:08:19 compute-1 sshd-session[224749]: Connection closed by invalid user node 2.57.122.238 port 49360 [preauth]
Jan 26 09:08:19 compute-1 nova_compute[183083]: 2026-01-26 09:08:19.898 183087 INFO nova.virt.libvirt.driver [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Jan 26 09:08:20 compute-1 nova_compute[183083]: 2026-01-26 09:08:20.908 183087 INFO nova.virt.libvirt.driver [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Jan 26 09:08:20 compute-1 nova_compute[183083]: 2026-01-26 09:08:20.915 183087 DEBUG nova.compute.manager [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:08:20 compute-1 nova_compute[183083]: 2026-01-26 09:08:20.936 183087 DEBUG nova.objects.instance [None req-b09bae80-d052-497b-910e-0def2a1c64a3 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 26 09:08:21 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:08:21.685 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:08:21 compute-1 nova_compute[183083]: 2026-01-26 09:08:21.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:08:22 compute-1 nova_compute[183083]: 2026-01-26 09:08:22.866 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:08:22 compute-1 sshd-session[224751]: Accepted publickey for zuul from 38.102.83.66 port 50202 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:08:22 compute-1 nova_compute[183083]: 2026-01-26 09:08:22.947 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:08:22 compute-1 nova_compute[183083]: 2026-01-26 09:08:22.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:08:22 compute-1 nova_compute[183083]: 2026-01-26 09:08:22.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:08:22 compute-1 systemd-logind[788]: New session 102 of user zuul.
Jan 26 09:08:22 compute-1 systemd[1]: Started Session 102 of User zuul.
Jan 26 09:08:22 compute-1 sshd-session[224751]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:08:23 compute-1 sshd-session[224755]: Accepted publickey for zuul from 38.102.83.66 port 50204 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:08:23 compute-1 systemd-logind[788]: New session 103 of user zuul.
Jan 26 09:08:23 compute-1 systemd[1]: Started Session 103 of User zuul.
Jan 26 09:08:23 compute-1 sshd-session[224755]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:08:23 compute-1 sudo[224759]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/mktemp
Jan 26 09:08:23 compute-1 sudo[224759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:08:23 compute-1 sudo[224759]: pam_unix(sudo:session): session closed for user root
Jan 26 09:08:23 compute-1 nova_compute[183083]: 2026-01-26 09:08:23.311 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:08:23 compute-1 sudo[224784]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 tcpdump -s0 -Uni eth1 icmp and ether host fa:16:3e:45:d7:2f -w /tmp/tmp.hRjSAm73Gv
Jan 26 09:08:23 compute-1 sudo[224784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:08:23 compute-1 sshd-session[224758]: Connection closed by 38.102.83.66 port 50204
Jan 26 09:08:23 compute-1 sshd-session[224755]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:08:23 compute-1 systemd[1]: session-103.scope: Deactivated successfully.
Jan 26 09:08:23 compute-1 systemd-logind[788]: Session 103 logged out. Waiting for processes to exit.
Jan 26 09:08:23 compute-1 systemd-logind[788]: Removed session 103.
Jan 26 09:08:23 compute-1 nova_compute[183083]: 2026-01-26 09:08:23.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:08:24 compute-1 nova_compute[183083]: 2026-01-26 09:08:24.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:08:24 compute-1 nova_compute[183083]: 2026-01-26 09:08:24.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:08:25 compute-1 podman[224825]: 2026-01-26 09:08:25.832941585 +0000 UTC m=+0.072601643 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 09:08:25 compute-1 podman[224816]: 2026-01-26 09:08:25.833037977 +0000 UTC m=+0.079682424 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 26 09:08:25 compute-1 podman[224810]: 2026-01-26 09:08:25.844567235 +0000 UTC m=+0.107921437 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 09:08:25 compute-1 podman[224812]: 2026-01-26 09:08:25.855398093 +0000 UTC m=+0.113040002 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., container_name=openstack_network_exporter, release=1755695350, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-type=git, version=9.6, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, config_id=openstack_network_exporter)
Jan 26 09:08:25 compute-1 podman[224811]: 2026-01-26 09:08:25.870668816 +0000 UTC m=+0.123837378 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 26 09:08:26 compute-1 nova_compute[183083]: 2026-01-26 09:08:26.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:08:26 compute-1 nova_compute[183083]: 2026-01-26 09:08:26.981 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:08:26 compute-1 nova_compute[183083]: 2026-01-26 09:08:26.982 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:08:26 compute-1 nova_compute[183083]: 2026-01-26 09:08:26.982 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:08:26 compute-1 nova_compute[183083]: 2026-01-26 09:08:26.982 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:08:27 compute-1 nova_compute[183083]: 2026-01-26 09:08:27.064 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ac41ae0-4048-4fb5-a04e-674402b451d5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:08:27 compute-1 nova_compute[183083]: 2026-01-26 09:08:27.159 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ac41ae0-4048-4fb5-a04e-674402b451d5/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:08:27 compute-1 nova_compute[183083]: 2026-01-26 09:08:27.160 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ac41ae0-4048-4fb5-a04e-674402b451d5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:08:27 compute-1 nova_compute[183083]: 2026-01-26 09:08:27.232 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ac41ae0-4048-4fb5-a04e-674402b451d5/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:08:27 compute-1 nova_compute[183083]: 2026-01-26 09:08:27.430 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:08:27 compute-1 nova_compute[183083]: 2026-01-26 09:08:27.432 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13436MB free_disk=113.06249237060547GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:08:27 compute-1 nova_compute[183083]: 2026-01-26 09:08:27.432 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:08:27 compute-1 nova_compute[183083]: 2026-01-26 09:08:27.433 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:08:27 compute-1 nova_compute[183083]: 2026-01-26 09:08:27.507 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Instance 3ac41ae0-4048-4fb5-a04e-674402b451d5 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 09:08:27 compute-1 nova_compute[183083]: 2026-01-26 09:08:27.508 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:08:27 compute-1 nova_compute[183083]: 2026-01-26 09:08:27.508 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=640MB phys_disk=119GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:08:27 compute-1 nova_compute[183083]: 2026-01-26 09:08:27.578 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:08:27 compute-1 nova_compute[183083]: 2026-01-26 09:08:27.592 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:08:27 compute-1 nova_compute[183083]: 2026-01-26 09:08:27.611 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:08:27 compute-1 nova_compute[183083]: 2026-01-26 09:08:27.611 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:08:27 compute-1 nova_compute[183083]: 2026-01-26 09:08:27.873 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:08:28 compute-1 nova_compute[183083]: 2026-01-26 09:08:28.314 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:08:32 compute-1 sshd-session[224919]: Accepted publickey for zuul from 38.102.83.66 port 48602 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:08:32 compute-1 systemd-logind[788]: New session 104 of user zuul.
Jan 26 09:08:32 compute-1 systemd[1]: Started Session 104 of User zuul.
Jan 26 09:08:32 compute-1 sshd-session[224919]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:08:32 compute-1 sudo[224923]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 cat /tmp/tmp.hRjSAm73Gv
Jan 26 09:08:32 compute-1 sudo[224923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:08:32 compute-1 sudo[224923]: pam_unix(sudo:session): session closed for user root
Jan 26 09:08:32 compute-1 nova_compute[183083]: 2026-01-26 09:08:32.876 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:08:33 compute-1 nova_compute[183083]: 2026-01-26 09:08:33.317 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:08:34 compute-1 sshd-session[224949]: Accepted publickey for zuul from 38.102.83.66 port 38228 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:08:34 compute-1 systemd-logind[788]: New session 105 of user zuul.
Jan 26 09:08:34 compute-1 systemd[1]: Started Session 105 of User zuul.
Jan 26 09:08:34 compute-1 sshd-session[224949]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:08:34 compute-1 sshd-session[224953]: Accepted publickey for zuul from 38.102.83.66 port 38230 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:08:34 compute-1 systemd-logind[788]: New session 106 of user zuul.
Jan 26 09:08:34 compute-1 systemd[1]: Started Session 106 of User zuul.
Jan 26 09:08:34 compute-1 sshd-session[224953]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:08:34 compute-1 sudo[224957]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/mktemp
Jan 26 09:08:34 compute-1 sudo[224957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:08:34 compute-1 sudo[224957]: pam_unix(sudo:session): session closed for user root
Jan 26 09:08:34 compute-1 sudo[224982]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 tcpdump -s0 -Uni genev_sys_6081 'icmp and ((ether host fa:16:3e:fd:4a:3f and ether host fa:16:3e:9a:ee:88) or (ether host fa:16:3e:d6:3f:ea and ether host fa:16:3e:b7:58:81))' -w /tmp/tmp.l1O8OKtnWj
Jan 26 09:08:34 compute-1 sudo[224982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:08:34 compute-1 sshd-session[224956]: Connection closed by 38.102.83.66 port 38230
Jan 26 09:08:34 compute-1 sshd-session[224953]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:08:34 compute-1 systemd[1]: session-106.scope: Deactivated successfully.
Jan 26 09:08:34 compute-1 systemd-logind[788]: Session 106 logged out. Waiting for processes to exit.
Jan 26 09:08:34 compute-1 systemd-logind[788]: Removed session 106.
Jan 26 09:08:37 compute-1 nova_compute[183083]: 2026-01-26 09:08:37.879 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:08:38 compute-1 nova_compute[183083]: 2026-01-26 09:08:38.320 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:08:41 compute-1 podman[225008]: 2026-01-26 09:08:41.806460685 +0000 UTC m=+0.072949603 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 26 09:08:42 compute-1 nova_compute[183083]: 2026-01-26 09:08:42.882 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:08:43 compute-1 nova_compute[183083]: 2026-01-26 09:08:43.323 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:08:43 compute-1 sshd-session[225032]: Accepted publickey for zuul from 38.102.83.66 port 38238 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:08:43 compute-1 systemd-logind[788]: New session 107 of user zuul.
Jan 26 09:08:43 compute-1 systemd[1]: Started Session 107 of User zuul.
Jan 26 09:08:43 compute-1 sshd-session[225032]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:08:43 compute-1 sudo[225036]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 cat /tmp/tmp.l1O8OKtnWj
Jan 26 09:08:43 compute-1 sudo[225036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:08:43 compute-1 sudo[225036]: pam_unix(sudo:session): session closed for user root
Jan 26 09:08:47 compute-1 ovn_controller[95352]: 2026-01-26T09:08:47Z|00313|memory_trim|INFO|Detected inactivity (last active 30016 ms ago): trimming memory
Jan 26 09:08:47 compute-1 nova_compute[183083]: 2026-01-26 09:08:47.884 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:08:48 compute-1 nova_compute[183083]: 2026-01-26 09:08:48.324 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:08:49 compute-1 sshd-session[225062]: Accepted publickey for zuul from 38.102.83.66 port 48748 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:08:49 compute-1 systemd-logind[788]: New session 108 of user zuul.
Jan 26 09:08:49 compute-1 systemd[1]: Started Session 108 of User zuul.
Jan 26 09:08:49 compute-1 sshd-session[225062]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:08:50 compute-1 sudo[225066]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 rm -f /tmp/tmp.l1O8OKtnWj
Jan 26 09:08:50 compute-1 sudo[225066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:08:50 compute-1 sudo[225066]: pam_unix(sudo:session): session closed for user root
Jan 26 09:08:50 compute-1 sshd-session[225065]: Connection closed by 38.102.83.66 port 48748
Jan 26 09:08:50 compute-1 sshd-session[225062]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:08:50 compute-1 systemd[1]: session-108.scope: Deactivated successfully.
Jan 26 09:08:50 compute-1 systemd-logind[788]: Session 108 logged out. Waiting for processes to exit.
Jan 26 09:08:50 compute-1 systemd-logind[788]: Removed session 108.
Jan 26 09:08:52 compute-1 nova_compute[183083]: 2026-01-26 09:08:52.887 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:08:53 compute-1 nova_compute[183083]: 2026-01-26 09:08:53.327 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:08:54 compute-1 sudo[223907]: pam_unix(sudo:session): session closed for user root
Jan 26 09:08:56 compute-1 podman[225095]: 2026-01-26 09:08:56.861312565 +0000 UTC m=+0.088265748 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 09:08:56 compute-1 podman[225093]: 2026-01-26 09:08:56.866279726 +0000 UTC m=+0.109139351 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 09:08:56 compute-1 podman[225107]: 2026-01-26 09:08:56.870042353 +0000 UTC m=+0.085649504 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 09:08:56 compute-1 podman[225094]: 2026-01-26 09:08:56.903179544 +0000 UTC m=+0.133315968 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, name=ubi9-minimal, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.buildah.version=1.33.7, vcs-type=git, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, version=9.6, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container)
Jan 26 09:08:56 compute-1 podman[225092]: 2026-01-26 09:08:56.929045919 +0000 UTC m=+0.175577429 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller)
Jan 26 09:08:57 compute-1 sshd-session[225194]: Accepted publickey for zuul from 38.102.83.66 port 36504 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:08:57 compute-1 systemd-logind[788]: New session 109 of user zuul.
Jan 26 09:08:57 compute-1 systemd[1]: Started Session 109 of User zuul.
Jan 26 09:08:57 compute-1 sshd-session[225194]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:08:57 compute-1 sudo[225198]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 rm -f /tmp/tmp.hRjSAm73Gv
Jan 26 09:08:57 compute-1 sudo[225198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:08:57 compute-1 sudo[225198]: pam_unix(sudo:session): session closed for user root
Jan 26 09:08:57 compute-1 sshd-session[225197]: Connection closed by 38.102.83.66 port 36504
Jan 26 09:08:57 compute-1 sshd-session[225194]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:08:57 compute-1 systemd[1]: session-109.scope: Deactivated successfully.
Jan 26 09:08:57 compute-1 systemd-logind[788]: Session 109 logged out. Waiting for processes to exit.
Jan 26 09:08:57 compute-1 systemd-logind[788]: Removed session 109.
Jan 26 09:08:57 compute-1 nova_compute[183083]: 2026-01-26 09:08:57.890 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:08:58 compute-1 nova_compute[183083]: 2026-01-26 09:08:58.330 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:09:02 compute-1 nova_compute[183083]: 2026-01-26 09:09:02.894 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:09:03 compute-1 nova_compute[183083]: 2026-01-26 09:09:03.330 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:09:05 compute-1 sshd-session[225225]: Accepted publickey for zuul from 38.102.83.66 port 49936 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:09:05 compute-1 systemd-logind[788]: New session 110 of user zuul.
Jan 26 09:09:05 compute-1 systemd[1]: Started Session 110 of User zuul.
Jan 26 09:09:05 compute-1 sshd-session[225225]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:09:05 compute-1 sudo[225229]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 rm -f /tmp/tmp.FDaQ0sRbgv
Jan 26 09:09:05 compute-1 sudo[225229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:09:05 compute-1 sudo[225229]: pam_unix(sudo:session): session closed for user root
Jan 26 09:09:05 compute-1 sshd-session[225228]: Connection closed by 38.102.83.66 port 49936
Jan 26 09:09:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:09:05.323 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:09:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:09:05.324 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:09:05 compute-1 sshd-session[225225]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:09:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:09:05.325 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:09:05 compute-1 systemd[1]: session-110.scope: Deactivated successfully.
Jan 26 09:09:05 compute-1 systemd-logind[788]: Session 110 logged out. Waiting for processes to exit.
Jan 26 09:09:05 compute-1 systemd-logind[788]: Removed session 110.
Jan 26 09:09:07 compute-1 nova_compute[183083]: 2026-01-26 09:09:07.896 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:09:08 compute-1 nova_compute[183083]: 2026-01-26 09:09:08.333 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:09:12 compute-1 sshd-session[225255]: Accepted publickey for zuul from 38.102.83.66 port 49946 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:09:12 compute-1 systemd-logind[788]: New session 111 of user zuul.
Jan 26 09:09:12 compute-1 systemd[1]: Started Session 111 of User zuul.
Jan 26 09:09:12 compute-1 sshd-session[225255]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:09:12 compute-1 podman[225258]: 2026-01-26 09:09:12.58248711 +0000 UTC m=+0.079776667 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 09:09:12 compute-1 sudo[225272]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 rm -f /tmp/tmp.vGFvS4uNZ5
Jan 26 09:09:12 compute-1 sudo[225272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:09:12 compute-1 sudo[225272]: pam_unix(sudo:session): session closed for user root
Jan 26 09:09:12 compute-1 sshd-session[225259]: Connection closed by 38.102.83.66 port 49946
Jan 26 09:09:12 compute-1 sshd-session[225255]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:09:12 compute-1 systemd[1]: session-111.scope: Deactivated successfully.
Jan 26 09:09:12 compute-1 systemd-logind[788]: Session 111 logged out. Waiting for processes to exit.
Jan 26 09:09:12 compute-1 systemd-logind[788]: Removed session 111.
Jan 26 09:09:12 compute-1 nova_compute[183083]: 2026-01-26 09:09:12.931 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:09:13 compute-1 nova_compute[183083]: 2026-01-26 09:09:13.334 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:09:17 compute-1 ovn_controller[95352]: 2026-01-26T09:09:17Z|00314|pinctrl|WARN|Dropped 431 log messages in last 64 seconds (most recently, 18 seconds ago) due to excessive rate
Jan 26 09:09:17 compute-1 ovn_controller[95352]: 2026-01-26T09:09:17Z|00315|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:09:17 compute-1 nova_compute[183083]: 2026-01-26 09:09:17.934 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:09:18 compute-1 nova_compute[183083]: 2026-01-26 09:09:18.350 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:09:19 compute-1 nova_compute[183083]: 2026-01-26 09:09:19.611 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:09:19 compute-1 nova_compute[183083]: 2026-01-26 09:09:19.612 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:09:19 compute-1 nova_compute[183083]: 2026-01-26 09:09:19.612 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:09:19 compute-1 sshd-session[225310]: Accepted publickey for zuul from 38.102.83.66 port 44320 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:09:19 compute-1 systemd-logind[788]: New session 112 of user zuul.
Jan 26 09:09:19 compute-1 systemd[1]: Started Session 112 of User zuul.
Jan 26 09:09:19 compute-1 sshd-session[225310]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:09:19 compute-1 sudo[225314]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 rm -f /tmp/tmp.QaGQB8Usyq
Jan 26 09:09:19 compute-1 sudo[225314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:09:19 compute-1 sudo[225314]: pam_unix(sudo:session): session closed for user root
Jan 26 09:09:20 compute-1 sshd-session[225313]: Connection closed by 38.102.83.66 port 44320
Jan 26 09:09:20 compute-1 sshd-session[225310]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:09:20 compute-1 systemd[1]: session-112.scope: Deactivated successfully.
Jan 26 09:09:20 compute-1 systemd-logind[788]: Session 112 logged out. Waiting for processes to exit.
Jan 26 09:09:20 compute-1 systemd-logind[788]: Removed session 112.
Jan 26 09:09:20 compute-1 nova_compute[183083]: 2026-01-26 09:09:20.512 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "refresh_cache-3ac41ae0-4048-4fb5-a04e-674402b451d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:09:20 compute-1 nova_compute[183083]: 2026-01-26 09:09:20.512 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquired lock "refresh_cache-3ac41ae0-4048-4fb5-a04e-674402b451d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:09:20 compute-1 nova_compute[183083]: 2026-01-26 09:09:20.512 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 26 09:09:20 compute-1 nova_compute[183083]: 2026-01-26 09:09:20.513 183087 DEBUG nova.objects.instance [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3ac41ae0-4048-4fb5-a04e-674402b451d5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 09:09:22 compute-1 sudo[224027]: pam_unix(sudo:session): session closed for user root
Jan 26 09:09:22 compute-1 nova_compute[183083]: 2026-01-26 09:09:22.936 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:09:23 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:09:23.016 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.017 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:09:23 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:09:23.018 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.352 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.585 183087 DEBUG nova.compute.manager [req-23035f58-dda6-48b3-803c-b0eccad869b5 req-56a0e138-306f-408f-8e9b-31ba73e4f135 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Received event network-changed-cfcc2226-ac1c-48bd-8fbd-3eb565f832eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.585 183087 DEBUG nova.compute.manager [req-23035f58-dda6-48b3-803c-b0eccad869b5 req-56a0e138-306f-408f-8e9b-31ba73e4f135 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Refreshing instance network info cache due to event network-changed-cfcc2226-ac1c-48bd-8fbd-3eb565f832eb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.585 183087 DEBUG oslo_concurrency.lockutils [req-23035f58-dda6-48b3-803c-b0eccad869b5 req-56a0e138-306f-408f-8e9b-31ba73e4f135 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-3ac41ae0-4048-4fb5-a04e-674402b451d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.618 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Updating instance_info_cache with network_info: [{"id": "cfcc2226-ac1c-48bd-8fbd-3eb565f832eb", "address": "fa:16:3e:b7:58:81", "network": {"id": "4a27cf2d-8058-49d7-ace8-3648ca1834f7", "bridge": "br-int", "label": "tempest-test-network--707199579", "subnets": [{"cidr": "10.100.0.48/28", "dns": [], "gateway": {"address": "10.100.0.49", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.58", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfcc2226-ac", "ovs_interfaceid": "cfcc2226-ac1c-48bd-8fbd-3eb565f832eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.661 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Releasing lock "refresh_cache-3ac41ae0-4048-4fb5-a04e-674402b451d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.661 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.661 183087 DEBUG oslo_concurrency.lockutils [req-23035f58-dda6-48b3-803c-b0eccad869b5 req-56a0e138-306f-408f-8e9b-31ba73e4f135 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-3ac41ae0-4048-4fb5-a04e-674402b451d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.661 183087 DEBUG nova.network.neutron [req-23035f58-dda6-48b3-803c-b0eccad869b5 req-56a0e138-306f-408f-8e9b-31ba73e4f135 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Refreshing network info cache for port cfcc2226-ac1c-48bd-8fbd-3eb565f832eb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.662 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.663 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.663 183087 DEBUG oslo_concurrency.lockutils [None req-140d0f49-dd73-48ca-a8f4-ea5b30f5e820 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Acquiring lock "3ac41ae0-4048-4fb5-a04e-674402b451d5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.664 183087 DEBUG oslo_concurrency.lockutils [None req-140d0f49-dd73-48ca-a8f4-ea5b30f5e820 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Lock "3ac41ae0-4048-4fb5-a04e-674402b451d5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.664 183087 DEBUG oslo_concurrency.lockutils [None req-140d0f49-dd73-48ca-a8f4-ea5b30f5e820 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Acquiring lock "3ac41ae0-4048-4fb5-a04e-674402b451d5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.664 183087 DEBUG oslo_concurrency.lockutils [None req-140d0f49-dd73-48ca-a8f4-ea5b30f5e820 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Lock "3ac41ae0-4048-4fb5-a04e-674402b451d5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.664 183087 DEBUG oslo_concurrency.lockutils [None req-140d0f49-dd73-48ca-a8f4-ea5b30f5e820 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Lock "3ac41ae0-4048-4fb5-a04e-674402b451d5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.665 183087 INFO nova.compute.manager [None req-140d0f49-dd73-48ca-a8f4-ea5b30f5e820 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Terminating instance
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.666 183087 DEBUG nova.compute.manager [None req-140d0f49-dd73-48ca-a8f4-ea5b30f5e820 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 09:09:23 compute-1 kernel: tapcfcc2226-ac (unregistering): left promiscuous mode
Jan 26 09:09:23 compute-1 NetworkManager[55451]: <info>  [1769418563.6996] device (tapcfcc2226-ac): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.703 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:09:23 compute-1 ovn_controller[95352]: 2026-01-26T09:09:23Z|00316|binding|INFO|Releasing lport cfcc2226-ac1c-48bd-8fbd-3eb565f832eb from this chassis (sb_readonly=0)
Jan 26 09:09:23 compute-1 ovn_controller[95352]: 2026-01-26T09:09:23Z|00317|binding|INFO|Setting lport cfcc2226-ac1c-48bd-8fbd-3eb565f832eb down in Southbound
Jan 26 09:09:23 compute-1 ovn_controller[95352]: 2026-01-26T09:09:23Z|00318|binding|INFO|Removing iface tapcfcc2226-ac ovn-installed in OVS
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.707 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:09:23 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:09:23.714 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b7:58:81 10.100.0.58'], port_security=['fa:16:3e:b7:58:81 10.100.0.58'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.58/28', 'neutron:device_id': '3ac41ae0-4048-4fb5-a04e-674402b451d5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a27cf2d-8058-49d7-ace8-3648ca1834f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '21d2dd4efd74429aab05a84f55aaa4f9', 'neutron:revision_number': '13', 'neutron:security_group_ids': 'c736fd14-de19-447f-bed9-105e0da8c605', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d614070d-6dd1-410d-952d-f73e6c0428a2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=cfcc2226-ac1c-48bd-8fbd-3eb565f832eb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:09:23 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:09:23.715 104632 INFO neutron.agent.ovn.metadata.agent [-] Port cfcc2226-ac1c-48bd-8fbd-3eb565f832eb in datapath 4a27cf2d-8058-49d7-ace8-3648ca1834f7 unbound from our chassis
Jan 26 09:09:23 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:09:23.718 104632 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4a27cf2d-8058-49d7-ace8-3648ca1834f7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 26 09:09:23 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:09:23.720 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[545985f5-7a54-4084-b04d-6f8589d4d80a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:09:23 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:09:23.721 104632 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4a27cf2d-8058-49d7-ace8-3648ca1834f7 namespace which is not needed anymore
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.739 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:09:23 compute-1 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000032.scope: Deactivated successfully.
Jan 26 09:09:23 compute-1 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000032.scope: Consumed 4.885s CPU time.
Jan 26 09:09:23 compute-1 systemd-machined[154360]: Machine qemu-19-instance-00000032 terminated.
Jan 26 09:09:23 compute-1 neutron-haproxy-ovnmeta-4a27cf2d-8058-49d7-ace8-3648ca1834f7[224734]: [NOTICE]   (224738) : haproxy version is 2.8.14-c23fe91
Jan 26 09:09:23 compute-1 neutron-haproxy-ovnmeta-4a27cf2d-8058-49d7-ace8-3648ca1834f7[224734]: [NOTICE]   (224738) : path to executable is /usr/sbin/haproxy
Jan 26 09:09:23 compute-1 neutron-haproxy-ovnmeta-4a27cf2d-8058-49d7-ace8-3648ca1834f7[224734]: [ALERT]    (224738) : Current worker (224740) exited with code 143 (Terminated)
Jan 26 09:09:23 compute-1 neutron-haproxy-ovnmeta-4a27cf2d-8058-49d7-ace8-3648ca1834f7[224734]: [WARNING]  (224738) : All workers exited. Exiting... (0)
Jan 26 09:09:23 compute-1 systemd[1]: libpod-f1cdb42ae2ad360248a9cfa7b7f156d81e799f112d82b1311b53b7b063fce273.scope: Deactivated successfully.
Jan 26 09:09:23 compute-1 podman[225366]: 2026-01-26 09:09:23.911579137 +0000 UTC m=+0.060527310 container died f1cdb42ae2ad360248a9cfa7b7f156d81e799f112d82b1311b53b7b063fce273 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a27cf2d-8058-49d7-ace8-3648ca1834f7, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 26 09:09:23 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f1cdb42ae2ad360248a9cfa7b7f156d81e799f112d82b1311b53b7b063fce273-userdata-shm.mount: Deactivated successfully.
Jan 26 09:09:23 compute-1 systemd[1]: var-lib-containers-storage-overlay-2c85808c4ee60c808a228eb29f769d15865dfbb9de9864b29c5a7e73158e86ed-merged.mount: Deactivated successfully.
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:09:23 compute-1 podman[225366]: 2026-01-26 09:09:23.951713987 +0000 UTC m=+0.100662130 container cleanup f1cdb42ae2ad360248a9cfa7b7f156d81e799f112d82b1311b53b7b063fce273 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a27cf2d-8058-49d7-ace8-3648ca1834f7, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.959 183087 INFO nova.virt.libvirt.driver [-] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Instance destroyed successfully.
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.960 183087 DEBUG nova.objects.instance [None req-140d0f49-dd73-48ca-a8f4-ea5b30f5e820 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Lazy-loading 'resources' on Instance uuid 3ac41ae0-4048-4fb5-a04e-674402b451d5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.972 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.972 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.977 183087 DEBUG nova.virt.libvirt.vif [None req-140d0f49-dd73-48ca-a8f4-ea5b30f5e820 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-26T09:06:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-server-test-357202870',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-357202870',id=50,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIFn3fPhxboXbgrEFzlTwh4C8Ll+n3HIgPrVu0g3HSyFNAN7PcvrywcLTWeUOpJlGU28isdDSO43TrUU7yocpXOGs6my4rqjK3p4DmJix6rCv9t8FEPa+QZ5iu6nhBpQaw==',key_name='tempest-keypair-test-421938435',keypairs=<?>,launch_index=0,launched_at=2026-01-26T09:06:50Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='21d2dd4efd74429aab05a84f55aaa4f9',ramdisk_id='',reservation_id='r-qms5o0n1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-OvnDvrTest-1505353326',owner_user_name='tempest-OvnDvrTest-1505353326-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T09:08:20Z,user_data=None,user_id='f62d05840e2a48b2a2e3a2c53715fc82',uuid=3ac41ae0-4048-4fb5-a04e-674402b451d5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cfcc2226-ac1c-48bd-8fbd-3eb565f832eb", "address": "fa:16:3e:b7:58:81", "network": {"id": "4a27cf2d-8058-49d7-ace8-3648ca1834f7", "bridge": "br-int", "label": "tempest-test-network--707199579", "subnets": [{"cidr": "10.100.0.48/28", "dns": [], "gateway": {"address": "10.100.0.49", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.58", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfcc2226-ac", "ovs_interfaceid": "cfcc2226-ac1c-48bd-8fbd-3eb565f832eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.978 183087 DEBUG nova.network.os_vif_util [None req-140d0f49-dd73-48ca-a8f4-ea5b30f5e820 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Converting VIF {"id": "cfcc2226-ac1c-48bd-8fbd-3eb565f832eb", "address": "fa:16:3e:b7:58:81", "network": {"id": "4a27cf2d-8058-49d7-ace8-3648ca1834f7", "bridge": "br-int", "label": "tempest-test-network--707199579", "subnets": [{"cidr": "10.100.0.48/28", "dns": [], "gateway": {"address": "10.100.0.49", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.58", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfcc2226-ac", "ovs_interfaceid": "cfcc2226-ac1c-48bd-8fbd-3eb565f832eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.979 183087 DEBUG nova.network.os_vif_util [None req-140d0f49-dd73-48ca-a8f4-ea5b30f5e820 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b7:58:81,bridge_name='br-int',has_traffic_filtering=True,id=cfcc2226-ac1c-48bd-8fbd-3eb565f832eb,network=Network(4a27cf2d-8058-49d7-ace8-3648ca1834f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcfcc2226-ac') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.980 183087 DEBUG os_vif [None req-140d0f49-dd73-48ca-a8f4-ea5b30f5e820 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b7:58:81,bridge_name='br-int',has_traffic_filtering=True,id=cfcc2226-ac1c-48bd-8fbd-3eb565f832eb,network=Network(4a27cf2d-8058-49d7-ace8-3648ca1834f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcfcc2226-ac') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.982 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.983 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcfcc2226-ac, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.985 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.987 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:09:23 compute-1 systemd[1]: libpod-conmon-f1cdb42ae2ad360248a9cfa7b7f156d81e799f112d82b1311b53b7b063fce273.scope: Deactivated successfully.
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.990 183087 INFO os_vif [None req-140d0f49-dd73-48ca-a8f4-ea5b30f5e820 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b7:58:81,bridge_name='br-int',has_traffic_filtering=True,id=cfcc2226-ac1c-48bd-8fbd-3eb565f832eb,network=Network(4a27cf2d-8058-49d7-ace8-3648ca1834f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcfcc2226-ac')
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.990 183087 INFO nova.virt.libvirt.driver [None req-140d0f49-dd73-48ca-a8f4-ea5b30f5e820 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Deleting instance files /var/lib/nova/instances/3ac41ae0-4048-4fb5-a04e-674402b451d5_del
Jan 26 09:09:23 compute-1 nova_compute[183083]: 2026-01-26 09:09:23.991 183087 INFO nova.virt.libvirt.driver [None req-140d0f49-dd73-48ca-a8f4-ea5b30f5e820 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Deletion of /var/lib/nova/instances/3ac41ae0-4048-4fb5-a04e-674402b451d5_del complete
Jan 26 09:09:24 compute-1 podman[225412]: 2026-01-26 09:09:24.035857817 +0000 UTC m=+0.047867171 container remove f1cdb42ae2ad360248a9cfa7b7f156d81e799f112d82b1311b53b7b063fce273 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a27cf2d-8058-49d7-ace8-3648ca1834f7, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 26 09:09:24 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:09:24.041 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[60bf796e-f688-452d-86f6-2af3a28b0f13]: (4, ('Mon Jan 26 09:09:23 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-4a27cf2d-8058-49d7-ace8-3648ca1834f7 (f1cdb42ae2ad360248a9cfa7b7f156d81e799f112d82b1311b53b7b063fce273)\nf1cdb42ae2ad360248a9cfa7b7f156d81e799f112d82b1311b53b7b063fce273\nMon Jan 26 09:09:23 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-4a27cf2d-8058-49d7-ace8-3648ca1834f7 (f1cdb42ae2ad360248a9cfa7b7f156d81e799f112d82b1311b53b7b063fce273)\nf1cdb42ae2ad360248a9cfa7b7f156d81e799f112d82b1311b53b7b063fce273\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:09:24 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:09:24.043 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[3608705e-136c-423e-a42e-7b7367ab15d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:09:24 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:09:24.045 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a27cf2d-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:09:24 compute-1 nova_compute[183083]: 2026-01-26 09:09:24.047 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:09:24 compute-1 kernel: tap4a27cf2d-80: left promiscuous mode
Jan 26 09:09:24 compute-1 nova_compute[183083]: 2026-01-26 09:09:24.058 183087 DEBUG nova.compute.manager [req-b6366a02-60e8-4fb5-aff7-306db889bb4e req-b0ae5e8b-883f-4e4e-95a5-dcb1578a6197 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Received event network-vif-unplugged-cfcc2226-ac1c-48bd-8fbd-3eb565f832eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:09:24 compute-1 nova_compute[183083]: 2026-01-26 09:09:24.059 183087 DEBUG oslo_concurrency.lockutils [req-b6366a02-60e8-4fb5-aff7-306db889bb4e req-b0ae5e8b-883f-4e4e-95a5-dcb1578a6197 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "3ac41ae0-4048-4fb5-a04e-674402b451d5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:09:24 compute-1 nova_compute[183083]: 2026-01-26 09:09:24.059 183087 DEBUG oslo_concurrency.lockutils [req-b6366a02-60e8-4fb5-aff7-306db889bb4e req-b0ae5e8b-883f-4e4e-95a5-dcb1578a6197 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "3ac41ae0-4048-4fb5-a04e-674402b451d5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:09:24 compute-1 nova_compute[183083]: 2026-01-26 09:09:24.060 183087 DEBUG oslo_concurrency.lockutils [req-b6366a02-60e8-4fb5-aff7-306db889bb4e req-b0ae5e8b-883f-4e4e-95a5-dcb1578a6197 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "3ac41ae0-4048-4fb5-a04e-674402b451d5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:09:24 compute-1 nova_compute[183083]: 2026-01-26 09:09:24.060 183087 DEBUG nova.compute.manager [req-b6366a02-60e8-4fb5-aff7-306db889bb4e req-b0ae5e8b-883f-4e4e-95a5-dcb1578a6197 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] No waiting events found dispatching network-vif-unplugged-cfcc2226-ac1c-48bd-8fbd-3eb565f832eb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 09:09:24 compute-1 nova_compute[183083]: 2026-01-26 09:09:24.061 183087 DEBUG nova.compute.manager [req-b6366a02-60e8-4fb5-aff7-306db889bb4e req-b0ae5e8b-883f-4e4e-95a5-dcb1578a6197 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Received event network-vif-unplugged-cfcc2226-ac1c-48bd-8fbd-3eb565f832eb for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 26 09:09:24 compute-1 nova_compute[183083]: 2026-01-26 09:09:24.064 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:09:24 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:09:24.067 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[318d620a-8e9b-4145-b7a7-80244443374a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:09:24 compute-1 nova_compute[183083]: 2026-01-26 09:09:24.073 183087 INFO nova.compute.manager [None req-140d0f49-dd73-48ca-a8f4-ea5b30f5e820 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Took 0.41 seconds to destroy the instance on the hypervisor.
Jan 26 09:09:24 compute-1 nova_compute[183083]: 2026-01-26 09:09:24.074 183087 DEBUG oslo.service.loopingcall [None req-140d0f49-dd73-48ca-a8f4-ea5b30f5e820 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 26 09:09:24 compute-1 nova_compute[183083]: 2026-01-26 09:09:24.074 183087 DEBUG nova.compute.manager [-] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 09:09:24 compute-1 nova_compute[183083]: 2026-01-26 09:09:24.075 183087 DEBUG nova.network.neutron [-] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 09:09:24 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:09:24.079 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[117199b0-da54-48bf-8eb6-4cd00863b4be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:09:24 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:09:24.080 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[38ee506a-e08c-4946-b40d-1c8aed7cad16]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:09:24 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:09:24.101 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[e56c42ac-01ca-4ed3-aace-3dd8229f741d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 483792, 'reachable_time': 32447, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225427, 'error': None, 'target': 'ovnmeta-4a27cf2d-8058-49d7-ace8-3648ca1834f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:09:24 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:09:24.103 105024 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4a27cf2d-8058-49d7-ace8-3648ca1834f7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 26 09:09:24 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:09:24.103 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[e5f2b6f5-decb-4526-9f49-52dd28c62d5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:09:24 compute-1 systemd[1]: run-netns-ovnmeta\x2d4a27cf2d\x2d8058\x2d49d7\x2dace8\x2d3648ca1834f7.mount: Deactivated successfully.
Jan 26 09:09:24 compute-1 nova_compute[183083]: 2026-01-26 09:09:24.896 183087 DEBUG nova.network.neutron [-] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:09:24 compute-1 nova_compute[183083]: 2026-01-26 09:09:24.926 183087 INFO nova.compute.manager [-] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Took 0.85 seconds to deallocate network for instance.
Jan 26 09:09:24 compute-1 nova_compute[183083]: 2026-01-26 09:09:24.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:09:24 compute-1 nova_compute[183083]: 2026-01-26 09:09:24.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:09:24 compute-1 nova_compute[183083]: 2026-01-26 09:09:24.979 183087 DEBUG oslo_concurrency.lockutils [None req-140d0f49-dd73-48ca-a8f4-ea5b30f5e820 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:09:24 compute-1 nova_compute[183083]: 2026-01-26 09:09:24.980 183087 DEBUG oslo_concurrency.lockutils [None req-140d0f49-dd73-48ca-a8f4-ea5b30f5e820 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:09:25 compute-1 nova_compute[183083]: 2026-01-26 09:09:25.042 183087 DEBUG nova.compute.provider_tree [None req-140d0f49-dd73-48ca-a8f4-ea5b30f5e820 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:09:25 compute-1 nova_compute[183083]: 2026-01-26 09:09:25.082 183087 DEBUG nova.scheduler.client.report [None req-140d0f49-dd73-48ca-a8f4-ea5b30f5e820 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:09:25 compute-1 nova_compute[183083]: 2026-01-26 09:09:25.130 183087 DEBUG oslo_concurrency.lockutils [None req-140d0f49-dd73-48ca-a8f4-ea5b30f5e820 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:09:25 compute-1 nova_compute[183083]: 2026-01-26 09:09:25.180 183087 INFO nova.scheduler.client.report [None req-140d0f49-dd73-48ca-a8f4-ea5b30f5e820 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Deleted allocations for instance 3ac41ae0-4048-4fb5-a04e-674402b451d5
Jan 26 09:09:25 compute-1 nova_compute[183083]: 2026-01-26 09:09:25.232 183087 DEBUG nova.network.neutron [req-23035f58-dda6-48b3-803c-b0eccad869b5 req-56a0e138-306f-408f-8e9b-31ba73e4f135 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Updated VIF entry in instance network info cache for port cfcc2226-ac1c-48bd-8fbd-3eb565f832eb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 09:09:25 compute-1 nova_compute[183083]: 2026-01-26 09:09:25.233 183087 DEBUG nova.network.neutron [req-23035f58-dda6-48b3-803c-b0eccad869b5 req-56a0e138-306f-408f-8e9b-31ba73e4f135 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Updating instance_info_cache with network_info: [{"id": "cfcc2226-ac1c-48bd-8fbd-3eb565f832eb", "address": "fa:16:3e:b7:58:81", "network": {"id": "4a27cf2d-8058-49d7-ace8-3648ca1834f7", "bridge": "br-int", "label": "tempest-test-network--707199579", "subnets": [{"cidr": "10.100.0.48/28", "dns": [], "gateway": {"address": "10.100.0.49", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.58", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfcc2226-ac", "ovs_interfaceid": "cfcc2226-ac1c-48bd-8fbd-3eb565f832eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:09:25 compute-1 nova_compute[183083]: 2026-01-26 09:09:25.520 183087 DEBUG oslo_concurrency.lockutils [req-23035f58-dda6-48b3-803c-b0eccad869b5 req-56a0e138-306f-408f-8e9b-31ba73e4f135 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-3ac41ae0-4048-4fb5-a04e-674402b451d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:09:25 compute-1 nova_compute[183083]: 2026-01-26 09:09:25.646 183087 DEBUG oslo_concurrency.lockutils [None req-140d0f49-dd73-48ca-a8f4-ea5b30f5e820 f62d05840e2a48b2a2e3a2c53715fc82 21d2dd4efd74429aab05a84f55aaa4f9 - - default default] Lock "3ac41ae0-4048-4fb5-a04e-674402b451d5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.982s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:09:25 compute-1 nova_compute[183083]: 2026-01-26 09:09:25.716 183087 DEBUG nova.compute.manager [req-5f6652b8-baf8-412c-b07e-e098c101369b req-4020b559-3785-40df-8b14-685ac17b02fa 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Received event network-vif-deleted-cfcc2226-ac1c-48bd-8fbd-3eb565f832eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:09:25 compute-1 nova_compute[183083]: 2026-01-26 09:09:25.717 183087 INFO nova.compute.manager [req-5f6652b8-baf8-412c-b07e-e098c101369b req-4020b559-3785-40df-8b14-685ac17b02fa 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Neutron deleted interface cfcc2226-ac1c-48bd-8fbd-3eb565f832eb; detaching it from the instance and deleting it from the info cache
Jan 26 09:09:25 compute-1 nova_compute[183083]: 2026-01-26 09:09:25.717 183087 DEBUG nova.network.neutron [req-5f6652b8-baf8-412c-b07e-e098c101369b req-4020b559-3785-40df-8b14-685ac17b02fa 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Jan 26 09:09:25 compute-1 nova_compute[183083]: 2026-01-26 09:09:25.720 183087 DEBUG nova.compute.manager [req-5f6652b8-baf8-412c-b07e-e098c101369b req-4020b559-3785-40df-8b14-685ac17b02fa 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Detach interface failed, port_id=cfcc2226-ac1c-48bd-8fbd-3eb565f832eb, reason: Instance 3ac41ae0-4048-4fb5-a04e-674402b451d5 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 26 09:09:25 compute-1 nova_compute[183083]: 2026-01-26 09:09:25.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:09:25 compute-1 nova_compute[183083]: 2026-01-26 09:09:25.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:09:26 compute-1 nova_compute[183083]: 2026-01-26 09:09:26.117 183087 DEBUG nova.compute.manager [req-b441759c-f943-45d1-b047-bdd40a7489be req-7b99d3ab-5506-474c-aeaa-5dd52f84007c 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Received event network-vif-plugged-cfcc2226-ac1c-48bd-8fbd-3eb565f832eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:09:26 compute-1 nova_compute[183083]: 2026-01-26 09:09:26.118 183087 DEBUG oslo_concurrency.lockutils [req-b441759c-f943-45d1-b047-bdd40a7489be req-7b99d3ab-5506-474c-aeaa-5dd52f84007c 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "3ac41ae0-4048-4fb5-a04e-674402b451d5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:09:26 compute-1 nova_compute[183083]: 2026-01-26 09:09:26.118 183087 DEBUG oslo_concurrency.lockutils [req-b441759c-f943-45d1-b047-bdd40a7489be req-7b99d3ab-5506-474c-aeaa-5dd52f84007c 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "3ac41ae0-4048-4fb5-a04e-674402b451d5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:09:26 compute-1 nova_compute[183083]: 2026-01-26 09:09:26.119 183087 DEBUG oslo_concurrency.lockutils [req-b441759c-f943-45d1-b047-bdd40a7489be req-7b99d3ab-5506-474c-aeaa-5dd52f84007c 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "3ac41ae0-4048-4fb5-a04e-674402b451d5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:09:26 compute-1 nova_compute[183083]: 2026-01-26 09:09:26.119 183087 DEBUG nova.compute.manager [req-b441759c-f943-45d1-b047-bdd40a7489be req-7b99d3ab-5506-474c-aeaa-5dd52f84007c 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] No waiting events found dispatching network-vif-plugged-cfcc2226-ac1c-48bd-8fbd-3eb565f832eb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 09:09:26 compute-1 nova_compute[183083]: 2026-01-26 09:09:26.120 183087 WARNING nova.compute.manager [req-b441759c-f943-45d1-b047-bdd40a7489be req-7b99d3ab-5506-474c-aeaa-5dd52f84007c 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Received unexpected event network-vif-plugged-cfcc2226-ac1c-48bd-8fbd-3eb565f832eb for instance with vm_state deleted and task_state None.
Jan 26 09:09:27 compute-1 podman[225431]: 2026-01-26 09:09:27.840325669 +0000 UTC m=+0.071030189 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 09:09:27 compute-1 podman[225430]: 2026-01-26 09:09:27.846867195 +0000 UTC m=+0.091564802 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, release=1755695350, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-type=git, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 09:09:27 compute-1 podman[225429]: 2026-01-26 09:09:27.854856041 +0000 UTC m=+0.099359233 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 26 09:09:27 compute-1 podman[225428]: 2026-01-26 09:09:27.880466729 +0000 UTC m=+0.124529728 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 26 09:09:27 compute-1 podman[225448]: 2026-01-26 09:09:27.885911464 +0000 UTC m=+0.111398016 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 09:09:27 compute-1 nova_compute[183083]: 2026-01-26 09:09:27.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:09:27 compute-1 nova_compute[183083]: 2026-01-26 09:09:27.973 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:09:27 compute-1 nova_compute[183083]: 2026-01-26 09:09:27.974 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:09:27 compute-1 nova_compute[183083]: 2026-01-26 09:09:27.974 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:09:27 compute-1 nova_compute[183083]: 2026-01-26 09:09:27.975 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:09:28 compute-1 nova_compute[183083]: 2026-01-26 09:09:28.118 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:09:28 compute-1 nova_compute[183083]: 2026-01-26 09:09:28.119 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13606MB free_disk=113.091552734375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:09:28 compute-1 nova_compute[183083]: 2026-01-26 09:09:28.119 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:09:28 compute-1 nova_compute[183083]: 2026-01-26 09:09:28.120 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:09:28 compute-1 nova_compute[183083]: 2026-01-26 09:09:28.209 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:09:28 compute-1 nova_compute[183083]: 2026-01-26 09:09:28.210 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:09:28 compute-1 nova_compute[183083]: 2026-01-26 09:09:28.236 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:09:28 compute-1 nova_compute[183083]: 2026-01-26 09:09:28.256 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:09:28 compute-1 nova_compute[183083]: 2026-01-26 09:09:28.291 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:09:28 compute-1 nova_compute[183083]: 2026-01-26 09:09:28.291 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:09:28 compute-1 nova_compute[183083]: 2026-01-26 09:09:28.354 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:09:28 compute-1 nova_compute[183083]: 2026-01-26 09:09:28.985 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:09:30 compute-1 sshd-session[225537]: Accepted publickey for zuul from 38.102.83.66 port 57740 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:09:30 compute-1 systemd-logind[788]: New session 113 of user zuul.
Jan 26 09:09:30 compute-1 systemd[1]: Started Session 113 of User zuul.
Jan 26 09:09:30 compute-1 sshd-session[225537]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:09:30 compute-1 sudo[225541]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 rm -f /tmp/tmp.qKcjuMXd7h
Jan 26 09:09:30 compute-1 sudo[225541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:09:30 compute-1 sudo[225541]: pam_unix(sudo:session): session closed for user root
Jan 26 09:09:30 compute-1 sshd-session[225540]: Connection closed by 38.102.83.66 port 57740
Jan 26 09:09:30 compute-1 sshd-session[225537]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:09:30 compute-1 systemd[1]: session-113.scope: Deactivated successfully.
Jan 26 09:09:30 compute-1 systemd-logind[788]: Session 113 logged out. Waiting for processes to exit.
Jan 26 09:09:30 compute-1 systemd-logind[788]: Removed session 113.
Jan 26 09:09:33 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:09:33.021 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:09:33 compute-1 nova_compute[183083]: 2026-01-26 09:09:33.355 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:09:33 compute-1 nova_compute[183083]: 2026-01-26 09:09:33.988 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:09:37 compute-1 nova_compute[183083]: 2026-01-26 09:09:37.991 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:09:38 compute-1 nova_compute[183083]: 2026-01-26 09:09:38.357 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:09:38 compute-1 nova_compute[183083]: 2026-01-26 09:09:38.949 183087 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769418563.9481695, 3ac41ae0-4048-4fb5-a04e-674402b451d5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 09:09:38 compute-1 nova_compute[183083]: 2026-01-26 09:09:38.950 183087 INFO nova.compute.manager [-] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] VM Stopped (Lifecycle Event)
Jan 26 09:09:38 compute-1 nova_compute[183083]: 2026-01-26 09:09:38.972 183087 DEBUG nova.compute.manager [None req-92838e2b-a88f-4242-b2c9-b3e4a0392cda - - - - - -] [instance: 3ac41ae0-4048-4fb5-a04e-674402b451d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:09:39 compute-1 nova_compute[183083]: 2026-01-26 09:09:39.036 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:09:42 compute-1 podman[225567]: 2026-01-26 09:09:42.85169389 +0000 UTC m=+0.100184337 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 26 09:09:43 compute-1 nova_compute[183083]: 2026-01-26 09:09:43.360 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:09:44 compute-1 nova_compute[183083]: 2026-01-26 09:09:44.039 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:09:45 compute-1 nova_compute[183083]: 2026-01-26 09:09:45.045 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:09:48 compute-1 nova_compute[183083]: 2026-01-26 09:09:48.362 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:09:49 compute-1 nova_compute[183083]: 2026-01-26 09:09:49.042 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:09:53 compute-1 nova_compute[183083]: 2026-01-26 09:09:53.365 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:09:54 compute-1 nova_compute[183083]: 2026-01-26 09:09:54.044 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:09:55 compute-1 sudo[224492]: pam_unix(sudo:session): session closed for user root
Jan 26 09:09:58 compute-1 nova_compute[183083]: 2026-01-26 09:09:58.369 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:09:58 compute-1 podman[225595]: 2026-01-26 09:09:58.822111202 +0000 UTC m=+0.069892216 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 26 09:09:58 compute-1 podman[225594]: 2026-01-26 09:09:58.822334068 +0000 UTC m=+0.069028151 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, release=1755695350, config_id=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6)
Jan 26 09:09:58 compute-1 podman[225593]: 2026-01-26 09:09:58.84703469 +0000 UTC m=+0.091740967 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 26 09:09:58 compute-1 podman[225592]: 2026-01-26 09:09:58.882063715 +0000 UTC m=+0.136068796 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 09:09:58 compute-1 podman[225596]: 2026-01-26 09:09:58.882371674 +0000 UTC m=+0.123059367 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 09:09:59 compute-1 nova_compute[183083]: 2026-01-26 09:09:59.046 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:03 compute-1 nova_compute[183083]: 2026-01-26 09:10:03.372 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:10:03.746 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:10:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:10:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:10:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:10:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:10:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:10:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:10:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:10:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:10:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:10:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:10:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:10:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:10:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:10:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:10:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:10:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:10:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:10:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:10:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:10:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:10:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:10:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:10:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:10:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:10:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:10:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:10:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:10:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:10:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:10:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:10:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:10:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:10:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:10:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:10:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:10:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:10:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:10:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:10:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:10:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:10:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:10:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:10:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:10:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:10:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:10:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:10:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:10:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:10:04 compute-1 nova_compute[183083]: 2026-01-26 09:10:04.047 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:04 compute-1 nova_compute[183083]: 2026-01-26 09:10:04.495 183087 DEBUG oslo_concurrency.lockutils [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "caf06a69-97a3-459f-80ee-d3e792033a7d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:10:04 compute-1 nova_compute[183083]: 2026-01-26 09:10:04.496 183087 DEBUG oslo_concurrency.lockutils [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "caf06a69-97a3-459f-80ee-d3e792033a7d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:10:04 compute-1 nova_compute[183083]: 2026-01-26 09:10:04.515 183087 DEBUG nova.compute.manager [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 09:10:04 compute-1 nova_compute[183083]: 2026-01-26 09:10:04.585 183087 DEBUG oslo_concurrency.lockutils [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:10:04 compute-1 nova_compute[183083]: 2026-01-26 09:10:04.586 183087 DEBUG oslo_concurrency.lockutils [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:10:04 compute-1 nova_compute[183083]: 2026-01-26 09:10:04.592 183087 DEBUG nova.virt.hardware [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 09:10:04 compute-1 nova_compute[183083]: 2026-01-26 09:10:04.592 183087 INFO nova.compute.claims [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Claim successful on node compute-1.ctlplane.example.com
Jan 26 09:10:04 compute-1 nova_compute[183083]: 2026-01-26 09:10:04.682 183087 DEBUG nova.compute.provider_tree [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:10:04 compute-1 nova_compute[183083]: 2026-01-26 09:10:04.701 183087 DEBUG nova.scheduler.client.report [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:10:04 compute-1 nova_compute[183083]: 2026-01-26 09:10:04.736 183087 DEBUG oslo_concurrency.lockutils [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:10:04 compute-1 nova_compute[183083]: 2026-01-26 09:10:04.736 183087 DEBUG nova.compute.manager [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 09:10:04 compute-1 nova_compute[183083]: 2026-01-26 09:10:04.782 183087 DEBUG nova.compute.manager [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 09:10:04 compute-1 nova_compute[183083]: 2026-01-26 09:10:04.783 183087 DEBUG nova.network.neutron [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 09:10:04 compute-1 nova_compute[183083]: 2026-01-26 09:10:04.800 183087 INFO nova.virt.libvirt.driver [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 09:10:04 compute-1 nova_compute[183083]: 2026-01-26 09:10:04.822 183087 DEBUG nova.compute.manager [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 09:10:05 compute-1 nova_compute[183083]: 2026-01-26 09:10:05.017 183087 DEBUG nova.compute.manager [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 09:10:05 compute-1 nova_compute[183083]: 2026-01-26 09:10:05.019 183087 DEBUG nova.virt.libvirt.driver [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 09:10:05 compute-1 nova_compute[183083]: 2026-01-26 09:10:05.019 183087 INFO nova.virt.libvirt.driver [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Creating image(s)
Jan 26 09:10:05 compute-1 nova_compute[183083]: 2026-01-26 09:10:05.020 183087 DEBUG oslo_concurrency.lockutils [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "/var/lib/nova/instances/caf06a69-97a3-459f-80ee-d3e792033a7d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:10:05 compute-1 nova_compute[183083]: 2026-01-26 09:10:05.020 183087 DEBUG oslo_concurrency.lockutils [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "/var/lib/nova/instances/caf06a69-97a3-459f-80ee-d3e792033a7d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:10:05 compute-1 nova_compute[183083]: 2026-01-26 09:10:05.021 183087 DEBUG oslo_concurrency.lockutils [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "/var/lib/nova/instances/caf06a69-97a3-459f-80ee-d3e792033a7d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:10:05 compute-1 nova_compute[183083]: 2026-01-26 09:10:05.035 183087 DEBUG oslo_concurrency.processutils [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:10:05 compute-1 nova_compute[183083]: 2026-01-26 09:10:05.095 183087 DEBUG oslo_concurrency.processutils [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:10:05 compute-1 nova_compute[183083]: 2026-01-26 09:10:05.096 183087 DEBUG oslo_concurrency.lockutils [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:10:05 compute-1 nova_compute[183083]: 2026-01-26 09:10:05.097 183087 DEBUG oslo_concurrency.lockutils [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:10:05 compute-1 nova_compute[183083]: 2026-01-26 09:10:05.108 183087 DEBUG oslo_concurrency.processutils [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:10:05 compute-1 nova_compute[183083]: 2026-01-26 09:10:05.177 183087 DEBUG oslo_concurrency.processutils [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:10:05 compute-1 nova_compute[183083]: 2026-01-26 09:10:05.178 183087 DEBUG oslo_concurrency.processutils [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/caf06a69-97a3-459f-80ee-d3e792033a7d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:10:05 compute-1 nova_compute[183083]: 2026-01-26 09:10:05.217 183087 DEBUG oslo_concurrency.processutils [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/caf06a69-97a3-459f-80ee-d3e792033a7d/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:10:05 compute-1 nova_compute[183083]: 2026-01-26 09:10:05.219 183087 DEBUG oslo_concurrency.lockutils [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:10:05 compute-1 nova_compute[183083]: 2026-01-26 09:10:05.219 183087 DEBUG oslo_concurrency.processutils [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:10:05 compute-1 nova_compute[183083]: 2026-01-26 09:10:05.282 183087 DEBUG oslo_concurrency.processutils [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:10:05 compute-1 nova_compute[183083]: 2026-01-26 09:10:05.284 183087 DEBUG nova.virt.disk.api [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Checking if we can resize image /var/lib/nova/instances/caf06a69-97a3-459f-80ee-d3e792033a7d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 26 09:10:05 compute-1 nova_compute[183083]: 2026-01-26 09:10:05.284 183087 DEBUG oslo_concurrency.processutils [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caf06a69-97a3-459f-80ee-d3e792033a7d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:10:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:05.325 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:10:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:05.325 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:10:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:05.325 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:10:05 compute-1 nova_compute[183083]: 2026-01-26 09:10:05.351 183087 DEBUG oslo_concurrency.processutils [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caf06a69-97a3-459f-80ee-d3e792033a7d/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:10:05 compute-1 nova_compute[183083]: 2026-01-26 09:10:05.353 183087 DEBUG nova.virt.disk.api [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Cannot resize image /var/lib/nova/instances/caf06a69-97a3-459f-80ee-d3e792033a7d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 26 09:10:05 compute-1 nova_compute[183083]: 2026-01-26 09:10:05.353 183087 DEBUG nova.objects.instance [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lazy-loading 'migration_context' on Instance uuid caf06a69-97a3-459f-80ee-d3e792033a7d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 09:10:05 compute-1 nova_compute[183083]: 2026-01-26 09:10:05.369 183087 DEBUG nova.virt.libvirt.driver [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 26 09:10:05 compute-1 nova_compute[183083]: 2026-01-26 09:10:05.369 183087 DEBUG nova.virt.libvirt.driver [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Ensure instance console log exists: /var/lib/nova/instances/caf06a69-97a3-459f-80ee-d3e792033a7d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 26 09:10:05 compute-1 nova_compute[183083]: 2026-01-26 09:10:05.370 183087 DEBUG oslo_concurrency.lockutils [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:10:05 compute-1 nova_compute[183083]: 2026-01-26 09:10:05.371 183087 DEBUG oslo_concurrency.lockutils [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:10:05 compute-1 nova_compute[183083]: 2026-01-26 09:10:05.372 183087 DEBUG oslo_concurrency.lockutils [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:10:05 compute-1 ovn_controller[95352]: 2026-01-26T09:10:05Z|00319|pinctrl|WARN|Dropped 1017 log messages in last 49 seconds (most recently, 7 seconds ago) due to excessive rate
Jan 26 09:10:05 compute-1 ovn_controller[95352]: 2026-01-26T09:10:05Z|00320|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:10:05 compute-1 nova_compute[183083]: 2026-01-26 09:10:05.921 183087 DEBUG nova.network.neutron [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Successfully created port: 5e7c63f6-ad56-4c99-825c-91547862fe78 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 09:10:06 compute-1 nova_compute[183083]: 2026-01-26 09:10:06.854 183087 DEBUG nova.network.neutron [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Successfully updated port: 5e7c63f6-ad56-4c99-825c-91547862fe78 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 09:10:06 compute-1 nova_compute[183083]: 2026-01-26 09:10:06.868 183087 DEBUG oslo_concurrency.lockutils [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "refresh_cache-caf06a69-97a3-459f-80ee-d3e792033a7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:10:06 compute-1 nova_compute[183083]: 2026-01-26 09:10:06.868 183087 DEBUG oslo_concurrency.lockutils [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquired lock "refresh_cache-caf06a69-97a3-459f-80ee-d3e792033a7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:10:06 compute-1 nova_compute[183083]: 2026-01-26 09:10:06.869 183087 DEBUG nova.network.neutron [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 09:10:06 compute-1 nova_compute[183083]: 2026-01-26 09:10:06.946 183087 DEBUG nova.compute.manager [req-f7dd169a-98ee-47b8-8e82-75651b49a790 req-34471abb-cd70-4ea7-ad4f-017da852c8d0 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Received event network-changed-5e7c63f6-ad56-4c99-825c-91547862fe78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:10:06 compute-1 nova_compute[183083]: 2026-01-26 09:10:06.947 183087 DEBUG nova.compute.manager [req-f7dd169a-98ee-47b8-8e82-75651b49a790 req-34471abb-cd70-4ea7-ad4f-017da852c8d0 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Refreshing instance network info cache due to event network-changed-5e7c63f6-ad56-4c99-825c-91547862fe78. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 09:10:06 compute-1 nova_compute[183083]: 2026-01-26 09:10:06.948 183087 DEBUG oslo_concurrency.lockutils [req-f7dd169a-98ee-47b8-8e82-75651b49a790 req-34471abb-cd70-4ea7-ad4f-017da852c8d0 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-caf06a69-97a3-459f-80ee-d3e792033a7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.049 183087 DEBUG nova.network.neutron [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.764 183087 DEBUG nova.network.neutron [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Updating instance_info_cache with network_info: [{"id": "5e7c63f6-ad56-4c99-825c-91547862fe78", "address": "fa:16:3e:dc:d3:56", "network": {"id": "8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f", "bridge": "br-int", "label": "tempest-test-network--544222820", "subnets": [{"cidr": "10.100.0.64/28", "dns": [], "gateway": {"address": "10.100.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e7c63f6-ad", "ovs_interfaceid": "5e7c63f6-ad56-4c99-825c-91547862fe78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.783 183087 DEBUG oslo_concurrency.lockutils [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Releasing lock "refresh_cache-caf06a69-97a3-459f-80ee-d3e792033a7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.784 183087 DEBUG nova.compute.manager [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Instance network_info: |[{"id": "5e7c63f6-ad56-4c99-825c-91547862fe78", "address": "fa:16:3e:dc:d3:56", "network": {"id": "8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f", "bridge": "br-int", "label": "tempest-test-network--544222820", "subnets": [{"cidr": "10.100.0.64/28", "dns": [], "gateway": {"address": "10.100.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e7c63f6-ad", "ovs_interfaceid": "5e7c63f6-ad56-4c99-825c-91547862fe78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.784 183087 DEBUG oslo_concurrency.lockutils [req-f7dd169a-98ee-47b8-8e82-75651b49a790 req-34471abb-cd70-4ea7-ad4f-017da852c8d0 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-caf06a69-97a3-459f-80ee-d3e792033a7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.784 183087 DEBUG nova.network.neutron [req-f7dd169a-98ee-47b8-8e82-75651b49a790 req-34471abb-cd70-4ea7-ad4f-017da852c8d0 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Refreshing network info cache for port 5e7c63f6-ad56-4c99-825c-91547862fe78 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.787 183087 DEBUG nova.virt.libvirt.driver [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Start _get_guest_xml network_info=[{"id": "5e7c63f6-ad56-4c99-825c-91547862fe78", "address": "fa:16:3e:dc:d3:56", "network": {"id": "8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f", "bridge": "br-int", "label": "tempest-test-network--544222820", "subnets": [{"cidr": "10.100.0.64/28", "dns": [], "gateway": {"address": "10.100.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e7c63f6-ad", "ovs_interfaceid": "5e7c63f6-ad56-4c99-825c-91547862fe78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T08:43:05Z,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aa88264c487a43b2855a58eb0dd042c9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T08:43:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encryption_options': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.795 183087 WARNING nova.virt.libvirt.driver [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.805 183087 DEBUG nova.virt.libvirt.host [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.807 183087 DEBUG nova.virt.libvirt.host [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.813 183087 DEBUG nova.virt.libvirt.host [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.814 183087 DEBUG nova.virt.libvirt.host [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.814 183087 DEBUG nova.virt.libvirt.driver [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.815 183087 DEBUG nova.virt.hardware [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T08:43:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a7017323-cd35-470d-a718-faa6c6e97277',id=2,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T08:43:05Z,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aa88264c487a43b2855a58eb0dd042c9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T08:43:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.815 183087 DEBUG nova.virt.hardware [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.816 183087 DEBUG nova.virt.hardware [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.816 183087 DEBUG nova.virt.hardware [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.816 183087 DEBUG nova.virt.hardware [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.816 183087 DEBUG nova.virt.hardware [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.817 183087 DEBUG nova.virt.hardware [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.817 183087 DEBUG nova.virt.hardware [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.817 183087 DEBUG nova.virt.hardware [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.817 183087 DEBUG nova.virt.hardware [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.818 183087 DEBUG nova.virt.hardware [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.823 183087 DEBUG nova.virt.libvirt.vif [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T09:10:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-server-test-1922732182',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1922732182',id=52,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGl+Gle00pU/jD0+QTgQtaSfZijWNNLY/VOWwBfEgg38ntppIErSiSRh7a0jcapqJMUbQt1KSpnwAxxL7S1JSqd7N3DHSKguH0EvP3E8Ef2t7vqe0lXcTwp8rO9Yk3w5kg==',key_name='tempest-keypair-738008376',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2580bb16c90849c4b5919eb271774a06',ramdisk_id='',reservation_id='r-0skpa0zu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',
image_min_ram='0',network_allocated='True',owner_project_name='tempest-OvnDvrTest-691788706',owner_user_name='tempest-OvnDvrTest-691788706-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T09:10:04Z,user_data=None,user_id='90104736f4ab4d81b09d1ff11e40f454',uuid=caf06a69-97a3-459f-80ee-d3e792033a7d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5e7c63f6-ad56-4c99-825c-91547862fe78", "address": "fa:16:3e:dc:d3:56", "network": {"id": "8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f", "bridge": "br-int", "label": "tempest-test-network--544222820", "subnets": [{"cidr": "10.100.0.64/28", "dns": [], "gateway": {"address": "10.100.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e7c63f6-ad", "ovs_interfaceid": "5e7c63f6-ad56-4c99-825c-91547862fe78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.823 183087 DEBUG nova.network.os_vif_util [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Converting VIF {"id": "5e7c63f6-ad56-4c99-825c-91547862fe78", "address": "fa:16:3e:dc:d3:56", "network": {"id": "8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f", "bridge": "br-int", "label": "tempest-test-network--544222820", "subnets": [{"cidr": "10.100.0.64/28", "dns": [], "gateway": {"address": "10.100.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e7c63f6-ad", "ovs_interfaceid": "5e7c63f6-ad56-4c99-825c-91547862fe78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.824 183087 DEBUG nova.network.os_vif_util [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dc:d3:56,bridge_name='br-int',has_traffic_filtering=True,id=5e7c63f6-ad56-4c99-825c-91547862fe78,network=Network(8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e7c63f6-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.825 183087 DEBUG nova.objects.instance [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lazy-loading 'pci_devices' on Instance uuid caf06a69-97a3-459f-80ee-d3e792033a7d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.844 183087 DEBUG nova.virt.libvirt.driver [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] End _get_guest_xml xml=<domain type="kvm">
Jan 26 09:10:07 compute-1 nova_compute[183083]:   <uuid>caf06a69-97a3-459f-80ee-d3e792033a7d</uuid>
Jan 26 09:10:07 compute-1 nova_compute[183083]:   <name>instance-00000034</name>
Jan 26 09:10:07 compute-1 nova_compute[183083]:   <memory>131072</memory>
Jan 26 09:10:07 compute-1 nova_compute[183083]:   <vcpu>1</vcpu>
Jan 26 09:10:07 compute-1 nova_compute[183083]:   <metadata>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 09:10:07 compute-1 nova_compute[183083]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:       <nova:name>tempest-server-test-1922732182</nova:name>
Jan 26 09:10:07 compute-1 nova_compute[183083]:       <nova:creationTime>2026-01-26 09:10:07</nova:creationTime>
Jan 26 09:10:07 compute-1 nova_compute[183083]:       <nova:flavor name="m1.nano">
Jan 26 09:10:07 compute-1 nova_compute[183083]:         <nova:memory>128</nova:memory>
Jan 26 09:10:07 compute-1 nova_compute[183083]:         <nova:disk>1</nova:disk>
Jan 26 09:10:07 compute-1 nova_compute[183083]:         <nova:swap>0</nova:swap>
Jan 26 09:10:07 compute-1 nova_compute[183083]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 09:10:07 compute-1 nova_compute[183083]:         <nova:vcpus>1</nova:vcpus>
Jan 26 09:10:07 compute-1 nova_compute[183083]:       </nova:flavor>
Jan 26 09:10:07 compute-1 nova_compute[183083]:       <nova:owner>
Jan 26 09:10:07 compute-1 nova_compute[183083]:         <nova:user uuid="90104736f4ab4d81b09d1ff11e40f454">tempest-OvnDvrTest-691788706-project-admin</nova:user>
Jan 26 09:10:07 compute-1 nova_compute[183083]:         <nova:project uuid="2580bb16c90849c4b5919eb271774a06">tempest-OvnDvrTest-691788706</nova:project>
Jan 26 09:10:07 compute-1 nova_compute[183083]:       </nova:owner>
Jan 26 09:10:07 compute-1 nova_compute[183083]:       <nova:root type="image" uuid="13d1a20a-8003-4f19-aba7-ccbd9eff9b82"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:       <nova:ports>
Jan 26 09:10:07 compute-1 nova_compute[183083]:         <nova:port uuid="5e7c63f6-ad56-4c99-825c-91547862fe78">
Jan 26 09:10:07 compute-1 nova_compute[183083]:           <nova:ip type="fixed" address="10.100.0.70" ipVersion="4"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:         </nova:port>
Jan 26 09:10:07 compute-1 nova_compute[183083]:       </nova:ports>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     </nova:instance>
Jan 26 09:10:07 compute-1 nova_compute[183083]:   </metadata>
Jan 26 09:10:07 compute-1 nova_compute[183083]:   <sysinfo type="smbios">
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <system>
Jan 26 09:10:07 compute-1 nova_compute[183083]:       <entry name="manufacturer">RDO</entry>
Jan 26 09:10:07 compute-1 nova_compute[183083]:       <entry name="product">OpenStack Compute</entry>
Jan 26 09:10:07 compute-1 nova_compute[183083]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 09:10:07 compute-1 nova_compute[183083]:       <entry name="serial">caf06a69-97a3-459f-80ee-d3e792033a7d</entry>
Jan 26 09:10:07 compute-1 nova_compute[183083]:       <entry name="uuid">caf06a69-97a3-459f-80ee-d3e792033a7d</entry>
Jan 26 09:10:07 compute-1 nova_compute[183083]:       <entry name="family">Virtual Machine</entry>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     </system>
Jan 26 09:10:07 compute-1 nova_compute[183083]:   </sysinfo>
Jan 26 09:10:07 compute-1 nova_compute[183083]:   <os>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <boot dev="hd"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <smbios mode="sysinfo"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:   </os>
Jan 26 09:10:07 compute-1 nova_compute[183083]:   <features>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <acpi/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <apic/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <vmcoreinfo/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:   </features>
Jan 26 09:10:07 compute-1 nova_compute[183083]:   <clock offset="utc">
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <timer name="hpet" present="no"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:   </clock>
Jan 26 09:10:07 compute-1 nova_compute[183083]:   <cpu mode="host-model" match="exact">
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:   </cpu>
Jan 26 09:10:07 compute-1 nova_compute[183083]:   <devices>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <disk type="file" device="disk">
Jan 26 09:10:07 compute-1 nova_compute[183083]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/caf06a69-97a3-459f-80ee-d3e792033a7d/disk"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:       <target dev="vda" bus="virtio"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     </disk>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <disk type="file" device="cdrom">
Jan 26 09:10:07 compute-1 nova_compute[183083]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/caf06a69-97a3-459f-80ee-d3e792033a7d/disk.config"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:       <target dev="sda" bus="sata"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     </disk>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <interface type="ethernet">
Jan 26 09:10:07 compute-1 nova_compute[183083]:       <mac address="fa:16:3e:dc:d3:56"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:       <mtu size="1342"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:       <target dev="tap5e7c63f6-ad"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     </interface>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <serial type="pty">
Jan 26 09:10:07 compute-1 nova_compute[183083]:       <log file="/var/lib/nova/instances/caf06a69-97a3-459f-80ee-d3e792033a7d/console.log" append="off"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     </serial>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <video>
Jan 26 09:10:07 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     </video>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <input type="tablet" bus="usb"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <rng model="virtio">
Jan 26 09:10:07 compute-1 nova_compute[183083]:       <backend model="random">/dev/urandom</backend>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     </rng>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <controller type="usb" index="0"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     <memballoon model="virtio">
Jan 26 09:10:07 compute-1 nova_compute[183083]:       <stats period="10"/>
Jan 26 09:10:07 compute-1 nova_compute[183083]:     </memballoon>
Jan 26 09:10:07 compute-1 nova_compute[183083]:   </devices>
Jan 26 09:10:07 compute-1 nova_compute[183083]: </domain>
Jan 26 09:10:07 compute-1 nova_compute[183083]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.846 183087 DEBUG nova.compute.manager [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Preparing to wait for external event network-vif-plugged-5e7c63f6-ad56-4c99-825c-91547862fe78 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.847 183087 DEBUG oslo_concurrency.lockutils [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "caf06a69-97a3-459f-80ee-d3e792033a7d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.848 183087 DEBUG oslo_concurrency.lockutils [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "caf06a69-97a3-459f-80ee-d3e792033a7d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.848 183087 DEBUG oslo_concurrency.lockutils [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "caf06a69-97a3-459f-80ee-d3e792033a7d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.849 183087 DEBUG nova.virt.libvirt.vif [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T09:10:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-server-test-1922732182',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1922732182',id=52,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGl+Gle00pU/jD0+QTgQtaSfZijWNNLY/VOWwBfEgg38ntppIErSiSRh7a0jcapqJMUbQt1KSpnwAxxL7S1JSqd7N3DHSKguH0EvP3E8Ef2t7vqe0lXcTwp8rO9Yk3w5kg==',key_name='tempest-keypair-738008376',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2580bb16c90849c4b5919eb271774a06',ramdisk_id='',reservation_id='r-0skpa0zu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-OvnDvrTest-691788706',owner_user_name='tempest-OvnDvrTest-691788706-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T09:10:04Z,user_data=None,user_id='90104736f4ab4d81b09d1ff11e40f454',uuid=caf06a69-97a3-459f-80ee-d3e792033a7d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5e7c63f6-ad56-4c99-825c-91547862fe78", "address": "fa:16:3e:dc:d3:56", "network": {"id": "8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f", "bridge": "br-int", "label": "tempest-test-network--544222820", "subnets": [{"cidr": "10.100.0.64/28", "dns": [], "gateway": {"address": "10.100.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e7c63f6-ad", "ovs_interfaceid": "5e7c63f6-ad56-4c99-825c-91547862fe78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.850 183087 DEBUG nova.network.os_vif_util [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Converting VIF {"id": "5e7c63f6-ad56-4c99-825c-91547862fe78", "address": "fa:16:3e:dc:d3:56", "network": {"id": "8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f", "bridge": "br-int", "label": "tempest-test-network--544222820", "subnets": [{"cidr": "10.100.0.64/28", "dns": [], "gateway": {"address": "10.100.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e7c63f6-ad", "ovs_interfaceid": "5e7c63f6-ad56-4c99-825c-91547862fe78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.851 183087 DEBUG nova.network.os_vif_util [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dc:d3:56,bridge_name='br-int',has_traffic_filtering=True,id=5e7c63f6-ad56-4c99-825c-91547862fe78,network=Network(8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e7c63f6-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.852 183087 DEBUG os_vif [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:dc:d3:56,bridge_name='br-int',has_traffic_filtering=True,id=5e7c63f6-ad56-4c99-825c-91547862fe78,network=Network(8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e7c63f6-ad') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.853 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.854 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.855 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.861 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.861 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5e7c63f6-ad, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.862 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5e7c63f6-ad, col_values=(('external_ids', {'iface-id': '5e7c63f6-ad56-4c99-825c-91547862fe78', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:dc:d3:56', 'vm-uuid': 'caf06a69-97a3-459f-80ee-d3e792033a7d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.864 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:07 compute-1 NetworkManager[55451]: <info>  [1769418607.8658] manager: (tap5e7c63f6-ad): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/109)
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.867 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.873 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.875 183087 INFO os_vif [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:dc:d3:56,bridge_name='br-int',has_traffic_filtering=True,id=5e7c63f6-ad56-4c99-825c-91547862fe78,network=Network(8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e7c63f6-ad')
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.926 183087 DEBUG nova.virt.libvirt.driver [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.926 183087 DEBUG nova.virt.libvirt.driver [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.927 183087 DEBUG nova.virt.libvirt.driver [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] No VIF found with MAC fa:16:3e:dc:d3:56, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 26 09:10:07 compute-1 nova_compute[183083]: 2026-01-26 09:10:07.927 183087 INFO nova.virt.libvirt.driver [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Using config drive
Jan 26 09:10:08 compute-1 nova_compute[183083]: 2026-01-26 09:10:08.244 183087 INFO nova.virt.libvirt.driver [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Creating config drive at /var/lib/nova/instances/caf06a69-97a3-459f-80ee-d3e792033a7d/disk.config
Jan 26 09:10:08 compute-1 nova_compute[183083]: 2026-01-26 09:10:08.248 183087 DEBUG oslo_concurrency.processutils [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/caf06a69-97a3-459f-80ee-d3e792033a7d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc8j1exkf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:10:08 compute-1 nova_compute[183083]: 2026-01-26 09:10:08.375 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:08 compute-1 nova_compute[183083]: 2026-01-26 09:10:08.377 183087 DEBUG oslo_concurrency.processutils [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/caf06a69-97a3-459f-80ee-d3e792033a7d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc8j1exkf" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:10:08 compute-1 kernel: tap5e7c63f6-ad: entered promiscuous mode
Jan 26 09:10:08 compute-1 NetworkManager[55451]: <info>  [1769418608.4493] manager: (tap5e7c63f6-ad): new Tun device (/org/freedesktop/NetworkManager/Devices/110)
Jan 26 09:10:08 compute-1 ovn_controller[95352]: 2026-01-26T09:10:08Z|00321|binding|INFO|Claiming lport 5e7c63f6-ad56-4c99-825c-91547862fe78 for this chassis.
Jan 26 09:10:08 compute-1 ovn_controller[95352]: 2026-01-26T09:10:08Z|00322|binding|INFO|5e7c63f6-ad56-4c99-825c-91547862fe78: Claiming fa:16:3e:dc:d3:56 10.100.0.70
Jan 26 09:10:08 compute-1 nova_compute[183083]: 2026-01-26 09:10:08.450 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:08 compute-1 nova_compute[183083]: 2026-01-26 09:10:08.463 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:08 compute-1 ovn_controller[95352]: 2026-01-26T09:10:08Z|00323|binding|INFO|Setting lport 5e7c63f6-ad56-4c99-825c-91547862fe78 ovn-installed in OVS
Jan 26 09:10:08 compute-1 ovn_controller[95352]: 2026-01-26T09:10:08Z|00324|binding|INFO|Setting lport 5e7c63f6-ad56-4c99-825c-91547862fe78 up in Southbound
Jan 26 09:10:08 compute-1 nova_compute[183083]: 2026-01-26 09:10:08.465 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:08.463 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dc:d3:56 10.100.0.70'], port_security=['fa:16:3e:dc:d3:56 10.100.0.70'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.70/28', 'neutron:device_id': 'caf06a69-97a3-459f-80ee-d3e792033a7d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2580bb16c90849c4b5919eb271774a06', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0e060529-90c0-4dee-9e6a-df027c8d1133', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0689409-2761-4236-9870-af8c2d36ab81, chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=5e7c63f6-ad56-4c99-825c-91547862fe78) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:10:08 compute-1 nova_compute[183083]: 2026-01-26 09:10:08.477 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:08.478 104632 INFO neutron.agent.ovn.metadata.agent [-] Port 5e7c63f6-ad56-4c99-825c-91547862fe78 in datapath 8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f bound to our chassis
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:08.481 104632 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:08.493 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[c2548378-8636-40ed-ac9e-5cc1a8fee9c9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:08.494 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8c0c7436-51 in ovnmeta-8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:08.496 212483 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8c0c7436-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:08.496 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[ccd6d910-cc42-4f95-872d-d4f48af2aa8b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:08.498 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[58924de1-6d7a-402a-a34d-3d2a481b1190]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:10:08 compute-1 systemd-udevd[225733]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 09:10:08 compute-1 systemd-machined[154360]: New machine qemu-20-instance-00000034.
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:08.511 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[4e18919d-5d50-41a4-8110-b1a9bc1b4831]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:10:08 compute-1 systemd[1]: Started Virtual Machine qemu-20-instance-00000034.
Jan 26 09:10:08 compute-1 NetworkManager[55451]: <info>  [1769418608.5208] device (tap5e7c63f6-ad): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 09:10:08 compute-1 NetworkManager[55451]: <info>  [1769418608.5223] device (tap5e7c63f6-ad): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:08.540 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[5ed5e844-6497-4dcc-bfd7-2a7d4db471a7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:08.566 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[5c601443-ddaa-419d-bc4c-9beae539c26b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:08.572 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[368fa9f2-e866-495e-ab3a-6d89065eecb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:10:08 compute-1 NetworkManager[55451]: <info>  [1769418608.5738] manager: (tap8c0c7436-50): new Veth device (/org/freedesktop/NetworkManager/Devices/111)
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:08.606 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[d50449ec-381d-48e8-b600-fc249e065e65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:08.609 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[04fb8e89-a10c-4292-aa9d-2b29dacec247]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:10:08 compute-1 NetworkManager[55451]: <info>  [1769418608.6353] device (tap8c0c7436-50): carrier: link connected
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:08.643 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[38a18f4a-572d-4c29-8c5c-0d666eeb3513]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:08.661 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[1f27d6e5-42b6-4e93-a5e0-9e1fc0eb7c18]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8c0c7436-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c5:17:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494924, 'reachable_time': 21517, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225765, 'error': None, 'target': 'ovnmeta-8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:08.675 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[3efeff25-f5a0-4b11-b301-ef06d7b3a511]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec5:17a4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 494924, 'tstamp': 494924}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225766, 'error': None, 'target': 'ovnmeta-8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:10:08 compute-1 nova_compute[183083]: 2026-01-26 09:10:08.687 183087 DEBUG nova.compute.manager [req-b40b0e8a-4814-479a-9a93-cbd2e7483f32 req-dd8b8423-c7e4-4497-904c-68190932fe2d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Received event network-vif-plugged-5e7c63f6-ad56-4c99-825c-91547862fe78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:10:08 compute-1 nova_compute[183083]: 2026-01-26 09:10:08.688 183087 DEBUG oslo_concurrency.lockutils [req-b40b0e8a-4814-479a-9a93-cbd2e7483f32 req-dd8b8423-c7e4-4497-904c-68190932fe2d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "caf06a69-97a3-459f-80ee-d3e792033a7d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:10:08 compute-1 nova_compute[183083]: 2026-01-26 09:10:08.688 183087 DEBUG oslo_concurrency.lockutils [req-b40b0e8a-4814-479a-9a93-cbd2e7483f32 req-dd8b8423-c7e4-4497-904c-68190932fe2d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "caf06a69-97a3-459f-80ee-d3e792033a7d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:10:08 compute-1 nova_compute[183083]: 2026-01-26 09:10:08.689 183087 DEBUG oslo_concurrency.lockutils [req-b40b0e8a-4814-479a-9a93-cbd2e7483f32 req-dd8b8423-c7e4-4497-904c-68190932fe2d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "caf06a69-97a3-459f-80ee-d3e792033a7d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:10:08 compute-1 nova_compute[183083]: 2026-01-26 09:10:08.689 183087 DEBUG nova.compute.manager [req-b40b0e8a-4814-479a-9a93-cbd2e7483f32 req-dd8b8423-c7e4-4497-904c-68190932fe2d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Processing event network-vif-plugged-5e7c63f6-ad56-4c99-825c-91547862fe78 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:08.691 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[b9e4f0a0-67ce-412d-950a-ff728cd10fbf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8c0c7436-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c5:17:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494924, 'reachable_time': 21517, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225767, 'error': None, 'target': 'ovnmeta-8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:08.736 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[83203ae6-0f42-477e-bbc8-eb6ca642ac15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:08.798 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[ad6ac997-006a-45ec-b240-5c01b2d06a56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:08.800 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8c0c7436-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:08.800 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:08.800 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8c0c7436-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:10:08 compute-1 nova_compute[183083]: 2026-01-26 09:10:08.802 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:08 compute-1 NetworkManager[55451]: <info>  [1769418608.8031] manager: (tap8c0c7436-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/112)
Jan 26 09:10:08 compute-1 kernel: tap8c0c7436-50: entered promiscuous mode
Jan 26 09:10:08 compute-1 nova_compute[183083]: 2026-01-26 09:10:08.805 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:08.806 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8c0c7436-50, col_values=(('external_ids', {'iface-id': 'fa97bed8-1bbd-479c-9b4e-e285941642c8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:10:08 compute-1 ovn_controller[95352]: 2026-01-26T09:10:08Z|00325|binding|INFO|Releasing lport fa97bed8-1bbd-479c-9b4e-e285941642c8 from this chassis (sb_readonly=0)
Jan 26 09:10:08 compute-1 nova_compute[183083]: 2026-01-26 09:10:08.814 183087 DEBUG nova.network.neutron [req-f7dd169a-98ee-47b8-8e82-75651b49a790 req-34471abb-cd70-4ea7-ad4f-017da852c8d0 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Updated VIF entry in instance network info cache for port 5e7c63f6-ad56-4c99-825c-91547862fe78. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 09:10:08 compute-1 nova_compute[183083]: 2026-01-26 09:10:08.814 183087 DEBUG nova.network.neutron [req-f7dd169a-98ee-47b8-8e82-75651b49a790 req-34471abb-cd70-4ea7-ad4f-017da852c8d0 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Updating instance_info_cache with network_info: [{"id": "5e7c63f6-ad56-4c99-825c-91547862fe78", "address": "fa:16:3e:dc:d3:56", "network": {"id": "8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f", "bridge": "br-int", "label": "tempest-test-network--544222820", "subnets": [{"cidr": "10.100.0.64/28", "dns": [], "gateway": {"address": "10.100.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e7c63f6-ad", "ovs_interfaceid": "5e7c63f6-ad56-4c99-825c-91547862fe78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:10:08 compute-1 nova_compute[183083]: 2026-01-26 09:10:08.825 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:08.827 104632 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:08.828 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[c33cdfff-1a73-465c-81f7-8edb7bdaa484]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:08.828 104632 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]: global
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]:     log         /dev/log local0 debug
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]:     log-tag     haproxy-metadata-proxy-8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]:     user        root
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]:     group       root
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]:     maxconn     1024
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]:     pidfile     /var/lib/neutron/external/pids/8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f.pid.haproxy
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]:     daemon
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]: 
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]: defaults
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]:     log global
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]:     mode http
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]:     option httplog
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]:     option dontlognull
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]:     option http-server-close
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]:     option forwardfor
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]:     retries                 3
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]:     timeout http-request    30s
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]:     timeout connect         30s
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]:     timeout client          32s
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]:     timeout server          32s
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]:     timeout http-keep-alive 30s
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]: 
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]: 
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]: listen listener
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]:     bind 169.254.169.254:80
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]:     http-request add-header X-OVN-Network-ID 8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 26 09:10:08 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:08.829 104632 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f', 'env', 'PROCESS_TAG=haproxy-8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 26 09:10:08 compute-1 nova_compute[183083]: 2026-01-26 09:10:08.830 183087 DEBUG oslo_concurrency.lockutils [req-f7dd169a-98ee-47b8-8e82-75651b49a790 req-34471abb-cd70-4ea7-ad4f-017da852c8d0 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-caf06a69-97a3-459f-80ee-d3e792033a7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:10:08 compute-1 nova_compute[183083]: 2026-01-26 09:10:08.938 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769418608.937786, caf06a69-97a3-459f-80ee-d3e792033a7d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 09:10:08 compute-1 nova_compute[183083]: 2026-01-26 09:10:08.939 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] VM Started (Lifecycle Event)
Jan 26 09:10:08 compute-1 nova_compute[183083]: 2026-01-26 09:10:08.941 183087 DEBUG nova.compute.manager [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 26 09:10:08 compute-1 nova_compute[183083]: 2026-01-26 09:10:08.944 183087 DEBUG nova.virt.libvirt.driver [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 26 09:10:08 compute-1 nova_compute[183083]: 2026-01-26 09:10:08.947 183087 INFO nova.virt.libvirt.driver [-] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Instance spawned successfully.
Jan 26 09:10:08 compute-1 nova_compute[183083]: 2026-01-26 09:10:08.947 183087 DEBUG nova.virt.libvirt.driver [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 26 09:10:08 compute-1 nova_compute[183083]: 2026-01-26 09:10:08.965 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:10:08 compute-1 nova_compute[183083]: 2026-01-26 09:10:08.970 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 09:10:08 compute-1 nova_compute[183083]: 2026-01-26 09:10:08.974 183087 DEBUG nova.virt.libvirt.driver [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 09:10:08 compute-1 nova_compute[183083]: 2026-01-26 09:10:08.974 183087 DEBUG nova.virt.libvirt.driver [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 09:10:08 compute-1 nova_compute[183083]: 2026-01-26 09:10:08.975 183087 DEBUG nova.virt.libvirt.driver [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 09:10:08 compute-1 nova_compute[183083]: 2026-01-26 09:10:08.975 183087 DEBUG nova.virt.libvirt.driver [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 09:10:08 compute-1 nova_compute[183083]: 2026-01-26 09:10:08.976 183087 DEBUG nova.virt.libvirt.driver [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 09:10:08 compute-1 nova_compute[183083]: 2026-01-26 09:10:08.977 183087 DEBUG nova.virt.libvirt.driver [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 09:10:09 compute-1 nova_compute[183083]: 2026-01-26 09:10:09.007 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 09:10:09 compute-1 nova_compute[183083]: 2026-01-26 09:10:09.008 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769418608.9379327, caf06a69-97a3-459f-80ee-d3e792033a7d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 09:10:09 compute-1 nova_compute[183083]: 2026-01-26 09:10:09.008 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] VM Paused (Lifecycle Event)
Jan 26 09:10:09 compute-1 nova_compute[183083]: 2026-01-26 09:10:09.038 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:10:09 compute-1 nova_compute[183083]: 2026-01-26 09:10:09.041 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769418608.943734, caf06a69-97a3-459f-80ee-d3e792033a7d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 09:10:09 compute-1 nova_compute[183083]: 2026-01-26 09:10:09.042 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] VM Resumed (Lifecycle Event)
Jan 26 09:10:09 compute-1 nova_compute[183083]: 2026-01-26 09:10:09.051 183087 INFO nova.compute.manager [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Took 4.03 seconds to spawn the instance on the hypervisor.
Jan 26 09:10:09 compute-1 nova_compute[183083]: 2026-01-26 09:10:09.052 183087 DEBUG nova.compute.manager [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:10:09 compute-1 nova_compute[183083]: 2026-01-26 09:10:09.065 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:10:09 compute-1 nova_compute[183083]: 2026-01-26 09:10:09.068 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 09:10:09 compute-1 nova_compute[183083]: 2026-01-26 09:10:09.096 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 09:10:09 compute-1 nova_compute[183083]: 2026-01-26 09:10:09.118 183087 INFO nova.compute.manager [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Took 4.56 seconds to build instance.
Jan 26 09:10:09 compute-1 nova_compute[183083]: 2026-01-26 09:10:09.140 183087 DEBUG oslo_concurrency.lockutils [None req-93f196c9-232a-4a9f-889f-9aa802586398 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "caf06a69-97a3-459f-80ee-d3e792033a7d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.644s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:10:09 compute-1 podman[225803]: 2026-01-26 09:10:09.212438595 +0000 UTC m=+0.049624631 container create 4267aeca950b4fa868a345b7548c5c42e0037b0f013e47394fb19e38d11d783d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:10:09 compute-1 systemd[1]: Started libpod-conmon-4267aeca950b4fa868a345b7548c5c42e0037b0f013e47394fb19e38d11d783d.scope.
Jan 26 09:10:09 compute-1 systemd[1]: Started libcrun container.
Jan 26 09:10:09 compute-1 podman[225803]: 2026-01-26 09:10:09.191192982 +0000 UTC m=+0.028379038 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 09:10:09 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8419451fe5ffba03735a0bd5767731c14058222756d1d7084e15ce65ef81052b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 09:10:09 compute-1 podman[225803]: 2026-01-26 09:10:09.304574452 +0000 UTC m=+0.141760518 container init 4267aeca950b4fa868a345b7548c5c42e0037b0f013e47394fb19e38d11d783d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 26 09:10:09 compute-1 podman[225803]: 2026-01-26 09:10:09.314295919 +0000 UTC m=+0.151481955 container start 4267aeca950b4fa868a345b7548c5c42e0037b0f013e47394fb19e38d11d783d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 26 09:10:09 compute-1 neutron-haproxy-ovnmeta-8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f[225818]: [NOTICE]   (225822) : New worker (225824) forked
Jan 26 09:10:09 compute-1 neutron-haproxy-ovnmeta-8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f[225818]: [NOTICE]   (225822) : Loading success.
Jan 26 09:10:10 compute-1 nova_compute[183083]: 2026-01-26 09:10:10.770 183087 DEBUG nova.compute.manager [req-4ef53f5e-9c76-429f-80cd-d0626fb8f4ac req-89243bf5-4c66-4a8e-aa6e-3156da12ffb1 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Received event network-vif-plugged-5e7c63f6-ad56-4c99-825c-91547862fe78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:10:10 compute-1 nova_compute[183083]: 2026-01-26 09:10:10.772 183087 DEBUG oslo_concurrency.lockutils [req-4ef53f5e-9c76-429f-80cd-d0626fb8f4ac req-89243bf5-4c66-4a8e-aa6e-3156da12ffb1 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "caf06a69-97a3-459f-80ee-d3e792033a7d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:10:10 compute-1 nova_compute[183083]: 2026-01-26 09:10:10.773 183087 DEBUG oslo_concurrency.lockutils [req-4ef53f5e-9c76-429f-80cd-d0626fb8f4ac req-89243bf5-4c66-4a8e-aa6e-3156da12ffb1 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "caf06a69-97a3-459f-80ee-d3e792033a7d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:10:10 compute-1 nova_compute[183083]: 2026-01-26 09:10:10.773 183087 DEBUG oslo_concurrency.lockutils [req-4ef53f5e-9c76-429f-80cd-d0626fb8f4ac req-89243bf5-4c66-4a8e-aa6e-3156da12ffb1 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "caf06a69-97a3-459f-80ee-d3e792033a7d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:10:10 compute-1 nova_compute[183083]: 2026-01-26 09:10:10.773 183087 DEBUG nova.compute.manager [req-4ef53f5e-9c76-429f-80cd-d0626fb8f4ac req-89243bf5-4c66-4a8e-aa6e-3156da12ffb1 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] No waiting events found dispatching network-vif-plugged-5e7c63f6-ad56-4c99-825c-91547862fe78 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 09:10:10 compute-1 nova_compute[183083]: 2026-01-26 09:10:10.773 183087 WARNING nova.compute.manager [req-4ef53f5e-9c76-429f-80cd-d0626fb8f4ac req-89243bf5-4c66-4a8e-aa6e-3156da12ffb1 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Received unexpected event network-vif-plugged-5e7c63f6-ad56-4c99-825c-91547862fe78 for instance with vm_state active and task_state None.
Jan 26 09:10:12 compute-1 nova_compute[183083]: 2026-01-26 09:10:12.866 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:13 compute-1 nova_compute[183083]: 2026-01-26 09:10:13.416 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:13 compute-1 podman[225833]: 2026-01-26 09:10:13.837391581 +0000 UTC m=+0.094113733 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 26 09:10:14 compute-1 nova_compute[183083]: 2026-01-26 09:10:14.720 183087 DEBUG nova.compute.manager [req-6ccfa997-6dd3-4488-b664-c7887ad0ae66 req-394b280a-6589-4935-a4eb-84d27ac4dd6d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Received event network-changed-5e7c63f6-ad56-4c99-825c-91547862fe78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:10:14 compute-1 nova_compute[183083]: 2026-01-26 09:10:14.721 183087 DEBUG nova.compute.manager [req-6ccfa997-6dd3-4488-b664-c7887ad0ae66 req-394b280a-6589-4935-a4eb-84d27ac4dd6d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Refreshing instance network info cache due to event network-changed-5e7c63f6-ad56-4c99-825c-91547862fe78. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 09:10:14 compute-1 nova_compute[183083]: 2026-01-26 09:10:14.721 183087 DEBUG oslo_concurrency.lockutils [req-6ccfa997-6dd3-4488-b664-c7887ad0ae66 req-394b280a-6589-4935-a4eb-84d27ac4dd6d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-caf06a69-97a3-459f-80ee-d3e792033a7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:10:14 compute-1 nova_compute[183083]: 2026-01-26 09:10:14.722 183087 DEBUG oslo_concurrency.lockutils [req-6ccfa997-6dd3-4488-b664-c7887ad0ae66 req-394b280a-6589-4935-a4eb-84d27ac4dd6d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-caf06a69-97a3-459f-80ee-d3e792033a7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:10:14 compute-1 nova_compute[183083]: 2026-01-26 09:10:14.722 183087 DEBUG nova.network.neutron [req-6ccfa997-6dd3-4488-b664-c7887ad0ae66 req-394b280a-6589-4935-a4eb-84d27ac4dd6d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Refreshing network info cache for port 5e7c63f6-ad56-4c99-825c-91547862fe78 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 09:10:15 compute-1 sshd-session[225857]: Accepted publickey for zuul from 38.102.83.66 port 40082 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:10:15 compute-1 systemd-logind[788]: New session 114 of user zuul.
Jan 26 09:10:15 compute-1 systemd[1]: Started Session 114 of User zuul.
Jan 26 09:10:15 compute-1 sshd-session[225857]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:10:15 compute-1 sshd-session[225861]: Accepted publickey for zuul from 38.102.83.66 port 40084 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:10:15 compute-1 systemd-logind[788]: New session 115 of user zuul.
Jan 26 09:10:15 compute-1 systemd[1]: Started Session 115 of User zuul.
Jan 26 09:10:15 compute-1 sshd-session[225861]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:10:15 compute-1 sudo[225865]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/mktemp
Jan 26 09:10:15 compute-1 sudo[225865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:10:15 compute-1 sudo[225865]: pam_unix(sudo:session): session closed for user root
Jan 26 09:10:16 compute-1 sudo[225890]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 tcpdump -s0 -Uni eth1 icmp and ether host fa:16:3e:60:61:52 -w /tmp/tmp.13BTVS5XSc
Jan 26 09:10:16 compute-1 sudo[225890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:10:16 compute-1 nova_compute[183083]: 2026-01-26 09:10:16.081 183087 DEBUG nova.network.neutron [req-6ccfa997-6dd3-4488-b664-c7887ad0ae66 req-394b280a-6589-4935-a4eb-84d27ac4dd6d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Updated VIF entry in instance network info cache for port 5e7c63f6-ad56-4c99-825c-91547862fe78. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 09:10:16 compute-1 nova_compute[183083]: 2026-01-26 09:10:16.082 183087 DEBUG nova.network.neutron [req-6ccfa997-6dd3-4488-b664-c7887ad0ae66 req-394b280a-6589-4935-a4eb-84d27ac4dd6d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Updating instance_info_cache with network_info: [{"id": "5e7c63f6-ad56-4c99-825c-91547862fe78", "address": "fa:16:3e:dc:d3:56", "network": {"id": "8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f", "bridge": "br-int", "label": "tempest-test-network--544222820", "subnets": [{"cidr": "10.100.0.64/28", "dns": [], "gateway": {"address": "10.100.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e7c63f6-ad", "ovs_interfaceid": "5e7c63f6-ad56-4c99-825c-91547862fe78", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:10:16 compute-1 sshd-session[225864]: Connection closed by 38.102.83.66 port 40084
Jan 26 09:10:16 compute-1 sshd-session[225861]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:10:16 compute-1 systemd-logind[788]: Session 115 logged out. Waiting for processes to exit.
Jan 26 09:10:16 compute-1 systemd[1]: session-115.scope: Deactivated successfully.
Jan 26 09:10:16 compute-1 systemd-logind[788]: Removed session 115.
Jan 26 09:10:16 compute-1 nova_compute[183083]: 2026-01-26 09:10:16.129 183087 DEBUG oslo_concurrency.lockutils [req-6ccfa997-6dd3-4488-b664-c7887ad0ae66 req-394b280a-6589-4935-a4eb-84d27ac4dd6d 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-caf06a69-97a3-459f-80ee-d3e792033a7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:10:16 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 26 09:10:17 compute-1 nova_compute[183083]: 2026-01-26 09:10:17.870 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:18 compute-1 nova_compute[183083]: 2026-01-26 09:10:18.419 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:20 compute-1 nova_compute[183083]: 2026-01-26 09:10:20.292 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:10:20 compute-1 nova_compute[183083]: 2026-01-26 09:10:20.294 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:10:20 compute-1 nova_compute[183083]: 2026-01-26 09:10:20.294 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:10:20 compute-1 nova_compute[183083]: 2026-01-26 09:10:20.547 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "refresh_cache-caf06a69-97a3-459f-80ee-d3e792033a7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:10:20 compute-1 nova_compute[183083]: 2026-01-26 09:10:20.548 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquired lock "refresh_cache-caf06a69-97a3-459f-80ee-d3e792033a7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:10:20 compute-1 nova_compute[183083]: 2026-01-26 09:10:20.548 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 26 09:10:20 compute-1 nova_compute[183083]: 2026-01-26 09:10:20.548 183087 DEBUG nova.objects.instance [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lazy-loading 'info_cache' on Instance uuid caf06a69-97a3-459f-80ee-d3e792033a7d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 09:10:21 compute-1 ovn_controller[95352]: 2026-01-26T09:10:21Z|00047|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:dc:d3:56 10.100.0.70
Jan 26 09:10:21 compute-1 ovn_controller[95352]: 2026-01-26T09:10:21Z|00048|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:dc:d3:56 10.100.0.70
Jan 26 09:10:21 compute-1 nova_compute[183083]: 2026-01-26 09:10:21.685 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Updating instance_info_cache with network_info: [{"id": "5e7c63f6-ad56-4c99-825c-91547862fe78", "address": "fa:16:3e:dc:d3:56", "network": {"id": "8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f", "bridge": "br-int", "label": "tempest-test-network--544222820", "subnets": [{"cidr": "10.100.0.64/28", "dns": [], "gateway": {"address": "10.100.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e7c63f6-ad", "ovs_interfaceid": "5e7c63f6-ad56-4c99-825c-91547862fe78", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:10:21 compute-1 nova_compute[183083]: 2026-01-26 09:10:21.704 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Releasing lock "refresh_cache-caf06a69-97a3-459f-80ee-d3e792033a7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:10:21 compute-1 nova_compute[183083]: 2026-01-26 09:10:21.704 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 26 09:10:21 compute-1 nova_compute[183083]: 2026-01-26 09:10:21.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:10:22 compute-1 nova_compute[183083]: 2026-01-26 09:10:22.875 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:23 compute-1 sudo[224784]: pam_unix(sudo:session): session closed for user root
Jan 26 09:10:23 compute-1 nova_compute[183083]: 2026-01-26 09:10:23.420 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:24 compute-1 nova_compute[183083]: 2026-01-26 09:10:24.946 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:10:24 compute-1 nova_compute[183083]: 2026-01-26 09:10:24.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:10:24 compute-1 nova_compute[183083]: 2026-01-26 09:10:24.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:10:24 compute-1 nova_compute[183083]: 2026-01-26 09:10:24.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:10:25 compute-1 nova_compute[183083]: 2026-01-26 09:10:25.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:10:27 compute-1 nova_compute[183083]: 2026-01-26 09:10:27.879 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:27 compute-1 nova_compute[183083]: 2026-01-26 09:10:27.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:10:27 compute-1 nova_compute[183083]: 2026-01-26 09:10:27.951 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:10:28 compute-1 nova_compute[183083]: 2026-01-26 09:10:28.424 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:29 compute-1 podman[225932]: 2026-01-26 09:10:29.835767539 +0000 UTC m=+0.087171777 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 09:10:29 compute-1 podman[225933]: 2026-01-26 09:10:29.840248876 +0000 UTC m=+0.072630674 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 26 09:10:29 compute-1 podman[225944]: 2026-01-26 09:10:29.842252573 +0000 UTC m=+0.075309710 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 09:10:29 compute-1 podman[225931]: 2026-01-26 09:10:29.853936505 +0000 UTC m=+0.109007067 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 26 09:10:29 compute-1 podman[225930]: 2026-01-26 09:10:29.883541946 +0000 UTC m=+0.135010496 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 26 09:10:29 compute-1 nova_compute[183083]: 2026-01-26 09:10:29.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:10:29 compute-1 nova_compute[183083]: 2026-01-26 09:10:29.975 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:10:29 compute-1 nova_compute[183083]: 2026-01-26 09:10:29.976 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:10:29 compute-1 nova_compute[183083]: 2026-01-26 09:10:29.976 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:10:29 compute-1 nova_compute[183083]: 2026-01-26 09:10:29.976 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:10:30 compute-1 nova_compute[183083]: 2026-01-26 09:10:30.064 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caf06a69-97a3-459f-80ee-d3e792033a7d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:10:30 compute-1 nova_compute[183083]: 2026-01-26 09:10:30.163 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caf06a69-97a3-459f-80ee-d3e792033a7d/disk --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:10:30 compute-1 nova_compute[183083]: 2026-01-26 09:10:30.165 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caf06a69-97a3-459f-80ee-d3e792033a7d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:10:30 compute-1 nova_compute[183083]: 2026-01-26 09:10:30.230 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caf06a69-97a3-459f-80ee-d3e792033a7d/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:10:30 compute-1 nova_compute[183083]: 2026-01-26 09:10:30.452 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:10:30 compute-1 nova_compute[183083]: 2026-01-26 09:10:30.455 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13408MB free_disk=113.05505752563477GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:10:30 compute-1 nova_compute[183083]: 2026-01-26 09:10:30.455 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:10:30 compute-1 nova_compute[183083]: 2026-01-26 09:10:30.455 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:10:30 compute-1 nova_compute[183083]: 2026-01-26 09:10:30.528 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Instance caf06a69-97a3-459f-80ee-d3e792033a7d actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 09:10:30 compute-1 nova_compute[183083]: 2026-01-26 09:10:30.529 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:10:30 compute-1 nova_compute[183083]: 2026-01-26 09:10:30.529 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=640MB phys_disk=119GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:10:30 compute-1 nova_compute[183083]: 2026-01-26 09:10:30.551 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Refreshing inventories for resource provider 5203935e-446c-4e03-93fa-4c60d651e045 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 26 09:10:30 compute-1 nova_compute[183083]: 2026-01-26 09:10:30.568 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Updating ProviderTree inventory for provider 5203935e-446c-4e03-93fa-4c60d651e045 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 26 09:10:30 compute-1 nova_compute[183083]: 2026-01-26 09:10:30.568 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Updating inventory in ProviderTree for provider 5203935e-446c-4e03-93fa-4c60d651e045 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 26 09:10:30 compute-1 nova_compute[183083]: 2026-01-26 09:10:30.585 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Refreshing aggregate associations for resource provider 5203935e-446c-4e03-93fa-4c60d651e045, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 26 09:10:30 compute-1 nova_compute[183083]: 2026-01-26 09:10:30.603 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Refreshing trait associations for resource provider 5203935e-446c-4e03-93fa-4c60d651e045, traits: COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE4A,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SVM,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AVX,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX2,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_FMA3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,HW_CPU_X86_SSE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE41,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_BMI,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AESNI,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 26 09:10:30 compute-1 nova_compute[183083]: 2026-01-26 09:10:30.645 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:10:30 compute-1 nova_compute[183083]: 2026-01-26 09:10:30.660 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:10:30 compute-1 nova_compute[183083]: 2026-01-26 09:10:30.688 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:10:30 compute-1 nova_compute[183083]: 2026-01-26 09:10:30.689 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.234s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:10:32 compute-1 nova_compute[183083]: 2026-01-26 09:10:32.883 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:33 compute-1 sshd-session[226038]: Accepted publickey for zuul from 38.102.83.66 port 51974 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:10:33 compute-1 systemd-logind[788]: New session 116 of user zuul.
Jan 26 09:10:33 compute-1 systemd[1]: Started Session 116 of User zuul.
Jan 26 09:10:33 compute-1 sshd-session[226038]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:10:33 compute-1 sudo[226042]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 cat /tmp/tmp.13BTVS5XSc
Jan 26 09:10:33 compute-1 sudo[226042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:10:33 compute-1 sudo[226042]: pam_unix(sudo:session): session closed for user root
Jan 26 09:10:33 compute-1 nova_compute[183083]: 2026-01-26 09:10:33.428 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:34 compute-1 sshd-session[226068]: Invalid user ubuntu from 2.57.122.238 port 57976
Jan 26 09:10:34 compute-1 sshd-session[226068]: Connection closed by invalid user ubuntu 2.57.122.238 port 57976 [preauth]
Jan 26 09:10:34 compute-1 sudo[224982]: pam_unix(sudo:session): session closed for user root
Jan 26 09:10:37 compute-1 nova_compute[183083]: 2026-01-26 09:10:37.885 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:38 compute-1 nova_compute[183083]: 2026-01-26 09:10:38.430 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:39 compute-1 sshd-session[226070]: Accepted publickey for zuul from 38.102.83.66 port 41912 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:10:39 compute-1 systemd-logind[788]: New session 117 of user zuul.
Jan 26 09:10:39 compute-1 systemd[1]: Started Session 117 of User zuul.
Jan 26 09:10:39 compute-1 sshd-session[226070]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:10:39 compute-1 sudo[226074]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 rm -f /tmp/tmp.13BTVS5XSc
Jan 26 09:10:39 compute-1 sudo[226074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:10:39 compute-1 sudo[226074]: pam_unix(sudo:session): session closed for user root
Jan 26 09:10:39 compute-1 sshd-session[226073]: Connection closed by 38.102.83.66 port 41912
Jan 26 09:10:39 compute-1 sshd-session[226070]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:10:39 compute-1 systemd[1]: session-117.scope: Deactivated successfully.
Jan 26 09:10:39 compute-1 systemd-logind[788]: Session 117 logged out. Waiting for processes to exit.
Jan 26 09:10:39 compute-1 systemd-logind[788]: Removed session 117.
Jan 26 09:10:42 compute-1 nova_compute[183083]: 2026-01-26 09:10:42.889 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:43 compute-1 nova_compute[183083]: 2026-01-26 09:10:43.433 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:43 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:43.547 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:10:43 compute-1 nova_compute[183083]: 2026-01-26 09:10:43.547 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:43 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:43.550 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:10:43 compute-1 nova_compute[183083]: 2026-01-26 09:10:43.814 183087 DEBUG nova.compute.manager [req-a96b2420-9713-47e5-8b11-4a2ee81f1426 req-10479a4e-1a54-4c99-ac7b-1e388135dbd1 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Received event network-changed-5e7c63f6-ad56-4c99-825c-91547862fe78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:10:43 compute-1 nova_compute[183083]: 2026-01-26 09:10:43.814 183087 DEBUG nova.compute.manager [req-a96b2420-9713-47e5-8b11-4a2ee81f1426 req-10479a4e-1a54-4c99-ac7b-1e388135dbd1 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Refreshing instance network info cache due to event network-changed-5e7c63f6-ad56-4c99-825c-91547862fe78. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 09:10:43 compute-1 nova_compute[183083]: 2026-01-26 09:10:43.815 183087 DEBUG oslo_concurrency.lockutils [req-a96b2420-9713-47e5-8b11-4a2ee81f1426 req-10479a4e-1a54-4c99-ac7b-1e388135dbd1 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-caf06a69-97a3-459f-80ee-d3e792033a7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:10:43 compute-1 nova_compute[183083]: 2026-01-26 09:10:43.815 183087 DEBUG oslo_concurrency.lockutils [req-a96b2420-9713-47e5-8b11-4a2ee81f1426 req-10479a4e-1a54-4c99-ac7b-1e388135dbd1 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-caf06a69-97a3-459f-80ee-d3e792033a7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:10:43 compute-1 nova_compute[183083]: 2026-01-26 09:10:43.816 183087 DEBUG nova.network.neutron [req-a96b2420-9713-47e5-8b11-4a2ee81f1426 req-10479a4e-1a54-4c99-ac7b-1e388135dbd1 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Refreshing network info cache for port 5e7c63f6-ad56-4c99-825c-91547862fe78 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 09:10:43 compute-1 nova_compute[183083]: 2026-01-26 09:10:43.870 183087 DEBUG oslo_concurrency.lockutils [None req-bb14573a-0767-4b7e-a44e-84c74f28c00e 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "caf06a69-97a3-459f-80ee-d3e792033a7d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:10:43 compute-1 nova_compute[183083]: 2026-01-26 09:10:43.871 183087 DEBUG oslo_concurrency.lockutils [None req-bb14573a-0767-4b7e-a44e-84c74f28c00e 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "caf06a69-97a3-459f-80ee-d3e792033a7d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:10:43 compute-1 nova_compute[183083]: 2026-01-26 09:10:43.871 183087 DEBUG oslo_concurrency.lockutils [None req-bb14573a-0767-4b7e-a44e-84c74f28c00e 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "caf06a69-97a3-459f-80ee-d3e792033a7d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:10:43 compute-1 nova_compute[183083]: 2026-01-26 09:10:43.871 183087 DEBUG oslo_concurrency.lockutils [None req-bb14573a-0767-4b7e-a44e-84c74f28c00e 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "caf06a69-97a3-459f-80ee-d3e792033a7d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:10:43 compute-1 nova_compute[183083]: 2026-01-26 09:10:43.871 183087 DEBUG oslo_concurrency.lockutils [None req-bb14573a-0767-4b7e-a44e-84c74f28c00e 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "caf06a69-97a3-459f-80ee-d3e792033a7d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:10:43 compute-1 nova_compute[183083]: 2026-01-26 09:10:43.872 183087 INFO nova.compute.manager [None req-bb14573a-0767-4b7e-a44e-84c74f28c00e 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Terminating instance
Jan 26 09:10:43 compute-1 nova_compute[183083]: 2026-01-26 09:10:43.873 183087 DEBUG nova.compute.manager [None req-bb14573a-0767-4b7e-a44e-84c74f28c00e 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 09:10:43 compute-1 kernel: tap5e7c63f6-ad (unregistering): left promiscuous mode
Jan 26 09:10:43 compute-1 NetworkManager[55451]: <info>  [1769418643.8925] device (tap5e7c63f6-ad): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 09:10:43 compute-1 ovn_controller[95352]: 2026-01-26T09:10:43Z|00326|binding|INFO|Releasing lport 5e7c63f6-ad56-4c99-825c-91547862fe78 from this chassis (sb_readonly=0)
Jan 26 09:10:43 compute-1 ovn_controller[95352]: 2026-01-26T09:10:43Z|00327|binding|INFO|Setting lport 5e7c63f6-ad56-4c99-825c-91547862fe78 down in Southbound
Jan 26 09:10:43 compute-1 ovn_controller[95352]: 2026-01-26T09:10:43Z|00328|binding|INFO|Removing iface tap5e7c63f6-ad ovn-installed in OVS
Jan 26 09:10:43 compute-1 nova_compute[183083]: 2026-01-26 09:10:43.900 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:43 compute-1 nova_compute[183083]: 2026-01-26 09:10:43.902 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:43 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:43.906 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dc:d3:56 10.100.0.70'], port_security=['fa:16:3e:dc:d3:56 10.100.0.70'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.70/28', 'neutron:device_id': 'caf06a69-97a3-459f-80ee-d3e792033a7d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2580bb16c90849c4b5919eb271774a06', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0e060529-90c0-4dee-9e6a-df027c8d1133', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0689409-2761-4236-9870-af8c2d36ab81, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=5e7c63f6-ad56-4c99-825c-91547862fe78) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:10:43 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:43.908 104632 INFO neutron.agent.ovn.metadata.agent [-] Port 5e7c63f6-ad56-4c99-825c-91547862fe78 in datapath 8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f unbound from our chassis
Jan 26 09:10:43 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:43.911 104632 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 26 09:10:43 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:43.914 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[a4cb1bb0-de94-49da-b614-68dc23ac2fe0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:10:43 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:43.915 104632 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f namespace which is not needed anymore
Jan 26 09:10:43 compute-1 nova_compute[183083]: 2026-01-26 09:10:43.918 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:43 compute-1 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000034.scope: Deactivated successfully.
Jan 26 09:10:43 compute-1 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000034.scope: Consumed 12.930s CPU time.
Jan 26 09:10:43 compute-1 systemd-machined[154360]: Machine qemu-20-instance-00000034 terminated.
Jan 26 09:10:43 compute-1 podman[226102]: 2026-01-26 09:10:43.997311594 +0000 UTC m=+0.065858982 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 09:10:44 compute-1 neutron-haproxy-ovnmeta-8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f[225818]: [NOTICE]   (225822) : haproxy version is 2.8.14-c23fe91
Jan 26 09:10:44 compute-1 neutron-haproxy-ovnmeta-8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f[225818]: [NOTICE]   (225822) : path to executable is /usr/sbin/haproxy
Jan 26 09:10:44 compute-1 neutron-haproxy-ovnmeta-8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f[225818]: [WARNING]  (225822) : Exiting Master process...
Jan 26 09:10:44 compute-1 neutron-haproxy-ovnmeta-8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f[225818]: [ALERT]    (225822) : Current worker (225824) exited with code 143 (Terminated)
Jan 26 09:10:44 compute-1 neutron-haproxy-ovnmeta-8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f[225818]: [WARNING]  (225822) : All workers exited. Exiting... (0)
Jan 26 09:10:44 compute-1 systemd[1]: libpod-4267aeca950b4fa868a345b7548c5c42e0037b0f013e47394fb19e38d11d783d.scope: Deactivated successfully.
Jan 26 09:10:44 compute-1 podman[226145]: 2026-01-26 09:10:44.046331926 +0000 UTC m=+0.042776176 container died 4267aeca950b4fa868a345b7548c5c42e0037b0f013e47394fb19e38d11d783d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 26 09:10:44 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4267aeca950b4fa868a345b7548c5c42e0037b0f013e47394fb19e38d11d783d-userdata-shm.mount: Deactivated successfully.
Jan 26 09:10:44 compute-1 systemd[1]: var-lib-containers-storage-overlay-8419451fe5ffba03735a0bd5767731c14058222756d1d7084e15ce65ef81052b-merged.mount: Deactivated successfully.
Jan 26 09:10:44 compute-1 podman[226145]: 2026-01-26 09:10:44.079443867 +0000 UTC m=+0.075888117 container cleanup 4267aeca950b4fa868a345b7548c5c42e0037b0f013e47394fb19e38d11d783d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 09:10:44 compute-1 nova_compute[183083]: 2026-01-26 09:10:44.096 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:44 compute-1 nova_compute[183083]: 2026-01-26 09:10:44.101 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:44 compute-1 systemd[1]: libpod-conmon-4267aeca950b4fa868a345b7548c5c42e0037b0f013e47394fb19e38d11d783d.scope: Deactivated successfully.
Jan 26 09:10:44 compute-1 nova_compute[183083]: 2026-01-26 09:10:44.144 183087 INFO nova.virt.libvirt.driver [-] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Instance destroyed successfully.
Jan 26 09:10:44 compute-1 nova_compute[183083]: 2026-01-26 09:10:44.144 183087 DEBUG nova.objects.instance [None req-bb14573a-0767-4b7e-a44e-84c74f28c00e 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lazy-loading 'resources' on Instance uuid caf06a69-97a3-459f-80ee-d3e792033a7d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 09:10:44 compute-1 podman[226179]: 2026-01-26 09:10:44.148138278 +0000 UTC m=+0.047139820 container remove 4267aeca950b4fa868a345b7548c5c42e0037b0f013e47394fb19e38d11d783d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 09:10:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:44.155 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[2ef6a0ce-b86e-4a5b-b7d8-d1932921736f]: (4, ('Mon Jan 26 09:10:43 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f (4267aeca950b4fa868a345b7548c5c42e0037b0f013e47394fb19e38d11d783d)\n4267aeca950b4fa868a345b7548c5c42e0037b0f013e47394fb19e38d11d783d\nMon Jan 26 09:10:44 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f (4267aeca950b4fa868a345b7548c5c42e0037b0f013e47394fb19e38d11d783d)\n4267aeca950b4fa868a345b7548c5c42e0037b0f013e47394fb19e38d11d783d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:10:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:44.156 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[94170384-7942-48a3-ad5a-8351fec4f654]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:10:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:44.158 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8c0c7436-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:10:44 compute-1 nova_compute[183083]: 2026-01-26 09:10:44.159 183087 DEBUG nova.virt.libvirt.vif [None req-bb14573a-0767-4b7e-a44e-84c74f28c00e 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T09:10:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-server-test-1922732182',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1922732182',id=52,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGl+Gle00pU/jD0+QTgQtaSfZijWNNLY/VOWwBfEgg38ntppIErSiSRh7a0jcapqJMUbQt1KSpnwAxxL7S1JSqd7N3DHSKguH0EvP3E8Ef2t7vqe0lXcTwp8rO9Yk3w5kg==',key_name='tempest-keypair-738008376',keypairs=<?>,launch_index=0,launched_at=2026-01-26T09:10:09Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2580bb16c90849c4b5919eb271774a06',ramdisk_id='',reservation_id='r-0skpa0zu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_
hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-OvnDvrTest-691788706',owner_user_name='tempest-OvnDvrTest-691788706-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T09:10:09Z,user_data=None,user_id='90104736f4ab4d81b09d1ff11e40f454',uuid=caf06a69-97a3-459f-80ee-d3e792033a7d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5e7c63f6-ad56-4c99-825c-91547862fe78", "address": "fa:16:3e:dc:d3:56", "network": {"id": "8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f", "bridge": "br-int", "label": "tempest-test-network--544222820", "subnets": [{"cidr": "10.100.0.64/28", "dns": [], "gateway": {"address": "10.100.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e7c63f6-ad", "ovs_interfaceid": "5e7c63f6-ad56-4c99-825c-91547862fe78", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 09:10:44 compute-1 kernel: tap8c0c7436-50: left promiscuous mode
Jan 26 09:10:44 compute-1 nova_compute[183083]: 2026-01-26 09:10:44.160 183087 DEBUG nova.network.os_vif_util [None req-bb14573a-0767-4b7e-a44e-84c74f28c00e 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Converting VIF {"id": "5e7c63f6-ad56-4c99-825c-91547862fe78", "address": "fa:16:3e:dc:d3:56", "network": {"id": "8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f", "bridge": "br-int", "label": "tempest-test-network--544222820", "subnets": [{"cidr": "10.100.0.64/28", "dns": [], "gateway": {"address": "10.100.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e7c63f6-ad", "ovs_interfaceid": "5e7c63f6-ad56-4c99-825c-91547862fe78", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 09:10:44 compute-1 nova_compute[183083]: 2026-01-26 09:10:44.162 183087 DEBUG nova.network.os_vif_util [None req-bb14573a-0767-4b7e-a44e-84c74f28c00e 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:dc:d3:56,bridge_name='br-int',has_traffic_filtering=True,id=5e7c63f6-ad56-4c99-825c-91547862fe78,network=Network(8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e7c63f6-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 09:10:44 compute-1 nova_compute[183083]: 2026-01-26 09:10:44.163 183087 DEBUG os_vif [None req-bb14573a-0767-4b7e-a44e-84c74f28c00e 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:dc:d3:56,bridge_name='br-int',has_traffic_filtering=True,id=5e7c63f6-ad56-4c99-825c-91547862fe78,network=Network(8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e7c63f6-ad') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 09:10:44 compute-1 nova_compute[183083]: 2026-01-26 09:10:44.166 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:44 compute-1 nova_compute[183083]: 2026-01-26 09:10:44.166 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5e7c63f6-ad, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:10:44 compute-1 nova_compute[183083]: 2026-01-26 09:10:44.167 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:44 compute-1 nova_compute[183083]: 2026-01-26 09:10:44.168 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:10:44 compute-1 nova_compute[183083]: 2026-01-26 09:10:44.174 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:44 compute-1 nova_compute[183083]: 2026-01-26 09:10:44.174 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:44 compute-1 nova_compute[183083]: 2026-01-26 09:10:44.177 183087 INFO os_vif [None req-bb14573a-0767-4b7e-a44e-84c74f28c00e 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:dc:d3:56,bridge_name='br-int',has_traffic_filtering=True,id=5e7c63f6-ad56-4c99-825c-91547862fe78,network=Network(8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e7c63f6-ad')
Jan 26 09:10:44 compute-1 nova_compute[183083]: 2026-01-26 09:10:44.177 183087 INFO nova.virt.libvirt.driver [None req-bb14573a-0767-4b7e-a44e-84c74f28c00e 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Deleting instance files /var/lib/nova/instances/caf06a69-97a3-459f-80ee-d3e792033a7d_del
Jan 26 09:10:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:44.178 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[87079595-35cf-4aee-88cc-52fac5bb8dd6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:10:44 compute-1 nova_compute[183083]: 2026-01-26 09:10:44.178 183087 INFO nova.virt.libvirt.driver [None req-bb14573a-0767-4b7e-a44e-84c74f28c00e 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Deletion of /var/lib/nova/instances/caf06a69-97a3-459f-80ee-d3e792033a7d_del complete
Jan 26 09:10:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:44.200 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[74c8a384-92da-4962-ac51-a0194d49175b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:10:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:44.202 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[1807d34c-10f0-4499-8108-31619d1dbdde]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:10:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:44.219 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[5f912e90-42b8-4edf-9d8c-6dd4ad31c8f6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494916, 'reachable_time': 19564, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226210, 'error': None, 'target': 'ovnmeta-8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:10:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:44.222 105024 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 26 09:10:44 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:44.222 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[5a8cb40e-b435-4936-b406-70bde6a2398e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:10:44 compute-1 systemd[1]: run-netns-ovnmeta\x2d8c0c7436\x2d55b4\x2d42a1\x2d9c9e\x2dac5b4c55e14f.mount: Deactivated successfully.
Jan 26 09:10:44 compute-1 nova_compute[183083]: 2026-01-26 09:10:44.230 183087 INFO nova.compute.manager [None req-bb14573a-0767-4b7e-a44e-84c74f28c00e 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Took 0.36 seconds to destroy the instance on the hypervisor.
Jan 26 09:10:44 compute-1 nova_compute[183083]: 2026-01-26 09:10:44.230 183087 DEBUG oslo.service.loopingcall [None req-bb14573a-0767-4b7e-a44e-84c74f28c00e 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 26 09:10:44 compute-1 nova_compute[183083]: 2026-01-26 09:10:44.230 183087 DEBUG nova.compute.manager [-] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 09:10:44 compute-1 nova_compute[183083]: 2026-01-26 09:10:44.231 183087 DEBUG nova.network.neutron [-] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 09:10:45 compute-1 nova_compute[183083]: 2026-01-26 09:10:45.126 183087 DEBUG nova.network.neutron [-] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:10:45 compute-1 nova_compute[183083]: 2026-01-26 09:10:45.142 183087 INFO nova.compute.manager [-] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Took 0.91 seconds to deallocate network for instance.
Jan 26 09:10:45 compute-1 nova_compute[183083]: 2026-01-26 09:10:45.196 183087 DEBUG oslo_concurrency.lockutils [None req-bb14573a-0767-4b7e-a44e-84c74f28c00e 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:10:45 compute-1 nova_compute[183083]: 2026-01-26 09:10:45.197 183087 DEBUG oslo_concurrency.lockutils [None req-bb14573a-0767-4b7e-a44e-84c74f28c00e 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:10:45 compute-1 nova_compute[183083]: 2026-01-26 09:10:45.264 183087 DEBUG nova.compute.provider_tree [None req-bb14573a-0767-4b7e-a44e-84c74f28c00e 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:10:45 compute-1 nova_compute[183083]: 2026-01-26 09:10:45.278 183087 DEBUG nova.scheduler.client.report [None req-bb14573a-0767-4b7e-a44e-84c74f28c00e 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:10:45 compute-1 nova_compute[183083]: 2026-01-26 09:10:45.298 183087 DEBUG oslo_concurrency.lockutils [None req-bb14573a-0767-4b7e-a44e-84c74f28c00e 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:10:45 compute-1 nova_compute[183083]: 2026-01-26 09:10:45.321 183087 INFO nova.scheduler.client.report [None req-bb14573a-0767-4b7e-a44e-84c74f28c00e 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Deleted allocations for instance caf06a69-97a3-459f-80ee-d3e792033a7d
Jan 26 09:10:45 compute-1 nova_compute[183083]: 2026-01-26 09:10:45.390 183087 DEBUG oslo_concurrency.lockutils [None req-bb14573a-0767-4b7e-a44e-84c74f28c00e 90104736f4ab4d81b09d1ff11e40f454 2580bb16c90849c4b5919eb271774a06 - - default default] Lock "caf06a69-97a3-459f-80ee-d3e792033a7d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.519s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:10:45 compute-1 nova_compute[183083]: 2026-01-26 09:10:45.471 183087 DEBUG nova.network.neutron [req-a96b2420-9713-47e5-8b11-4a2ee81f1426 req-10479a4e-1a54-4c99-ac7b-1e388135dbd1 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Updated VIF entry in instance network info cache for port 5e7c63f6-ad56-4c99-825c-91547862fe78. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 09:10:45 compute-1 nova_compute[183083]: 2026-01-26 09:10:45.472 183087 DEBUG nova.network.neutron [req-a96b2420-9713-47e5-8b11-4a2ee81f1426 req-10479a4e-1a54-4c99-ac7b-1e388135dbd1 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Updating instance_info_cache with network_info: [{"id": "5e7c63f6-ad56-4c99-825c-91547862fe78", "address": "fa:16:3e:dc:d3:56", "network": {"id": "8c0c7436-55b4-42a1-9c9e-ac5b4c55e14f", "bridge": "br-int", "label": "tempest-test-network--544222820", "subnets": [{"cidr": "10.100.0.64/28", "dns": [], "gateway": {"address": "10.100.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21d2dd4efd74429aab05a84f55aaa4f9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e7c63f6-ad", "ovs_interfaceid": "5e7c63f6-ad56-4c99-825c-91547862fe78", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:10:45 compute-1 nova_compute[183083]: 2026-01-26 09:10:45.493 183087 DEBUG oslo_concurrency.lockutils [req-a96b2420-9713-47e5-8b11-4a2ee81f1426 req-10479a4e-1a54-4c99-ac7b-1e388135dbd1 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-caf06a69-97a3-459f-80ee-d3e792033a7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:10:46 compute-1 nova_compute[183083]: 2026-01-26 09:10:46.045 183087 DEBUG nova.compute.manager [req-88ebec2d-b3f7-4786-9c6c-9eb658dd1be1 req-cbe390bf-6525-4b70-a4d3-40c176842f28 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Received event network-vif-unplugged-5e7c63f6-ad56-4c99-825c-91547862fe78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:10:46 compute-1 nova_compute[183083]: 2026-01-26 09:10:46.046 183087 DEBUG oslo_concurrency.lockutils [req-88ebec2d-b3f7-4786-9c6c-9eb658dd1be1 req-cbe390bf-6525-4b70-a4d3-40c176842f28 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "caf06a69-97a3-459f-80ee-d3e792033a7d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:10:46 compute-1 nova_compute[183083]: 2026-01-26 09:10:46.047 183087 DEBUG oslo_concurrency.lockutils [req-88ebec2d-b3f7-4786-9c6c-9eb658dd1be1 req-cbe390bf-6525-4b70-a4d3-40c176842f28 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "caf06a69-97a3-459f-80ee-d3e792033a7d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:10:46 compute-1 nova_compute[183083]: 2026-01-26 09:10:46.047 183087 DEBUG oslo_concurrency.lockutils [req-88ebec2d-b3f7-4786-9c6c-9eb658dd1be1 req-cbe390bf-6525-4b70-a4d3-40c176842f28 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "caf06a69-97a3-459f-80ee-d3e792033a7d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:10:46 compute-1 nova_compute[183083]: 2026-01-26 09:10:46.048 183087 DEBUG nova.compute.manager [req-88ebec2d-b3f7-4786-9c6c-9eb658dd1be1 req-cbe390bf-6525-4b70-a4d3-40c176842f28 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] No waiting events found dispatching network-vif-unplugged-5e7c63f6-ad56-4c99-825c-91547862fe78 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 09:10:46 compute-1 nova_compute[183083]: 2026-01-26 09:10:46.049 183087 WARNING nova.compute.manager [req-88ebec2d-b3f7-4786-9c6c-9eb658dd1be1 req-cbe390bf-6525-4b70-a4d3-40c176842f28 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Received unexpected event network-vif-unplugged-5e7c63f6-ad56-4c99-825c-91547862fe78 for instance with vm_state deleted and task_state None.
Jan 26 09:10:46 compute-1 nova_compute[183083]: 2026-01-26 09:10:46.050 183087 DEBUG nova.compute.manager [req-88ebec2d-b3f7-4786-9c6c-9eb658dd1be1 req-cbe390bf-6525-4b70-a4d3-40c176842f28 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Received event network-vif-plugged-5e7c63f6-ad56-4c99-825c-91547862fe78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:10:46 compute-1 nova_compute[183083]: 2026-01-26 09:10:46.051 183087 DEBUG oslo_concurrency.lockutils [req-88ebec2d-b3f7-4786-9c6c-9eb658dd1be1 req-cbe390bf-6525-4b70-a4d3-40c176842f28 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "caf06a69-97a3-459f-80ee-d3e792033a7d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:10:46 compute-1 nova_compute[183083]: 2026-01-26 09:10:46.051 183087 DEBUG oslo_concurrency.lockutils [req-88ebec2d-b3f7-4786-9c6c-9eb658dd1be1 req-cbe390bf-6525-4b70-a4d3-40c176842f28 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "caf06a69-97a3-459f-80ee-d3e792033a7d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:10:46 compute-1 nova_compute[183083]: 2026-01-26 09:10:46.052 183087 DEBUG oslo_concurrency.lockutils [req-88ebec2d-b3f7-4786-9c6c-9eb658dd1be1 req-cbe390bf-6525-4b70-a4d3-40c176842f28 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "caf06a69-97a3-459f-80ee-d3e792033a7d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:10:46 compute-1 nova_compute[183083]: 2026-01-26 09:10:46.053 183087 DEBUG nova.compute.manager [req-88ebec2d-b3f7-4786-9c6c-9eb658dd1be1 req-cbe390bf-6525-4b70-a4d3-40c176842f28 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] No waiting events found dispatching network-vif-plugged-5e7c63f6-ad56-4c99-825c-91547862fe78 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 09:10:46 compute-1 nova_compute[183083]: 2026-01-26 09:10:46.054 183087 WARNING nova.compute.manager [req-88ebec2d-b3f7-4786-9c6c-9eb658dd1be1 req-cbe390bf-6525-4b70-a4d3-40c176842f28 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Received unexpected event network-vif-plugged-5e7c63f6-ad56-4c99-825c-91547862fe78 for instance with vm_state deleted and task_state None.
Jan 26 09:10:46 compute-1 nova_compute[183083]: 2026-01-26 09:10:46.055 183087 DEBUG nova.compute.manager [req-88ebec2d-b3f7-4786-9c6c-9eb658dd1be1 req-cbe390bf-6525-4b70-a4d3-40c176842f28 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Received event network-vif-deleted-5e7c63f6-ad56-4c99-825c-91547862fe78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:10:48 compute-1 nova_compute[183083]: 2026-01-26 09:10:48.435 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:49 compute-1 nova_compute[183083]: 2026-01-26 09:10:49.190 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:49 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:10:49.553 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:10:53 compute-1 nova_compute[183083]: 2026-01-26 09:10:53.469 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:53 compute-1 nova_compute[183083]: 2026-01-26 09:10:53.887 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:54 compute-1 nova_compute[183083]: 2026-01-26 09:10:54.193 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:58 compute-1 nova_compute[183083]: 2026-01-26 09:10:58.472 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:10:59 compute-1 nova_compute[183083]: 2026-01-26 09:10:59.143 183087 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769418644.1418421, caf06a69-97a3-459f-80ee-d3e792033a7d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 09:10:59 compute-1 nova_compute[183083]: 2026-01-26 09:10:59.144 183087 INFO nova.compute.manager [-] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] VM Stopped (Lifecycle Event)
Jan 26 09:10:59 compute-1 nova_compute[183083]: 2026-01-26 09:10:59.203 183087 DEBUG nova.compute.manager [None req-a7ab3464-e004-4b53-8f0a-8f6ba7385062 - - - - - -] [instance: caf06a69-97a3-459f-80ee-d3e792033a7d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:10:59 compute-1 nova_compute[183083]: 2026-01-26 09:10:59.232 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:11:00 compute-1 podman[226213]: 2026-01-26 09:11:00.857502889 +0000 UTC m=+0.106190287 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 26 09:11:00 compute-1 podman[226216]: 2026-01-26 09:11:00.862174572 +0000 UTC m=+0.102749290 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 09:11:00 compute-1 podman[226214]: 2026-01-26 09:11:00.871955909 +0000 UTC m=+0.115514332 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, maintainer=Red Hat, Inc., architecture=x86_64, release=1755695350, build-date=2025-08-20T13:12:41, version=9.6, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=openstack_network_exporter)
Jan 26 09:11:00 compute-1 podman[226215]: 2026-01-26 09:11:00.872408572 +0000 UTC m=+0.112454765 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 26 09:11:00 compute-1 podman[226212]: 2026-01-26 09:11:00.890889737 +0000 UTC m=+0.143272500 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 26 09:11:01 compute-1 nova_compute[183083]: 2026-01-26 09:11:01.078 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:11:03 compute-1 nova_compute[183083]: 2026-01-26 09:11:03.475 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:11:04 compute-1 nova_compute[183083]: 2026-01-26 09:11:04.238 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:11:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:11:05.326 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:11:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:11:05.327 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:11:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:11:05.327 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:11:07 compute-1 ovn_controller[95352]: 2026-01-26T09:11:07Z|00329|pinctrl|WARN|Dropped 1339 log messages in last 62 seconds (most recently, 7 seconds ago) due to excessive rate
Jan 26 09:11:07 compute-1 ovn_controller[95352]: 2026-01-26T09:11:07Z|00330|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:11:08 compute-1 nova_compute[183083]: 2026-01-26 09:11:08.478 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:11:09 compute-1 nova_compute[183083]: 2026-01-26 09:11:09.239 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:11:13 compute-1 nova_compute[183083]: 2026-01-26 09:11:13.480 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:11:14 compute-1 nova_compute[183083]: 2026-01-26 09:11:14.242 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:11:14 compute-1 podman[226312]: 2026-01-26 09:11:14.822689743 +0000 UTC m=+0.077960486 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 26 09:11:16 compute-1 sshd-session[226337]: Accepted publickey for zuul from 38.102.83.66 port 38856 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:11:16 compute-1 systemd-logind[788]: New session 118 of user zuul.
Jan 26 09:11:16 compute-1 systemd[1]: Started Session 118 of User zuul.
Jan 26 09:11:16 compute-1 sshd-session[226337]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:11:16 compute-1 sshd-session[226341]: Accepted publickey for zuul from 38.102.83.66 port 38866 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:11:16 compute-1 systemd-logind[788]: New session 119 of user zuul.
Jan 26 09:11:16 compute-1 systemd[1]: Started Session 119 of User zuul.
Jan 26 09:11:16 compute-1 sshd-session[226341]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:11:17 compute-1 sudo[226345]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/mktemp
Jan 26 09:11:17 compute-1 sudo[226345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:11:17 compute-1 sudo[226345]: pam_unix(sudo:session): session closed for user root
Jan 26 09:11:17 compute-1 sudo[226370]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 tcpdump -s0 -Uni eth1 icmp and ether host fa:16:3e:a0:d9:18 -w /tmp/tmp.ZLLBzK32n8
Jan 26 09:11:17 compute-1 sudo[226370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:11:17 compute-1 sshd-session[226344]: Connection closed by 38.102.83.66 port 38866
Jan 26 09:11:17 compute-1 sshd-session[226341]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:11:17 compute-1 systemd[1]: session-119.scope: Deactivated successfully.
Jan 26 09:11:17 compute-1 systemd-logind[788]: Session 119 logged out. Waiting for processes to exit.
Jan 26 09:11:17 compute-1 systemd-logind[788]: Removed session 119.
Jan 26 09:11:18 compute-1 nova_compute[183083]: 2026-01-26 09:11:18.482 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:11:19 compute-1 nova_compute[183083]: 2026-01-26 09:11:19.310 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:11:21 compute-1 nova_compute[183083]: 2026-01-26 09:11:21.690 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:11:21 compute-1 nova_compute[183083]: 2026-01-26 09:11:21.691 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:11:21 compute-1 nova_compute[183083]: 2026-01-26 09:11:21.691 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:11:21 compute-1 nova_compute[183083]: 2026-01-26 09:11:21.710 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 09:11:21 compute-1 nova_compute[183083]: 2026-01-26 09:11:21.966 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:11:22 compute-1 nova_compute[183083]: 2026-01-26 09:11:22.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:11:23 compute-1 nova_compute[183083]: 2026-01-26 09:11:23.485 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:11:24 compute-1 nova_compute[183083]: 2026-01-26 09:11:24.313 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:11:24 compute-1 nova_compute[183083]: 2026-01-26 09:11:24.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:11:24 compute-1 nova_compute[183083]: 2026-01-26 09:11:24.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:11:24 compute-1 nova_compute[183083]: 2026-01-26 09:11:24.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:11:26 compute-1 nova_compute[183083]: 2026-01-26 09:11:26.956 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:11:26 compute-1 nova_compute[183083]: 2026-01-26 09:11:26.957 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:11:28 compute-1 nova_compute[183083]: 2026-01-26 09:11:28.492 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:11:29 compute-1 nova_compute[183083]: 2026-01-26 09:11:29.316 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:11:29 compute-1 nova_compute[183083]: 2026-01-26 09:11:29.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:11:29 compute-1 nova_compute[183083]: 2026-01-26 09:11:29.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:11:31 compute-1 podman[226410]: 2026-01-26 09:11:31.8015468 +0000 UTC m=+0.055349293 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 09:11:31 compute-1 podman[226397]: 2026-01-26 09:11:31.813259603 +0000 UTC m=+0.080114367 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 26 09:11:31 compute-1 podman[226399]: 2026-01-26 09:11:31.816429343 +0000 UTC m=+0.070846103 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 26 09:11:31 compute-1 podman[226396]: 2026-01-26 09:11:31.833363284 +0000 UTC m=+0.100893997 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
config_id=ovn_controller, org.label-schema.schema-version=1.0)
Jan 26 09:11:31 compute-1 podman[226398]: 2026-01-26 09:11:31.841068043 +0000 UTC m=+0.091580172 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, release=1755695350, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, managed_by=edpm_ansible, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 26 09:11:31 compute-1 nova_compute[183083]: 2026-01-26 09:11:31.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:11:31 compute-1 nova_compute[183083]: 2026-01-26 09:11:31.984 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:11:31 compute-1 nova_compute[183083]: 2026-01-26 09:11:31.984 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:11:31 compute-1 nova_compute[183083]: 2026-01-26 09:11:31.985 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:11:31 compute-1 nova_compute[183083]: 2026-01-26 09:11:31.985 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:11:32 compute-1 nova_compute[183083]: 2026-01-26 09:11:32.194 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:11:32 compute-1 nova_compute[183083]: 2026-01-26 09:11:32.196 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13620MB free_disk=113.0838737487793GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:11:32 compute-1 nova_compute[183083]: 2026-01-26 09:11:32.196 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:11:32 compute-1 nova_compute[183083]: 2026-01-26 09:11:32.196 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:11:32 compute-1 nova_compute[183083]: 2026-01-26 09:11:32.269 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:11:32 compute-1 nova_compute[183083]: 2026-01-26 09:11:32.269 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:11:32 compute-1 nova_compute[183083]: 2026-01-26 09:11:32.289 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:11:32 compute-1 nova_compute[183083]: 2026-01-26 09:11:32.307 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:11:32 compute-1 nova_compute[183083]: 2026-01-26 09:11:32.333 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:11:32 compute-1 nova_compute[183083]: 2026-01-26 09:11:32.334 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:11:33 compute-1 nova_compute[183083]: 2026-01-26 09:11:33.517 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:11:34 compute-1 nova_compute[183083]: 2026-01-26 09:11:34.319 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:11:38 compute-1 sshd-session[226505]: Accepted publickey for zuul from 38.102.83.66 port 36110 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:11:38 compute-1 systemd-logind[788]: New session 120 of user zuul.
Jan 26 09:11:38 compute-1 systemd[1]: Started Session 120 of User zuul.
Jan 26 09:11:38 compute-1 sshd-session[226505]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:11:38 compute-1 nova_compute[183083]: 2026-01-26 09:11:38.518 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:11:38 compute-1 sudo[226509]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 cat /tmp/tmp.ZLLBzK32n8
Jan 26 09:11:38 compute-1 sudo[226509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:11:38 compute-1 sudo[226509]: pam_unix(sudo:session): session closed for user root
Jan 26 09:11:39 compute-1 nova_compute[183083]: 2026-01-26 09:11:39.320 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:11:43 compute-1 nova_compute[183083]: 2026-01-26 09:11:43.557 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:11:43 compute-1 sshd-session[226535]: Accepted publickey for zuul from 38.102.83.66 port 47494 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:11:43 compute-1 systemd-logind[788]: New session 121 of user zuul.
Jan 26 09:11:43 compute-1 systemd[1]: Started Session 121 of User zuul.
Jan 26 09:11:43 compute-1 sshd-session[226535]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:11:43 compute-1 sudo[226539]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 rm -f /tmp/tmp.ZLLBzK32n8
Jan 26 09:11:43 compute-1 sudo[226539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:11:43 compute-1 sudo[226539]: pam_unix(sudo:session): session closed for user root
Jan 26 09:11:43 compute-1 sshd-session[226538]: Connection closed by 38.102.83.66 port 47494
Jan 26 09:11:43 compute-1 sshd-session[226535]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:11:43 compute-1 systemd[1]: session-121.scope: Deactivated successfully.
Jan 26 09:11:44 compute-1 systemd-logind[788]: Session 121 logged out. Waiting for processes to exit.
Jan 26 09:11:44 compute-1 systemd-logind[788]: Removed session 121.
Jan 26 09:11:44 compute-1 nova_compute[183083]: 2026-01-26 09:11:44.323 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:11:45 compute-1 ovn_controller[95352]: 2026-01-26T09:11:45Z|00331|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 26 09:11:45 compute-1 podman[226565]: 2026-01-26 09:11:45.816359055 +0000 UTC m=+0.081333891 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 26 09:11:46 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:11:46.840 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:11:46 compute-1 nova_compute[183083]: 2026-01-26 09:11:46.841 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:11:46 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:11:46.843 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:11:48 compute-1 nova_compute[183083]: 2026-01-26 09:11:48.589 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:11:49 compute-1 nova_compute[183083]: 2026-01-26 09:11:49.325 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:11:49 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:11:49.845 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:11:50 compute-1 nova_compute[183083]: 2026-01-26 09:11:50.851 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:11:53 compute-1 nova_compute[183083]: 2026-01-26 09:11:53.592 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:11:54 compute-1 nova_compute[183083]: 2026-01-26 09:11:54.328 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:11:58 compute-1 nova_compute[183083]: 2026-01-26 09:11:58.047 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:11:58 compute-1 nova_compute[183083]: 2026-01-26 09:11:58.635 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:11:59 compute-1 nova_compute[183083]: 2026-01-26 09:11:59.330 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:12:02 compute-1 podman[226593]: 2026-01-26 09:12:02.817119042 +0000 UTC m=+0.064290178 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 26 09:12:02 compute-1 podman[226592]: 2026-01-26 09:12:02.825379046 +0000 UTC m=+0.078927023 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, release=1755695350, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 26 09:12:02 compute-1 podman[226599]: 2026-01-26 09:12:02.830020898 +0000 UTC m=+0.067473448 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 09:12:02 compute-1 podman[226591]: 2026-01-26 09:12:02.835503514 +0000 UTC m=+0.092222031 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Jan 26 09:12:02 compute-1 podman[226590]: 2026-01-26 09:12:02.86493423 +0000 UTC m=+0.118247600 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
container_name=ovn_controller, org.label-schema.build-date=20251202)
Jan 26 09:12:03 compute-1 nova_compute[183083]: 2026-01-26 09:12:03.640 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:12:03 compute-1 ovn_controller[95352]: 2026-01-26T09:12:03Z|00332|pinctrl|WARN|Dropped 1237 log messages in last 56 seconds (most recently, 1 seconds ago) due to excessive rate
Jan 26 09:12:03 compute-1 ovn_controller[95352]: 2026-01-26T09:12:03Z|00333|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:12:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:12:03.746 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:12:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:12:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:12:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:12:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:12:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:12:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:12:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:12:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:12:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:12:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:12:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:12:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:12:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:12:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:12:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:12:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:12:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:12:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:12:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:12:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:12:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:12:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:12:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:12:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:12:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:12:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:12:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:12:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:12:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:12:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:12:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:12:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:12:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:12:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:12:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:12:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:12:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:12:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:12:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:12:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:12:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:12:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:12:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:12:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:12:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:12:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:12:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:12:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:12:04 compute-1 nova_compute[183083]: 2026-01-26 09:12:04.332 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:12:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:12:05.327 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:12:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:12:05.328 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:12:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:12:05.328 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:12:08 compute-1 nova_compute[183083]: 2026-01-26 09:12:08.642 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:12:08 compute-1 nova_compute[183083]: 2026-01-26 09:12:08.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:12:08 compute-1 nova_compute[183083]: 2026-01-26 09:12:08.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 26 09:12:08 compute-1 nova_compute[183083]: 2026-01-26 09:12:08.969 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 26 09:12:09 compute-1 nova_compute[183083]: 2026-01-26 09:12:09.333 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:12:13 compute-1 nova_compute[183083]: 2026-01-26 09:12:13.645 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:12:14 compute-1 nova_compute[183083]: 2026-01-26 09:12:14.335 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:12:16 compute-1 sudo[225890]: pam_unix(sudo:session): session closed for user root
Jan 26 09:12:16 compute-1 podman[226691]: 2026-01-26 09:12:16.821348847 +0000 UTC m=+0.079853459 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 26 09:12:18 compute-1 nova_compute[183083]: 2026-01-26 09:12:18.647 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:12:19 compute-1 nova_compute[183083]: 2026-01-26 09:12:19.337 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:12:20 compute-1 nova_compute[183083]: 2026-01-26 09:12:20.969 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:12:20 compute-1 nova_compute[183083]: 2026-01-26 09:12:20.970 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:12:20 compute-1 nova_compute[183083]: 2026-01-26 09:12:20.971 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:12:21 compute-1 nova_compute[183083]: 2026-01-26 09:12:21.011 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 09:12:23 compute-1 nova_compute[183083]: 2026-01-26 09:12:23.651 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:12:24 compute-1 nova_compute[183083]: 2026-01-26 09:12:24.338 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:12:24 compute-1 nova_compute[183083]: 2026-01-26 09:12:24.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:12:25 compute-1 nova_compute[183083]: 2026-01-26 09:12:25.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:12:26 compute-1 nova_compute[183083]: 2026-01-26 09:12:26.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:12:26 compute-1 nova_compute[183083]: 2026-01-26 09:12:26.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:12:27 compute-1 nova_compute[183083]: 2026-01-26 09:12:27.948 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:12:27 compute-1 nova_compute[183083]: 2026-01-26 09:12:27.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:12:28 compute-1 nova_compute[183083]: 2026-01-26 09:12:28.652 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:12:29 compute-1 nova_compute[183083]: 2026-01-26 09:12:29.339 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:12:29 compute-1 nova_compute[183083]: 2026-01-26 09:12:29.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:12:29 compute-1 nova_compute[183083]: 2026-01-26 09:12:29.951 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:12:30 compute-1 sshd-session[226715]: Accepted publickey for zuul from 38.102.83.66 port 37448 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:12:30 compute-1 systemd-logind[788]: New session 122 of user zuul.
Jan 26 09:12:30 compute-1 systemd[1]: Started Session 122 of User zuul.
Jan 26 09:12:30 compute-1 sshd-session[226715]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:12:30 compute-1 sshd-session[226719]: Accepted publickey for zuul from 38.102.83.66 port 37464 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:12:30 compute-1 systemd-logind[788]: New session 123 of user zuul.
Jan 26 09:12:30 compute-1 systemd[1]: Started Session 123 of User zuul.
Jan 26 09:12:30 compute-1 sshd-session[226719]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:12:30 compute-1 sudo[226723]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/mktemp
Jan 26 09:12:30 compute-1 sudo[226723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:12:30 compute-1 sudo[226723]: pam_unix(sudo:session): session closed for user root
Jan 26 09:12:30 compute-1 sudo[226748]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 tcpdump -s0 -Uni eth1 icmp and ether host fa:16:3e:bc:ff:97 -w /tmp/tmp.9ysQTbBUBe
Jan 26 09:12:30 compute-1 sudo[226748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:12:30 compute-1 sshd-session[226722]: Connection closed by 38.102.83.66 port 37464
Jan 26 09:12:30 compute-1 sshd-session[226719]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:12:30 compute-1 systemd[1]: session-123.scope: Deactivated successfully.
Jan 26 09:12:30 compute-1 systemd-logind[788]: Session 123 logged out. Waiting for processes to exit.
Jan 26 09:12:30 compute-1 systemd-logind[788]: Removed session 123.
Jan 26 09:12:30 compute-1 nova_compute[183083]: 2026-01-26 09:12:30.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:12:30 compute-1 nova_compute[183083]: 2026-01-26 09:12:30.953 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 26 09:12:32 compute-1 nova_compute[183083]: 2026-01-26 09:12:32.976 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:12:33 compute-1 nova_compute[183083]: 2026-01-26 09:12:33.010 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:12:33 compute-1 nova_compute[183083]: 2026-01-26 09:12:33.011 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:12:33 compute-1 nova_compute[183083]: 2026-01-26 09:12:33.012 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:12:33 compute-1 nova_compute[183083]: 2026-01-26 09:12:33.012 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:12:33 compute-1 nova_compute[183083]: 2026-01-26 09:12:33.192 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:12:33 compute-1 nova_compute[183083]: 2026-01-26 09:12:33.193 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13650MB free_disk=113.08358764648438GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:12:33 compute-1 nova_compute[183083]: 2026-01-26 09:12:33.193 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:12:33 compute-1 nova_compute[183083]: 2026-01-26 09:12:33.194 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:12:33 compute-1 nova_compute[183083]: 2026-01-26 09:12:33.322 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:12:33 compute-1 nova_compute[183083]: 2026-01-26 09:12:33.323 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:12:33 compute-1 nova_compute[183083]: 2026-01-26 09:12:33.387 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:12:33 compute-1 nova_compute[183083]: 2026-01-26 09:12:33.401 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:12:33 compute-1 nova_compute[183083]: 2026-01-26 09:12:33.403 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:12:33 compute-1 nova_compute[183083]: 2026-01-26 09:12:33.403 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.209s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:12:33 compute-1 nova_compute[183083]: 2026-01-26 09:12:33.655 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:12:33 compute-1 podman[226775]: 2026-01-26 09:12:33.809853006 +0000 UTC m=+0.066552012 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 09:12:33 compute-1 podman[226776]: 2026-01-26 09:12:33.816789433 +0000 UTC m=+0.067604762 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, 
io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, distribution-scope=public, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64)
Jan 26 09:12:33 compute-1 podman[226787]: 2026-01-26 09:12:33.817673678 +0000 UTC m=+0.058499023 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 26 09:12:33 compute-1 podman[226792]: 2026-01-26 09:12:33.835422382 +0000 UTC m=+0.067939561 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 09:12:33 compute-1 podman[226774]: 2026-01-26 09:12:33.842292387 +0000 UTC m=+0.100070023 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, 
config_id=ovn_controller, container_name=ovn_controller)
Jan 26 09:12:34 compute-1 nova_compute[183083]: 2026-01-26 09:12:34.342 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:12:38 compute-1 nova_compute[183083]: 2026-01-26 09:12:38.658 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:12:39 compute-1 nova_compute[183083]: 2026-01-26 09:12:39.394 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:12:39 compute-1 sshd-session[226878]: Accepted publickey for zuul from 38.102.83.66 port 44204 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:12:39 compute-1 systemd-logind[788]: New session 124 of user zuul.
Jan 26 09:12:39 compute-1 systemd[1]: Started Session 124 of User zuul.
Jan 26 09:12:39 compute-1 sshd-session[226878]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:12:39 compute-1 sudo[226882]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 cat /tmp/tmp.9ysQTbBUBe
Jan 26 09:12:39 compute-1 sudo[226882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:12:39 compute-1 sudo[226882]: pam_unix(sudo:session): session closed for user root
Jan 26 09:12:39 compute-1 nova_compute[183083]: 2026-01-26 09:12:39.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:12:43 compute-1 nova_compute[183083]: 2026-01-26 09:12:43.660 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:12:44 compute-1 nova_compute[183083]: 2026-01-26 09:12:44.396 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:12:47 compute-1 podman[226908]: 2026-01-26 09:12:47.809247604 +0000 UTC m=+0.077803594 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 26 09:12:48 compute-1 nova_compute[183083]: 2026-01-26 09:12:48.664 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:12:49 compute-1 nova_compute[183083]: 2026-01-26 09:12:49.399 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:12:49 compute-1 sshd-session[226932]: Accepted publickey for zuul from 38.102.83.66 port 32994 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:12:49 compute-1 systemd-logind[788]: New session 125 of user zuul.
Jan 26 09:12:49 compute-1 systemd[1]: Started Session 125 of User zuul.
Jan 26 09:12:49 compute-1 sshd-session[226932]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:12:49 compute-1 sshd-session[226936]: Accepted publickey for zuul from 38.102.83.66 port 33008 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:12:49 compute-1 systemd-logind[788]: New session 126 of user zuul.
Jan 26 09:12:49 compute-1 systemd[1]: Started Session 126 of User zuul.
Jan 26 09:12:49 compute-1 sshd-session[226936]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:12:50 compute-1 sudo[226940]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/mktemp
Jan 26 09:12:50 compute-1 sudo[226940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:12:50 compute-1 sudo[226940]: pam_unix(sudo:session): session closed for user root
Jan 26 09:12:50 compute-1 sudo[226965]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 tcpdump -s0 -Uni eth1 icmp and ether host fa:16:3e:42:21:6e -w /tmp/tmp.dU6AEbJMkD
Jan 26 09:12:50 compute-1 sudo[226965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:12:50 compute-1 sshd-session[226939]: Connection closed by 38.102.83.66 port 33008
Jan 26 09:12:50 compute-1 sshd-session[226936]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:12:50 compute-1 systemd[1]: session-126.scope: Deactivated successfully.
Jan 26 09:12:50 compute-1 systemd-logind[788]: Session 126 logged out. Waiting for processes to exit.
Jan 26 09:12:50 compute-1 systemd-logind[788]: Removed session 126.
Jan 26 09:12:53 compute-1 sshd-session[226991]: Invalid user validator from 2.57.122.238 port 48524
Jan 26 09:12:53 compute-1 sshd-session[226991]: Connection closed by invalid user validator 2.57.122.238 port 48524 [preauth]
Jan 26 09:12:53 compute-1 nova_compute[183083]: 2026-01-26 09:12:53.668 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:12:54 compute-1 nova_compute[183083]: 2026-01-26 09:12:54.402 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:12:58 compute-1 nova_compute[183083]: 2026-01-26 09:12:58.713 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:12:59 compute-1 sshd-session[226994]: Accepted publickey for zuul from 38.102.83.66 port 49320 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:12:59 compute-1 systemd-logind[788]: New session 127 of user zuul.
Jan 26 09:12:59 compute-1 systemd[1]: Started Session 127 of User zuul.
Jan 26 09:12:59 compute-1 sshd-session[226994]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:12:59 compute-1 sudo[226998]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 cat /tmp/tmp.dU6AEbJMkD
Jan 26 09:12:59 compute-1 sudo[226998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:12:59 compute-1 sudo[226998]: pam_unix(sudo:session): session closed for user root
Jan 26 09:12:59 compute-1 nova_compute[183083]: 2026-01-26 09:12:59.403 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:13:03 compute-1 nova_compute[183083]: 2026-01-26 09:13:03.719 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:13:04 compute-1 nova_compute[183083]: 2026-01-26 09:13:04.407 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:13:04 compute-1 ovn_controller[95352]: 2026-01-26T09:13:04Z|00334|pinctrl|WARN|Dropped 485 log messages in last 61 seconds (most recently, 6 seconds ago) due to excessive rate
Jan 26 09:13:04 compute-1 ovn_controller[95352]: 2026-01-26T09:13:04Z|00335|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:13:04 compute-1 podman[227026]: 2026-01-26 09:13:04.808657503 +0000 UTC m=+0.074255103 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.expose-services=, version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc.)
Jan 26 09:13:04 compute-1 podman[227027]: 2026-01-26 09:13:04.815702702 +0000 UTC m=+0.068505230 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Jan 26 09:13:04 compute-1 podman[227028]: 2026-01-26 09:13:04.818180543 +0000 UTC m=+0.070501448 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 09:13:04 compute-1 podman[227024]: 2026-01-26 09:13:04.825797708 +0000 UTC m=+0.093359494 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, 
org.label-schema.schema-version=1.0, config_id=ovn_controller)
Jan 26 09:13:04 compute-1 podman[227025]: 2026-01-26 09:13:04.84705696 +0000 UTC m=+0.114091121 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 09:13:04 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:13:04.862 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:13:04 compute-1 nova_compute[183083]: 2026-01-26 09:13:04.863 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:13:04 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:13:04.864 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:13:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:13:05.330 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:13:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:13:05.330 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:13:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:13:05.331 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:13:08 compute-1 sshd-session[227127]: Accepted publickey for zuul from 38.102.83.66 port 41572 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:13:08 compute-1 systemd-logind[788]: New session 128 of user zuul.
Jan 26 09:13:08 compute-1 systemd[1]: Started Session 128 of User zuul.
Jan 26 09:13:08 compute-1 sshd-session[227127]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:13:08 compute-1 sshd-session[227131]: Accepted publickey for zuul from 38.102.83.66 port 41576 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:13:08 compute-1 systemd-logind[788]: New session 129 of user zuul.
Jan 26 09:13:08 compute-1 systemd[1]: Started Session 129 of User zuul.
Jan 26 09:13:08 compute-1 sshd-session[227131]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:13:08 compute-1 sudo[227135]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/mktemp
Jan 26 09:13:08 compute-1 nova_compute[183083]: 2026-01-26 09:13:08.720 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:13:08 compute-1 sudo[227135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:13:08 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 09:13:08 compute-1 sudo[227135]: pam_unix(sudo:session): session closed for user root
Jan 26 09:13:08 compute-1 sudo[227161]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 tcpdump -s0 -Uni eth1 icmp and ether host fa:16:3e:42:21:6e -w /tmp/tmp.VSCKsBhst6
Jan 26 09:13:08 compute-1 sudo[227161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:13:08 compute-1 sshd-session[227134]: Connection closed by 38.102.83.66 port 41576
Jan 26 09:13:08 compute-1 sshd-session[227131]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:13:08 compute-1 systemd[1]: session-129.scope: Deactivated successfully.
Jan 26 09:13:08 compute-1 systemd-logind[788]: Session 129 logged out. Waiting for processes to exit.
Jan 26 09:13:08 compute-1 systemd-logind[788]: Removed session 129.
Jan 26 09:13:09 compute-1 nova_compute[183083]: 2026-01-26 09:13:09.411 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:13:11 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:13:11.866 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:13:12 compute-1 ovn_controller[95352]: 2026-01-26T09:13:12Z|00336|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 26 09:13:13 compute-1 nova_compute[183083]: 2026-01-26 09:13:13.721 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:13:14 compute-1 nova_compute[183083]: 2026-01-26 09:13:14.414 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:13:17 compute-1 sudo[226370]: pam_unix(sudo:session): session closed for user root
Jan 26 09:13:18 compute-1 nova_compute[183083]: 2026-01-26 09:13:18.725 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:13:18 compute-1 podman[227187]: 2026-01-26 09:13:18.823918992 +0000 UTC m=+0.086999444 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 09:13:19 compute-1 nova_compute[183083]: 2026-01-26 09:13:19.417 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:13:21 compute-1 nova_compute[183083]: 2026-01-26 09:13:21.964 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:13:21 compute-1 nova_compute[183083]: 2026-01-26 09:13:21.964 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:13:21 compute-1 nova_compute[183083]: 2026-01-26 09:13:21.965 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:13:22 compute-1 nova_compute[183083]: 2026-01-26 09:13:22.058 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 09:13:23 compute-1 nova_compute[183083]: 2026-01-26 09:13:23.725 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:13:24 compute-1 nova_compute[183083]: 2026-01-26 09:13:24.418 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:13:24 compute-1 nova_compute[183083]: 2026-01-26 09:13:24.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:13:24 compute-1 nova_compute[183083]: 2026-01-26 09:13:24.966 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:13:26 compute-1 nova_compute[183083]: 2026-01-26 09:13:26.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:13:26 compute-1 nova_compute[183083]: 2026-01-26 09:13:26.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:13:27 compute-1 nova_compute[183083]: 2026-01-26 09:13:27.947 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:13:28 compute-1 nova_compute[183083]: 2026-01-26 09:13:28.727 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:13:28 compute-1 nova_compute[183083]: 2026-01-26 09:13:28.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:13:29 compute-1 nova_compute[183083]: 2026-01-26 09:13:29.420 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:13:29 compute-1 nova_compute[183083]: 2026-01-26 09:13:29.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:13:30 compute-1 sshd-session[227211]: Accepted publickey for zuul from 38.102.83.66 port 39738 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:13:30 compute-1 systemd-logind[788]: New session 130 of user zuul.
Jan 26 09:13:30 compute-1 systemd[1]: Started Session 130 of User zuul.
Jan 26 09:13:30 compute-1 sshd-session[227211]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:13:30 compute-1 sudo[227215]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 cat /tmp/tmp.VSCKsBhst6
Jan 26 09:13:30 compute-1 sudo[227215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:13:30 compute-1 sudo[227215]: pam_unix(sudo:session): session closed for user root
Jan 26 09:13:31 compute-1 nova_compute[183083]: 2026-01-26 09:13:31.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:13:31 compute-1 nova_compute[183083]: 2026-01-26 09:13:31.953 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:13:33 compute-1 nova_compute[183083]: 2026-01-26 09:13:33.729 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:13:34 compute-1 nova_compute[183083]: 2026-01-26 09:13:34.421 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:13:34 compute-1 nova_compute[183083]: 2026-01-26 09:13:34.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:13:35 compute-1 nova_compute[183083]: 2026-01-26 09:13:35.004 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:13:35 compute-1 nova_compute[183083]: 2026-01-26 09:13:35.004 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:13:35 compute-1 nova_compute[183083]: 2026-01-26 09:13:35.004 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:13:35 compute-1 nova_compute[183083]: 2026-01-26 09:13:35.004 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:13:35 compute-1 nova_compute[183083]: 2026-01-26 09:13:35.166 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:13:35 compute-1 nova_compute[183083]: 2026-01-26 09:13:35.167 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13623MB free_disk=113.08360290527344GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:13:35 compute-1 nova_compute[183083]: 2026-01-26 09:13:35.167 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:13:35 compute-1 nova_compute[183083]: 2026-01-26 09:13:35.167 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:13:35 compute-1 nova_compute[183083]: 2026-01-26 09:13:35.332 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:13:35 compute-1 nova_compute[183083]: 2026-01-26 09:13:35.333 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:13:35 compute-1 nova_compute[183083]: 2026-01-26 09:13:35.367 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:13:35 compute-1 nova_compute[183083]: 2026-01-26 09:13:35.395 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:13:35 compute-1 nova_compute[183083]: 2026-01-26 09:13:35.397 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:13:35 compute-1 nova_compute[183083]: 2026-01-26 09:13:35.397 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.230s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:13:35 compute-1 sshd-session[227241]: Accepted publickey for zuul from 38.102.83.66 port 38240 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:13:35 compute-1 systemd-logind[788]: New session 131 of user zuul.
Jan 26 09:13:35 compute-1 systemd[1]: Started Session 131 of User zuul.
Jan 26 09:13:35 compute-1 sshd-session[227241]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:13:35 compute-1 podman[227245]: 2026-01-26 09:13:35.740198125 +0000 UTC m=+0.085773829 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202)
Jan 26 09:13:35 compute-1 podman[227260]: 2026-01-26 09:13:35.753217194 +0000 UTC m=+0.073651086 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 09:13:35 compute-1 podman[227243]: 2026-01-26 09:13:35.777845691 +0000 UTC m=+0.122931981 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 09:13:35 compute-1 podman[227246]: 2026-01-26 09:13:35.777948234 +0000 UTC m=+0.117648692 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, architecture=x86_64, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_id=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., io.openshift.expose-services=)
Jan 26 09:13:35 compute-1 podman[227247]: 2026-01-26 09:13:35.777985865 +0000 UTC m=+0.102249996 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 26 09:13:35 compute-1 sudo[227337]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 rm -f /tmp/tmp.VSCKsBhst6
Jan 26 09:13:35 compute-1 sudo[227337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:13:35 compute-1 sudo[227337]: pam_unix(sudo:session): session closed for user root
Jan 26 09:13:35 compute-1 sshd-session[227291]: Connection closed by 38.102.83.66 port 38240
Jan 26 09:13:35 compute-1 sshd-session[227241]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:13:35 compute-1 systemd[1]: session-131.scope: Deactivated successfully.
Jan 26 09:13:35 compute-1 systemd-logind[788]: Session 131 logged out. Waiting for processes to exit.
Jan 26 09:13:35 compute-1 systemd-logind[788]: Removed session 131.
Jan 26 09:13:38 compute-1 nova_compute[183083]: 2026-01-26 09:13:38.733 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:13:39 compute-1 nova_compute[183083]: 2026-01-26 09:13:39.423 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:13:42 compute-1 sshd-session[227377]: Accepted publickey for zuul from 38.102.83.66 port 38254 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:13:42 compute-1 systemd-logind[788]: New session 132 of user zuul.
Jan 26 09:13:42 compute-1 systemd[1]: Started Session 132 of User zuul.
Jan 26 09:13:42 compute-1 sshd-session[227377]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:13:43 compute-1 sudo[227381]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 rm -f /tmp/tmp.dU6AEbJMkD
Jan 26 09:13:43 compute-1 sudo[227381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:13:43 compute-1 sudo[227381]: pam_unix(sudo:session): session closed for user root
Jan 26 09:13:43 compute-1 sshd-session[227380]: Connection closed by 38.102.83.66 port 38254
Jan 26 09:13:43 compute-1 sshd-session[227377]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:13:43 compute-1 systemd[1]: session-132.scope: Deactivated successfully.
Jan 26 09:13:43 compute-1 systemd-logind[788]: Session 132 logged out. Waiting for processes to exit.
Jan 26 09:13:43 compute-1 systemd-logind[788]: Removed session 132.
Jan 26 09:13:43 compute-1 nova_compute[183083]: 2026-01-26 09:13:43.737 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:13:44 compute-1 nova_compute[183083]: 2026-01-26 09:13:44.428 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:13:48 compute-1 nova_compute[183083]: 2026-01-26 09:13:48.738 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:13:49 compute-1 nova_compute[183083]: 2026-01-26 09:13:49.430 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:13:49 compute-1 podman[227408]: 2026-01-26 09:13:49.803705969 +0000 UTC m=+0.068211352 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 26 09:13:50 compute-1 sshd-session[227431]: Accepted publickey for zuul from 38.102.83.66 port 35150 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:13:50 compute-1 systemd-logind[788]: New session 133 of user zuul.
Jan 26 09:13:50 compute-1 systemd[1]: Started Session 133 of User zuul.
Jan 26 09:13:50 compute-1 sshd-session[227431]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:13:50 compute-1 sudo[227435]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/timeout 120 rm -f /tmp/tmp.9ysQTbBUBe
Jan 26 09:13:50 compute-1 sudo[227435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:13:50 compute-1 sudo[227435]: pam_unix(sudo:session): session closed for user root
Jan 26 09:13:50 compute-1 sshd-session[227434]: Connection closed by 38.102.83.66 port 35150
Jan 26 09:13:50 compute-1 sshd-session[227431]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:13:50 compute-1 systemd[1]: session-133.scope: Deactivated successfully.
Jan 26 09:13:50 compute-1 systemd-logind[788]: Session 133 logged out. Waiting for processes to exit.
Jan 26 09:13:50 compute-1 systemd-logind[788]: Removed session 133.
Jan 26 09:13:53 compute-1 nova_compute[183083]: 2026-01-26 09:13:53.739 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:13:54 compute-1 nova_compute[183083]: 2026-01-26 09:13:54.431 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:13:56 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:13:56.444 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:13:56 compute-1 nova_compute[183083]: 2026-01-26 09:13:56.444 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:13:56 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:13:56.447 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:13:58 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:13:58.450 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:13:58 compute-1 nova_compute[183083]: 2026-01-26 09:13:58.741 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:13:59 compute-1 nova_compute[183083]: 2026-01-26 09:13:59.432 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:14:03 compute-1 nova_compute[183083]: 2026-01-26 09:14:03.745 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:14:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:14:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:14:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:14:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:14:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:14:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:14:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:14:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:14:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:14:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:14:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:14:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:14:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:14:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:14:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:14:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:14:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:14:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:14:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:14:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:14:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:14:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:14:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:14:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:14:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:14:03.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:14:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:14:03.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:14:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:14:03.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:14:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:14:03.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:14:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:14:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:14:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:14:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:14:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:14:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:14:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:14:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:14:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:14:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:14:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:14:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:14:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:14:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:14:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:14:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:14:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:14:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:14:04 compute-1 nova_compute[183083]: 2026-01-26 09:14:04.435 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:14:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:14:05.331 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:14:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:14:05.332 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:14:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:14:05.332 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:14:06 compute-1 ovn_controller[95352]: 2026-01-26T09:14:06Z|00337|pinctrl|WARN|Dropped 701 log messages in last 62 seconds (most recently, 5 seconds ago) due to excessive rate
Jan 26 09:14:06 compute-1 ovn_controller[95352]: 2026-01-26T09:14:06Z|00338|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:14:06 compute-1 podman[227471]: 2026-01-26 09:14:06.84693791 +0000 UTC m=+0.089459144 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 09:14:06 compute-1 podman[227464]: 2026-01-26 09:14:06.85223609 +0000 UTC m=+0.099346424 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a 
package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 26 09:14:06 compute-1 podman[227468]: 2026-01-26 09:14:06.852980981 +0000 UTC m=+0.067251714 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 26 09:14:06 compute-1 podman[227463]: 2026-01-26 09:14:06.924824548 +0000 UTC m=+0.175949875 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 26 09:14:06 compute-1 podman[227462]: 2026-01-26 09:14:06.967948137 +0000 UTC m=+0.228291575 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, 
io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 26 09:14:08 compute-1 nova_compute[183083]: 2026-01-26 09:14:08.746 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:14:09 compute-1 nova_compute[183083]: 2026-01-26 09:14:09.436 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:14:09 compute-1 nova_compute[183083]: 2026-01-26 09:14:09.540 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:14:09 compute-1 nova_compute[183083]: 2026-01-26 09:14:09.848 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:14:13 compute-1 nova_compute[183083]: 2026-01-26 09:14:13.791 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:14:14 compute-1 nova_compute[183083]: 2026-01-26 09:14:14.438 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:14:18 compute-1 nova_compute[183083]: 2026-01-26 09:14:18.793 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:14:19 compute-1 nova_compute[183083]: 2026-01-26 09:14:19.440 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:14:20 compute-1 podman[227568]: 2026-01-26 09:14:20.804158344 +0000 UTC m=+0.066459212 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 26 09:14:22 compute-1 nova_compute[183083]: 2026-01-26 09:14:22.397 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:14:22 compute-1 nova_compute[183083]: 2026-01-26 09:14:22.398 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:14:22 compute-1 nova_compute[183083]: 2026-01-26 09:14:22.398 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:14:22 compute-1 nova_compute[183083]: 2026-01-26 09:14:22.432 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 09:14:23 compute-1 nova_compute[183083]: 2026-01-26 09:14:23.795 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:14:24 compute-1 nova_compute[183083]: 2026-01-26 09:14:24.442 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:14:25 compute-1 nova_compute[183083]: 2026-01-26 09:14:25.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:14:26 compute-1 nova_compute[183083]: 2026-01-26 09:14:26.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:14:27 compute-1 nova_compute[183083]: 2026-01-26 09:14:27.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:14:28 compute-1 nova_compute[183083]: 2026-01-26 09:14:28.797 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:14:28 compute-1 nova_compute[183083]: 2026-01-26 09:14:28.946 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:14:28 compute-1 nova_compute[183083]: 2026-01-26 09:14:28.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:14:29 compute-1 nova_compute[183083]: 2026-01-26 09:14:29.444 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:14:29 compute-1 nova_compute[183083]: 2026-01-26 09:14:29.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:14:30 compute-1 sudo[226748]: pam_unix(sudo:session): session closed for user root
Jan 26 09:14:33 compute-1 nova_compute[183083]: 2026-01-26 09:14:33.800 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:14:33 compute-1 nova_compute[183083]: 2026-01-26 09:14:33.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:14:33 compute-1 nova_compute[183083]: 2026-01-26 09:14:33.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:14:34 compute-1 nova_compute[183083]: 2026-01-26 09:14:34.446 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:14:36 compute-1 nova_compute[183083]: 2026-01-26 09:14:36.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:14:37 compute-1 nova_compute[183083]: 2026-01-26 09:14:37.078 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:14:37 compute-1 nova_compute[183083]: 2026-01-26 09:14:37.078 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:14:37 compute-1 nova_compute[183083]: 2026-01-26 09:14:37.078 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:14:37 compute-1 nova_compute[183083]: 2026-01-26 09:14:37.079 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:14:37 compute-1 nova_compute[183083]: 2026-01-26 09:14:37.296 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:14:37 compute-1 nova_compute[183083]: 2026-01-26 09:14:37.296 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13640MB free_disk=113.0836067199707GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:14:37 compute-1 nova_compute[183083]: 2026-01-26 09:14:37.297 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:14:37 compute-1 nova_compute[183083]: 2026-01-26 09:14:37.297 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:14:37 compute-1 nova_compute[183083]: 2026-01-26 09:14:37.429 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:14:37 compute-1 nova_compute[183083]: 2026-01-26 09:14:37.430 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:14:37 compute-1 nova_compute[183083]: 2026-01-26 09:14:37.496 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:14:37 compute-1 nova_compute[183083]: 2026-01-26 09:14:37.516 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:14:37 compute-1 nova_compute[183083]: 2026-01-26 09:14:37.517 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:14:37 compute-1 nova_compute[183083]: 2026-01-26 09:14:37.518 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:14:37 compute-1 podman[227594]: 2026-01-26 09:14:37.835982551 +0000 UTC m=+0.081520869 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=openstack_network_exporter, distribution-scope=public, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 26 09:14:37 compute-1 podman[227595]: 2026-01-26 09:14:37.838532844 +0000 UTC m=+0.073470832 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 09:14:37 compute-1 podman[227592]: 2026-01-26 09:14:37.847750245 +0000 UTC m=+0.103053159 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.build-date=20251202)
Jan 26 09:14:37 compute-1 podman[227601]: 2026-01-26 09:14:37.868124802 +0000 UTC m=+0.098779338 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 09:14:37 compute-1 podman[227593]: 2026-01-26 09:14:37.868265886 +0000 UTC m=+0.116231972 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 26 09:14:38 compute-1 nova_compute[183083]: 2026-01-26 09:14:38.844 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:14:39 compute-1 nova_compute[183083]: 2026-01-26 09:14:39.448 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:14:43 compute-1 nova_compute[183083]: 2026-01-26 09:14:43.846 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:14:44 compute-1 nova_compute[183083]: 2026-01-26 09:14:44.450 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:14:48 compute-1 nova_compute[183083]: 2026-01-26 09:14:48.848 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:14:50 compute-1 nova_compute[183083]: 2026-01-26 09:14:50.276 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:14:50 compute-1 sudo[226965]: pam_unix(sudo:session): session closed for user root
Jan 26 09:14:51 compute-1 podman[227699]: 2026-01-26 09:14:51.802879108 +0000 UTC m=+0.061088781 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 26 09:14:53 compute-1 nova_compute[183083]: 2026-01-26 09:14:53.850 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:14:55 compute-1 nova_compute[183083]: 2026-01-26 09:14:55.319 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:14:57 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:14:57.002 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:14:57 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:14:57.003 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:14:57 compute-1 nova_compute[183083]: 2026-01-26 09:14:57.003 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:14:58 compute-1 nova_compute[183083]: 2026-01-26 09:14:58.851 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:15:00 compute-1 nova_compute[183083]: 2026-01-26 09:15:00.321 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:15:01 compute-1 sshd-session[227725]: Invalid user sol from 2.57.122.238 port 44808
Jan 26 09:15:01 compute-1 sshd-session[227725]: Connection closed by invalid user sol 2.57.122.238 port 44808 [preauth]
Jan 26 09:15:03 compute-1 nova_compute[183083]: 2026-01-26 09:15:03.854 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:15:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:15:05.333 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:15:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:15:05.334 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:15:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:15:05.335 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:15:05 compute-1 nova_compute[183083]: 2026-01-26 09:15:05.351 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:15:06 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:15:06.004 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:15:06 compute-1 ovn_controller[95352]: 2026-01-26T09:15:06Z|00339|pinctrl|WARN|Dropped 937 log messages in last 59 seconds (most recently, 4 seconds ago) due to excessive rate
Jan 26 09:15:06 compute-1 ovn_controller[95352]: 2026-01-26T09:15:06Z|00340|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:15:08 compute-1 sudo[227161]: pam_unix(sudo:session): session closed for user root
Jan 26 09:15:08 compute-1 podman[227730]: 2026-01-26 09:15:08.853971887 +0000 UTC m=+0.081853329 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 26 09:15:08 compute-1 nova_compute[183083]: 2026-01-26 09:15:08.889 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:15:08 compute-1 podman[227729]: 2026-01-26 09:15:08.891002097 +0000 UTC m=+0.133077061 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, name=ubi9-minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.buildah.version=1.33.7)
Jan 26 09:15:08 compute-1 podman[227736]: 2026-01-26 09:15:08.902307196 +0000 UTC m=+0.121607985 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 09:15:08 compute-1 podman[227727]: 2026-01-26 09:15:08.913140132 +0000 UTC m=+0.162124122 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 
Base Image, org.label-schema.schema-version=1.0)
Jan 26 09:15:08 compute-1 podman[227728]: 2026-01-26 09:15:08.916091716 +0000 UTC m=+0.156871503 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 09:15:10 compute-1 nova_compute[183083]: 2026-01-26 09:15:10.354 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:15:12 compute-1 sshd-session[227827]: Accepted publickey for zuul from 38.102.83.66 port 42734 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:15:12 compute-1 systemd-logind[788]: New session 134 of user zuul.
Jan 26 09:15:12 compute-1 systemd[1]: Started Session 134 of User zuul.
Jan 26 09:15:12 compute-1 sshd-session[227827]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:15:12 compute-1 sshd-session[227830]: Connection closed by 38.102.83.66 port 42734
Jan 26 09:15:12 compute-1 sshd-session[227827]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:15:12 compute-1 systemd[1]: session-134.scope: Deactivated successfully.
Jan 26 09:15:12 compute-1 systemd-logind[788]: Session 134 logged out. Waiting for processes to exit.
Jan 26 09:15:12 compute-1 systemd-logind[788]: Removed session 134.
Jan 26 09:15:13 compute-1 nova_compute[183083]: 2026-01-26 09:15:13.892 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:15:15 compute-1 nova_compute[183083]: 2026-01-26 09:15:15.356 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:15:18 compute-1 nova_compute[183083]: 2026-01-26 09:15:18.895 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:15:20 compute-1 nova_compute[183083]: 2026-01-26 09:15:20.358 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:15:22 compute-1 podman[227854]: 2026-01-26 09:15:22.847161198 +0000 UTC m=+0.090261837 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 26 09:15:23 compute-1 nova_compute[183083]: 2026-01-26 09:15:23.898 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:15:24 compute-1 nova_compute[183083]: 2026-01-26 09:15:24.517 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:15:24 compute-1 nova_compute[183083]: 2026-01-26 09:15:24.518 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:15:24 compute-1 nova_compute[183083]: 2026-01-26 09:15:24.518 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:15:24 compute-1 nova_compute[183083]: 2026-01-26 09:15:24.533 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 09:15:25 compute-1 nova_compute[183083]: 2026-01-26 09:15:25.362 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:15:25 compute-1 nova_compute[183083]: 2026-01-26 09:15:25.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:15:27 compute-1 nova_compute[183083]: 2026-01-26 09:15:27.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:15:28 compute-1 nova_compute[183083]: 2026-01-26 09:15:28.899 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:15:28 compute-1 nova_compute[183083]: 2026-01-26 09:15:28.946 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:15:28 compute-1 nova_compute[183083]: 2026-01-26 09:15:28.962 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:15:28 compute-1 nova_compute[183083]: 2026-01-26 09:15:28.963 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:15:30 compute-1 nova_compute[183083]: 2026-01-26 09:15:30.364 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:15:30 compute-1 nova_compute[183083]: 2026-01-26 09:15:30.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:15:30 compute-1 nova_compute[183083]: 2026-01-26 09:15:30.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:15:33 compute-1 nova_compute[183083]: 2026-01-26 09:15:33.902 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:15:33 compute-1 nova_compute[183083]: 2026-01-26 09:15:33.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:15:33 compute-1 nova_compute[183083]: 2026-01-26 09:15:33.951 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:15:35 compute-1 nova_compute[183083]: 2026-01-26 09:15:35.367 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:15:38 compute-1 nova_compute[183083]: 2026-01-26 09:15:38.903 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:15:38 compute-1 nova_compute[183083]: 2026-01-26 09:15:38.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:15:38 compute-1 nova_compute[183083]: 2026-01-26 09:15:38.973 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:15:38 compute-1 nova_compute[183083]: 2026-01-26 09:15:38.973 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:15:38 compute-1 nova_compute[183083]: 2026-01-26 09:15:38.973 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:15:38 compute-1 nova_compute[183083]: 2026-01-26 09:15:38.973 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:15:39 compute-1 nova_compute[183083]: 2026-01-26 09:15:39.170 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:15:39 compute-1 nova_compute[183083]: 2026-01-26 09:15:39.171 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13653MB free_disk=113.08361434936523GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:15:39 compute-1 nova_compute[183083]: 2026-01-26 09:15:39.172 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:15:39 compute-1 nova_compute[183083]: 2026-01-26 09:15:39.172 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:15:39 compute-1 nova_compute[183083]: 2026-01-26 09:15:39.231 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:15:39 compute-1 nova_compute[183083]: 2026-01-26 09:15:39.231 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:15:39 compute-1 nova_compute[183083]: 2026-01-26 09:15:39.244 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Refreshing inventories for resource provider 5203935e-446c-4e03-93fa-4c60d651e045 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 26 09:15:39 compute-1 nova_compute[183083]: 2026-01-26 09:15:39.257 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Updating ProviderTree inventory for provider 5203935e-446c-4e03-93fa-4c60d651e045 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 26 09:15:39 compute-1 nova_compute[183083]: 2026-01-26 09:15:39.258 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Updating inventory in ProviderTree for provider 5203935e-446c-4e03-93fa-4c60d651e045 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 26 09:15:39 compute-1 nova_compute[183083]: 2026-01-26 09:15:39.271 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Refreshing aggregate associations for resource provider 5203935e-446c-4e03-93fa-4c60d651e045, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 26 09:15:39 compute-1 nova_compute[183083]: 2026-01-26 09:15:39.289 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Refreshing trait associations for resource provider 5203935e-446c-4e03-93fa-4c60d651e045, traits: COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE4A,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SVM,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AVX,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX2,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_FMA3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,HW_CPU_X86_SSE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE41,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_BMI,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AESNI,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 26 09:15:39 compute-1 nova_compute[183083]: 2026-01-26 09:15:39.307 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:15:39 compute-1 nova_compute[183083]: 2026-01-26 09:15:39.319 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:15:39 compute-1 nova_compute[183083]: 2026-01-26 09:15:39.321 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:15:39 compute-1 nova_compute[183083]: 2026-01-26 09:15:39.321 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:15:39 compute-1 podman[227882]: 2026-01-26 09:15:39.822218487 +0000 UTC m=+0.061225525 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 09:15:39 compute-1 podman[227880]: 2026-01-26 09:15:39.835846183 +0000 UTC m=+0.087013725 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, architecture=x86_64, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_id=openstack_network_exporter)
Jan 26 09:15:39 compute-1 podman[227878]: 2026-01-26 09:15:39.857025342 +0000 UTC m=+0.113567566 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 09:15:39 compute-1 podman[227879]: 2026-01-26 09:15:39.863754993 +0000 UTC m=+0.115369788 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:15:39 compute-1 podman[227881]: 2026-01-26 09:15:39.876829313 +0000 UTC m=+0.112939979 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 26 09:15:40 compute-1 nova_compute[183083]: 2026-01-26 09:15:40.368 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:15:43 compute-1 nova_compute[183083]: 2026-01-26 09:15:43.905 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:15:45 compute-1 nova_compute[183083]: 2026-01-26 09:15:45.370 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:15:48 compute-1 nova_compute[183083]: 2026-01-26 09:15:48.908 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:15:50 compute-1 nova_compute[183083]: 2026-01-26 09:15:50.373 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:15:53 compute-1 podman[227980]: 2026-01-26 09:15:53.78434275 +0000 UTC m=+0.053258368 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 26 09:15:53 compute-1 nova_compute[183083]: 2026-01-26 09:15:53.909 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:15:55 compute-1 nova_compute[183083]: 2026-01-26 09:15:55.376 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:15:58 compute-1 nova_compute[183083]: 2026-01-26 09:15:58.912 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:15:59 compute-1 ovn_controller[95352]: 2026-01-26T09:15:59Z|00341|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 26 09:16:00 compute-1 nova_compute[183083]: 2026-01-26 09:16:00.379 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:16:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:16:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:16:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:16:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:16:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:16:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:16:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:16:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:16:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:16:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:16:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:16:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:16:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:16:03.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:16:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:16:03.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:16:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:16:03.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:16:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:16:03.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:16:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:16:03.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:16:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:16:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:16:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:16:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:16:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:16:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:16:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:16:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:16:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:16:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:16:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:16:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:16:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:16:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:16:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:16:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:16:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:16:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:16:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:16:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:16:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:16:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:16:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:16:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:16:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:16:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:16:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:16:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:16:03 compute-1 nova_compute[183083]: 2026-01-26 09:16:03.913 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:16:04 compute-1 ovn_controller[95352]: 2026-01-26T09:16:04Z|00342|pinctrl|WARN|Dropped 215 log messages in last 59 seconds (most recently, 5 seconds ago) due to excessive rate
Jan 26 09:16:04 compute-1 ovn_controller[95352]: 2026-01-26T09:16:04Z|00343|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:16:04 compute-1 nova_compute[183083]: 2026-01-26 09:16:04.376 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:16:04 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:16:04.376 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:16:04 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:16:04.378 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:16:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:16:05.334 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:16:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:16:05.335 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:16:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:16:05.335 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:16:05 compute-1 nova_compute[183083]: 2026-01-26 09:16:05.382 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:16:08 compute-1 nova_compute[183083]: 2026-01-26 09:16:08.913 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:16:09 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:16:09.380 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:16:10 compute-1 nova_compute[183083]: 2026-01-26 09:16:10.384 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:16:10 compute-1 sshd-session[228006]: Accepted publickey for zuul from 38.102.83.66 port 42878 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:16:10 compute-1 systemd-logind[788]: New session 135 of user zuul.
Jan 26 09:16:10 compute-1 systemd[1]: Started Session 135 of User zuul.
Jan 26 09:16:10 compute-1 sshd-session[228006]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:16:10 compute-1 podman[228010]: 2026-01-26 09:16:10.801254534 +0000 UTC m=+0.069711254 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 26 09:16:10 compute-1 podman[228012]: 2026-01-26 09:16:10.801832211 +0000 UTC m=+0.063174060 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 09:16:10 compute-1 podman[228018]: 2026-01-26 09:16:10.802899941 +0000 UTC m=+0.060024111 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 09:16:10 compute-1 podman[228011]: 2026-01-26 09:16:10.81275585 +0000 UTC m=+0.077119545 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9-minimal, version=9.6, 
release=1755695350, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 26 09:16:10 compute-1 podman[228009]: 2026-01-26 09:16:10.863688202 +0000 UTC m=+0.136719602 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
config_id=ovn_controller, org.label-schema.schema-version=1.0)
Jan 26 09:16:10 compute-1 sshd-session[228058]: Connection closed by 38.102.83.66 port 42878
Jan 26 09:16:10 compute-1 sshd-session[228006]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:16:10 compute-1 systemd-logind[788]: Session 135 logged out. Waiting for processes to exit.
Jan 26 09:16:10 compute-1 systemd[1]: session-135.scope: Deactivated successfully.
Jan 26 09:16:10 compute-1 systemd-logind[788]: Removed session 135.
Jan 26 09:16:13 compute-1 nova_compute[183083]: 2026-01-26 09:16:13.915 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:16:15 compute-1 nova_compute[183083]: 2026-01-26 09:16:15.386 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:16:18 compute-1 nova_compute[183083]: 2026-01-26 09:16:18.917 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:16:20 compute-1 nova_compute[183083]: 2026-01-26 09:16:20.447 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:16:23 compute-1 nova_compute[183083]: 2026-01-26 09:16:23.919 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:16:24 compute-1 podman[228134]: 2026-01-26 09:16:24.78075653 +0000 UTC m=+0.052259001 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 26 09:16:25 compute-1 nova_compute[183083]: 2026-01-26 09:16:25.482 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:16:26 compute-1 nova_compute[183083]: 2026-01-26 09:16:26.322 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:16:26 compute-1 nova_compute[183083]: 2026-01-26 09:16:26.323 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:16:26 compute-1 nova_compute[183083]: 2026-01-26 09:16:26.323 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:16:26 compute-1 nova_compute[183083]: 2026-01-26 09:16:26.337 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 09:16:26 compute-1 nova_compute[183083]: 2026-01-26 09:16:26.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:16:28 compute-1 nova_compute[183083]: 2026-01-26 09:16:28.920 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:16:28 compute-1 nova_compute[183083]: 2026-01-26 09:16:28.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:16:28 compute-1 nova_compute[183083]: 2026-01-26 09:16:28.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:16:30 compute-1 nova_compute[183083]: 2026-01-26 09:16:30.483 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:16:30 compute-1 nova_compute[183083]: 2026-01-26 09:16:30.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:16:30 compute-1 nova_compute[183083]: 2026-01-26 09:16:30.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:16:31 compute-1 nova_compute[183083]: 2026-01-26 09:16:31.947 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:16:33 compute-1 nova_compute[183083]: 2026-01-26 09:16:33.923 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:16:34 compute-1 nova_compute[183083]: 2026-01-26 09:16:34.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:16:34 compute-1 nova_compute[183083]: 2026-01-26 09:16:34.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:16:35 compute-1 nova_compute[183083]: 2026-01-26 09:16:35.486 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:16:38 compute-1 nova_compute[183083]: 2026-01-26 09:16:38.924 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:16:39 compute-1 nova_compute[183083]: 2026-01-26 09:16:39.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:16:40 compute-1 nova_compute[183083]: 2026-01-26 09:16:40.088 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:16:40 compute-1 nova_compute[183083]: 2026-01-26 09:16:40.088 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:16:40 compute-1 nova_compute[183083]: 2026-01-26 09:16:40.089 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:16:40 compute-1 nova_compute[183083]: 2026-01-26 09:16:40.089 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:16:40 compute-1 nova_compute[183083]: 2026-01-26 09:16:40.299 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:16:40 compute-1 nova_compute[183083]: 2026-01-26 09:16:40.300 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13641MB free_disk=113.08361434936523GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:16:40 compute-1 nova_compute[183083]: 2026-01-26 09:16:40.300 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:16:40 compute-1 nova_compute[183083]: 2026-01-26 09:16:40.300 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:16:40 compute-1 nova_compute[183083]: 2026-01-26 09:16:40.432 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:16:40 compute-1 nova_compute[183083]: 2026-01-26 09:16:40.433 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:16:40 compute-1 nova_compute[183083]: 2026-01-26 09:16:40.454 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:16:40 compute-1 nova_compute[183083]: 2026-01-26 09:16:40.468 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:16:40 compute-1 nova_compute[183083]: 2026-01-26 09:16:40.470 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:16:40 compute-1 nova_compute[183083]: 2026-01-26 09:16:40.471 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:16:40 compute-1 nova_compute[183083]: 2026-01-26 09:16:40.489 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:16:41 compute-1 podman[228162]: 2026-01-26 09:16:41.815841077 +0000 UTC m=+0.065849715 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 09:16:41 compute-1 podman[228159]: 2026-01-26 09:16:41.816854786 +0000 UTC m=+0.077114554 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 26 09:16:41 compute-1 podman[228160]: 2026-01-26 09:16:41.844942781 +0000 UTC m=+0.103715467 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, architecture=x86_64, managed_by=edpm_ansible, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, distribution-scope=public, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 09:16:41 compute-1 podman[228161]: 2026-01-26 09:16:41.854841842 +0000 UTC m=+0.101248858 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 26 09:16:41 compute-1 podman[228158]: 2026-01-26 09:16:41.854830911 +0000 UTC m=+0.114188154 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 09:16:42 compute-1 sshd-session[228258]: Accepted publickey for zuul from 38.102.83.66 port 57756 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:16:42 compute-1 systemd-logind[788]: New session 136 of user zuul.
Jan 26 09:16:42 compute-1 systemd[1]: Started Session 136 of User zuul.
Jan 26 09:16:42 compute-1 sshd-session[228258]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:16:43 compute-1 sshd-session[228261]: Connection closed by 38.102.83.66 port 57756
Jan 26 09:16:43 compute-1 sshd-session[228258]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:16:43 compute-1 systemd-logind[788]: Session 136 logged out. Waiting for processes to exit.
Jan 26 09:16:43 compute-1 systemd[1]: session-136.scope: Deactivated successfully.
Jan 26 09:16:43 compute-1 systemd-logind[788]: Removed session 136.
Jan 26 09:16:43 compute-1 nova_compute[183083]: 2026-01-26 09:16:43.927 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:16:45 compute-1 nova_compute[183083]: 2026-01-26 09:16:45.492 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:16:48 compute-1 nova_compute[183083]: 2026-01-26 09:16:48.928 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:16:50 compute-1 nova_compute[183083]: 2026-01-26 09:16:50.495 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:16:50 compute-1 nova_compute[183083]: 2026-01-26 09:16:50.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:16:50 compute-1 nova_compute[183083]: 2026-01-26 09:16:50.951 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:16:50 compute-1 nova_compute[183083]: 2026-01-26 09:16:50.952 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:16:50 compute-1 nova_compute[183083]: 2026-01-26 09:16:50.953 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:16:50 compute-1 nova_compute[183083]: 2026-01-26 09:16:50.953 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:16:50 compute-1 nova_compute[183083]: 2026-01-26 09:16:50.954 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:16:50 compute-1 nova_compute[183083]: 2026-01-26 09:16:50.954 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:16:51 compute-1 nova_compute[183083]: 2026-01-26 09:16:51.704 183087 DEBUG nova.virt.libvirt.imagecache [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314
Jan 26 09:16:51 compute-1 nova_compute[183083]: 2026-01-26 09:16:51.704 183087 WARNING nova.virt.libvirt.imagecache [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Unknown base file: /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52
Jan 26 09:16:51 compute-1 nova_compute[183083]: 2026-01-26 09:16:51.704 183087 INFO nova.virt.libvirt.imagecache [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Removable base files: /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52
Jan 26 09:16:51 compute-1 nova_compute[183083]: 2026-01-26 09:16:51.705 183087 INFO nova.virt.libvirt.imagecache [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52
Jan 26 09:16:51 compute-1 nova_compute[183083]: 2026-01-26 09:16:51.705 183087 DEBUG nova.virt.libvirt.imagecache [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350
Jan 26 09:16:51 compute-1 nova_compute[183083]: 2026-01-26 09:16:51.706 183087 DEBUG nova.virt.libvirt.imagecache [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299
Jan 26 09:16:51 compute-1 nova_compute[183083]: 2026-01-26 09:16:51.706 183087 DEBUG nova.virt.libvirt.imagecache [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284
Jan 26 09:16:53 compute-1 nova_compute[183083]: 2026-01-26 09:16:53.930 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:16:55 compute-1 nova_compute[183083]: 2026-01-26 09:16:55.498 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:16:55 compute-1 podman[228285]: 2026-01-26 09:16:55.820250608 +0000 UTC m=+0.071374481 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 26 09:16:58 compute-1 nova_compute[183083]: 2026-01-26 09:16:58.931 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:17:00 compute-1 nova_compute[183083]: 2026-01-26 09:17:00.540 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:17:03 compute-1 nova_compute[183083]: 2026-01-26 09:17:03.932 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:17:04 compute-1 ovn_controller[95352]: 2026-01-26T09:17:04Z|00344|pinctrl|WARN|Dropped 283 log messages in last 59 seconds (most recently, 2 seconds ago) due to excessive rate
Jan 26 09:17:04 compute-1 ovn_controller[95352]: 2026-01-26T09:17:04Z|00345|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:17:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:17:05.335 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:17:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:17:05.336 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:17:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:17:05.336 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:17:05 compute-1 nova_compute[183083]: 2026-01-26 09:17:05.542 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:17:06 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:17:06.462 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:17:06 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:17:06.464 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:17:06 compute-1 nova_compute[183083]: 2026-01-26 09:17:06.463 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:17:08 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:17:08.465 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:17:08 compute-1 nova_compute[183083]: 2026-01-26 09:17:08.934 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:17:10 compute-1 nova_compute[183083]: 2026-01-26 09:17:10.545 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:17:12 compute-1 podman[228311]: 2026-01-26 09:17:12.816899654 +0000 UTC m=+0.077066195 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 09:17:12 compute-1 podman[228313]: 2026-01-26 09:17:12.823182997 +0000 UTC m=+0.074490212 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 26 09:17:12 compute-1 podman[228312]: 2026-01-26 09:17:12.839376656 +0000 UTC m=+0.086143459 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, architecture=x86_64, vcs-type=git, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible)
Jan 26 09:17:12 compute-1 podman[228324]: 2026-01-26 09:17:12.83952743 +0000 UTC m=+0.075997774 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 09:17:12 compute-1 podman[228310]: 2026-01-26 09:17:12.845634828 +0000 UTC m=+0.103825732 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:17:13 compute-1 nova_compute[183083]: 2026-01-26 09:17:13.936 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:17:14 compute-1 sshd-session[228414]: Invalid user sol from 2.57.122.238 port 59024
Jan 26 09:17:14 compute-1 sshd-session[228414]: Connection closed by invalid user sol 2.57.122.238 port 59024 [preauth]
Jan 26 09:17:15 compute-1 nova_compute[183083]: 2026-01-26 09:17:15.547 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:17:18 compute-1 nova_compute[183083]: 2026-01-26 09:17:18.937 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:17:20 compute-1 sshd-session[228416]: Accepted publickey for zuul from 38.102.83.66 port 45320 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:17:20 compute-1 systemd-logind[788]: New session 137 of user zuul.
Jan 26 09:17:20 compute-1 systemd[1]: Started Session 137 of User zuul.
Jan 26 09:17:20 compute-1 sshd-session[228416]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:17:20 compute-1 sshd-session[228419]: Connection closed by 38.102.83.66 port 45320
Jan 26 09:17:20 compute-1 sshd-session[228416]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:17:20 compute-1 systemd[1]: session-137.scope: Deactivated successfully.
Jan 26 09:17:20 compute-1 systemd-logind[788]: Session 137 logged out. Waiting for processes to exit.
Jan 26 09:17:20 compute-1 systemd-logind[788]: Removed session 137.
Jan 26 09:17:20 compute-1 nova_compute[183083]: 2026-01-26 09:17:20.550 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:17:22 compute-1 nova_compute[183083]: 2026-01-26 09:17:22.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:17:22 compute-1 nova_compute[183083]: 2026-01-26 09:17:22.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 26 09:17:22 compute-1 nova_compute[183083]: 2026-01-26 09:17:22.969 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 26 09:17:23 compute-1 nova_compute[183083]: 2026-01-26 09:17:23.939 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:17:25 compute-1 nova_compute[183083]: 2026-01-26 09:17:25.552 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:17:26 compute-1 podman[228443]: 2026-01-26 09:17:26.822630898 +0000 UTC m=+0.079320140 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 26 09:17:26 compute-1 nova_compute[183083]: 2026-01-26 09:17:26.969 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:17:27 compute-1 nova_compute[183083]: 2026-01-26 09:17:27.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:17:27 compute-1 nova_compute[183083]: 2026-01-26 09:17:27.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:17:27 compute-1 nova_compute[183083]: 2026-01-26 09:17:27.953 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:17:27 compute-1 nova_compute[183083]: 2026-01-26 09:17:27.970 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 09:17:28 compute-1 nova_compute[183083]: 2026-01-26 09:17:28.957 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:17:29 compute-1 nova_compute[183083]: 2026-01-26 09:17:29.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:17:29 compute-1 nova_compute[183083]: 2026-01-26 09:17:29.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:17:30 compute-1 nova_compute[183083]: 2026-01-26 09:17:30.561 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:17:31 compute-1 nova_compute[183083]: 2026-01-26 09:17:31.947 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:17:31 compute-1 nova_compute[183083]: 2026-01-26 09:17:31.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:17:31 compute-1 nova_compute[183083]: 2026-01-26 09:17:31.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:17:33 compute-1 nova_compute[183083]: 2026-01-26 09:17:33.946 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:17:33 compute-1 nova_compute[183083]: 2026-01-26 09:17:33.958 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:17:33 compute-1 nova_compute[183083]: 2026-01-26 09:17:33.964 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:17:33 compute-1 nova_compute[183083]: 2026-01-26 09:17:33.964 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 26 09:17:35 compute-1 nova_compute[183083]: 2026-01-26 09:17:35.563 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:17:36 compute-1 nova_compute[183083]: 2026-01-26 09:17:36.538 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:17:36 compute-1 nova_compute[183083]: 2026-01-26 09:17:36.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:17:36 compute-1 nova_compute[183083]: 2026-01-26 09:17:36.951 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:17:38 compute-1 nova_compute[183083]: 2026-01-26 09:17:38.960 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:17:40 compute-1 nova_compute[183083]: 2026-01-26 09:17:40.566 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:17:40 compute-1 nova_compute[183083]: 2026-01-26 09:17:40.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:17:40 compute-1 nova_compute[183083]: 2026-01-26 09:17:40.978 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:17:40 compute-1 nova_compute[183083]: 2026-01-26 09:17:40.979 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:17:40 compute-1 nova_compute[183083]: 2026-01-26 09:17:40.979 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:17:40 compute-1 nova_compute[183083]: 2026-01-26 09:17:40.979 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:17:41 compute-1 nova_compute[183083]: 2026-01-26 09:17:41.207 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:17:41 compute-1 nova_compute[183083]: 2026-01-26 09:17:41.209 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13646MB free_disk=113.08383178710938GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:17:41 compute-1 nova_compute[183083]: 2026-01-26 09:17:41.209 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:17:41 compute-1 nova_compute[183083]: 2026-01-26 09:17:41.209 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:17:41 compute-1 nova_compute[183083]: 2026-01-26 09:17:41.421 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:17:41 compute-1 nova_compute[183083]: 2026-01-26 09:17:41.422 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:17:41 compute-1 nova_compute[183083]: 2026-01-26 09:17:41.442 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:17:41 compute-1 nova_compute[183083]: 2026-01-26 09:17:41.459 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:17:41 compute-1 nova_compute[183083]: 2026-01-26 09:17:41.460 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:17:41 compute-1 nova_compute[183083]: 2026-01-26 09:17:41.460 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.251s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:17:42 compute-1 nova_compute[183083]: 2026-01-26 09:17:42.030 183087 DEBUG oslo_concurrency.lockutils [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] Acquiring lock "2b2124a4-f262-4159-b9f5-1dda7835da11" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:17:42 compute-1 nova_compute[183083]: 2026-01-26 09:17:42.030 183087 DEBUG oslo_concurrency.lockutils [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] Lock "2b2124a4-f262-4159-b9f5-1dda7835da11" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:17:42 compute-1 nova_compute[183083]: 2026-01-26 09:17:42.068 183087 DEBUG nova.compute.manager [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 09:17:42 compute-1 nova_compute[183083]: 2026-01-26 09:17:42.224 183087 DEBUG oslo_concurrency.lockutils [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:17:42 compute-1 nova_compute[183083]: 2026-01-26 09:17:42.225 183087 DEBUG oslo_concurrency.lockutils [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:17:42 compute-1 nova_compute[183083]: 2026-01-26 09:17:42.234 183087 DEBUG nova.virt.hardware [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 09:17:42 compute-1 nova_compute[183083]: 2026-01-26 09:17:42.234 183087 INFO nova.compute.claims [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Claim successful on node compute-1.ctlplane.example.com
Jan 26 09:17:42 compute-1 nova_compute[183083]: 2026-01-26 09:17:42.343 183087 DEBUG nova.compute.provider_tree [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:17:42 compute-1 nova_compute[183083]: 2026-01-26 09:17:42.360 183087 DEBUG nova.scheduler.client.report [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:17:42 compute-1 nova_compute[183083]: 2026-01-26 09:17:42.378 183087 DEBUG oslo_concurrency.lockutils [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:17:42 compute-1 nova_compute[183083]: 2026-01-26 09:17:42.379 183087 DEBUG nova.compute.manager [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 09:17:42 compute-1 nova_compute[183083]: 2026-01-26 09:17:42.424 183087 DEBUG nova.compute.manager [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 09:17:42 compute-1 nova_compute[183083]: 2026-01-26 09:17:42.425 183087 DEBUG nova.network.neutron [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 09:17:42 compute-1 nova_compute[183083]: 2026-01-26 09:17:42.449 183087 INFO nova.virt.libvirt.driver [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 09:17:42 compute-1 nova_compute[183083]: 2026-01-26 09:17:42.479 183087 DEBUG nova.compute.manager [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 09:17:42 compute-1 nova_compute[183083]: 2026-01-26 09:17:42.568 183087 DEBUG nova.compute.manager [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 09:17:42 compute-1 nova_compute[183083]: 2026-01-26 09:17:42.569 183087 DEBUG nova.virt.libvirt.driver [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 09:17:42 compute-1 nova_compute[183083]: 2026-01-26 09:17:42.570 183087 INFO nova.virt.libvirt.driver [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Creating image(s)
Jan 26 09:17:42 compute-1 nova_compute[183083]: 2026-01-26 09:17:42.570 183087 DEBUG oslo_concurrency.lockutils [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] Acquiring lock "/var/lib/nova/instances/2b2124a4-f262-4159-b9f5-1dda7835da11/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:17:42 compute-1 nova_compute[183083]: 2026-01-26 09:17:42.570 183087 DEBUG oslo_concurrency.lockutils [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] Lock "/var/lib/nova/instances/2b2124a4-f262-4159-b9f5-1dda7835da11/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:17:42 compute-1 nova_compute[183083]: 2026-01-26 09:17:42.571 183087 DEBUG oslo_concurrency.lockutils [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] Lock "/var/lib/nova/instances/2b2124a4-f262-4159-b9f5-1dda7835da11/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:17:42 compute-1 nova_compute[183083]: 2026-01-26 09:17:42.571 183087 DEBUG oslo_concurrency.lockutils [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:17:42 compute-1 nova_compute[183083]: 2026-01-26 09:17:42.572 183087 DEBUG oslo_concurrency.lockutils [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:17:43 compute-1 nova_compute[183083]: 2026-01-26 09:17:43.089 183087 DEBUG nova.policy [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f89e2cd0e78744b7b6aa4d7777f0da0d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0f0089a5d1bf4983b0705d2900121c15', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 09:17:43 compute-1 podman[228476]: 2026-01-26 09:17:43.826097415 +0000 UTC m=+0.069685062 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 09:17:43 compute-1 podman[228470]: 2026-01-26 09:17:43.838294499 +0000 UTC m=+0.089408824 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 26 09:17:43 compute-1 podman[228468]: 2026-01-26 09:17:43.848388581 +0000 UTC m=+0.103192993 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute)
Jan 26 09:17:43 compute-1 podman[228467]: 2026-01-26 09:17:43.848457163 +0000 UTC m=+0.111599507 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 09:17:43 compute-1 podman[228469]: 2026-01-26 09:17:43.848519205 +0000 UTC m=+0.096520480 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, release=1755695350, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, version=9.6)
Jan 26 09:17:43 compute-1 nova_compute[183083]: 2026-01-26 09:17:43.962 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.340 183087 DEBUG oslo_concurrency.lockutils [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.768s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Instance failed to spawn: nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Traceback (most recent call last):
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 170, in do_image_deep_inspection
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11]     raise exception.ImageUnacceptable(
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image content does not match disk_format
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] 
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] During handling of the above exception, another exception occurred:
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] 
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Traceback (most recent call last):
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11]     yield resources
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11]     self.driver.spawn(context, instance, image_meta,
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4384, in spawn
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11]     created_instance_dir, created_disks = self._create_image(
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4786, in _create_image
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11]     created_disks = self._create_and_inject_local_root(
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4917, in _create_and_inject_local_root
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11]     self._try_fetch_image_cache(backend, fetch_func, context,
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 10963, in _try_fetch_image_cache
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11]     image.cache(fetch_func=fetch_func,
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 288, in cache
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11]     self.create_image(fetch_func_sync, base, size,
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 663, in create_image
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11]     prepare_template(target=base, *args, **kwargs)
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11]   File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11]     return f(*args, **kwargs)
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 285, in fetch_func_sync
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11]     fetch_func(target=target, *args, **kwargs)
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/utils.py", line 482, in fetch_image
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11]     images.fetch_to_raw(context, image_id, target, trusted_certs)
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 199, in fetch_to_raw
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11]     force_format = do_image_deep_inspection(img, image_href, path_tmp)
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 180, in do_image_deep_inspection
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11]     raise exception.ImageUnacceptable(
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.341 183087 ERROR nova.compute.manager [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] 
Jan 26 09:17:44 compute-1 nova_compute[183083]: 2026-01-26 09:17:44.353 183087 DEBUG nova.network.neutron [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Successfully created port: 3386a4e4-9830-44d2-bf07-c0e98af1c824 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 09:17:45 compute-1 nova_compute[183083]: 2026-01-26 09:17:45.390 183087 DEBUG nova.network.neutron [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Successfully updated port: 3386a4e4-9830-44d2-bf07-c0e98af1c824 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 09:17:45 compute-1 nova_compute[183083]: 2026-01-26 09:17:45.406 183087 DEBUG oslo_concurrency.lockutils [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] Acquiring lock "refresh_cache-2b2124a4-f262-4159-b9f5-1dda7835da11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:17:45 compute-1 nova_compute[183083]: 2026-01-26 09:17:45.406 183087 DEBUG oslo_concurrency.lockutils [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] Acquired lock "refresh_cache-2b2124a4-f262-4159-b9f5-1dda7835da11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:17:45 compute-1 nova_compute[183083]: 2026-01-26 09:17:45.407 183087 DEBUG nova.network.neutron [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 09:17:45 compute-1 nova_compute[183083]: 2026-01-26 09:17:45.492 183087 DEBUG nova.compute.manager [req-ec2dd4ce-d8ee-4add-82fd-342b1ac28ad4 req-b6094a02-ff40-43d4-a286-4b1d818a1496 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Received event network-changed-3386a4e4-9830-44d2-bf07-c0e98af1c824 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:17:45 compute-1 nova_compute[183083]: 2026-01-26 09:17:45.493 183087 DEBUG nova.compute.manager [req-ec2dd4ce-d8ee-4add-82fd-342b1ac28ad4 req-b6094a02-ff40-43d4-a286-4b1d818a1496 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Refreshing instance network info cache due to event network-changed-3386a4e4-9830-44d2-bf07-c0e98af1c824. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 09:17:45 compute-1 nova_compute[183083]: 2026-01-26 09:17:45.493 183087 DEBUG oslo_concurrency.lockutils [req-ec2dd4ce-d8ee-4add-82fd-342b1ac28ad4 req-b6094a02-ff40-43d4-a286-4b1d818a1496 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-2b2124a4-f262-4159-b9f5-1dda7835da11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:17:45 compute-1 nova_compute[183083]: 2026-01-26 09:17:45.559 183087 DEBUG nova.network.neutron [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 09:17:45 compute-1 nova_compute[183083]: 2026-01-26 09:17:45.568 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.542 183087 DEBUG nova.network.neutron [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Updating instance_info_cache with network_info: [{"id": "3386a4e4-9830-44d2-bf07-c0e98af1c824", "address": "fa:16:3e:24:fe:11", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.196", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3386a4e4-98", "ovs_interfaceid": "3386a4e4-9830-44d2-bf07-c0e98af1c824", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.571 183087 DEBUG oslo_concurrency.lockutils [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] Releasing lock "refresh_cache-2b2124a4-f262-4159-b9f5-1dda7835da11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.571 183087 DEBUG nova.compute.manager [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Instance network_info: |[{"id": "3386a4e4-9830-44d2-bf07-c0e98af1c824", "address": "fa:16:3e:24:fe:11", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.196", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3386a4e4-98", "ovs_interfaceid": "3386a4e4-9830-44d2-bf07-c0e98af1c824", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.571 183087 DEBUG oslo_concurrency.lockutils [req-ec2dd4ce-d8ee-4add-82fd-342b1ac28ad4 req-b6094a02-ff40-43d4-a286-4b1d818a1496 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-2b2124a4-f262-4159-b9f5-1dda7835da11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.571 183087 DEBUG nova.network.neutron [req-ec2dd4ce-d8ee-4add-82fd-342b1ac28ad4 req-b6094a02-ff40-43d4-a286-4b1d818a1496 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Refreshing network info cache for port 3386a4e4-9830-44d2-bf07-c0e98af1c824 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.572 183087 INFO nova.compute.manager [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Terminating instance
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.573 183087 DEBUG nova.compute.manager [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.577 183087 DEBUG nova.virt.libvirt.driver [-] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.577 183087 INFO nova.virt.libvirt.driver [-] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Instance destroyed successfully.
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.577 183087 DEBUG nova.virt.libvirt.vif [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T09:17:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1142738773',display_name='tempest-server-test-1142738773',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1142738773',id=56,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPpFf0qswi/VCFRIa6QnKc2ECGN0LM1X+btbj7Q4M2snXnr1jxKCia//L8WN4QMCtIngnB8FvAS/euq4SUdAfsTq+9zI0s7HInZdK4Z/wZbbuDnpOG9Tt31zG8ule08jYA==',key_name='tempest-keypair-test-822795036',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0f0089a5d1bf4983b0705d2900121c15',ramdisk_id='',reservation_id='r-b2kiqmsz',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-GatewayMtuTestIcmp-1750100388',owner_user_name='tempest-GatewayMtuTestIcmp-1750100388-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T09:17:42Z,user_data=None,user_id='f89e2cd0e78744b7b6aa4d7777f0da0d',uuid=2b2124a4-f262-4159-b9f5-1dda7835da11,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3386a4e4-9830-44d2-bf07-c0e98af1c824", "address": "fa:16:3e:24:fe:11", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.196", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3386a4e4-98", "ovs_interfaceid": "3386a4e4-9830-44d2-bf07-c0e98af1c824", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.578 183087 DEBUG nova.network.os_vif_util [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] Converting VIF {"id": "3386a4e4-9830-44d2-bf07-c0e98af1c824", "address": "fa:16:3e:24:fe:11", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.196", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3386a4e4-98", "ovs_interfaceid": "3386a4e4-9830-44d2-bf07-c0e98af1c824", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.578 183087 DEBUG nova.network.os_vif_util [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:fe:11,bridge_name='br-int',has_traffic_filtering=True,id=3386a4e4-9830-44d2-bf07-c0e98af1c824,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3386a4e4-98') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.579 183087 DEBUG os_vif [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:fe:11,bridge_name='br-int',has_traffic_filtering=True,id=3386a4e4-9830-44d2-bf07-c0e98af1c824,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3386a4e4-98') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.580 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.580 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3386a4e4-98, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.580 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.582 183087 INFO os_vif [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:fe:11,bridge_name='br-int',has_traffic_filtering=True,id=3386a4e4-9830-44d2-bf07-c0e98af1c824,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3386a4e4-98')
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.583 183087 INFO nova.virt.libvirt.driver [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Deleting instance files /var/lib/nova/instances/2b2124a4-f262-4159-b9f5-1dda7835da11_del
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.583 183087 INFO nova.virt.libvirt.driver [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Deletion of /var/lib/nova/instances/2b2124a4-f262-4159-b9f5-1dda7835da11_del complete
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.662 183087 INFO nova.compute.manager [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Took 0.09 seconds to destroy the instance on the hypervisor.
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.664 183087 DEBUG nova.compute.claims [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Aborting claim: <nova.compute.claims.Claim object at 0x7f6cb84367f0> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.664 183087 DEBUG oslo_concurrency.lockutils [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.664 183087 DEBUG oslo_concurrency.lockutils [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.797 183087 DEBUG nova.compute.provider_tree [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.814 183087 DEBUG nova.scheduler.client.report [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.844 183087 DEBUG oslo_concurrency.lockutils [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.845 183087 DEBUG nova.compute.utils [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.847 183087 ERROR nova.compute.manager [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Build of instance 2b2124a4-f262-4159-b9f5-1dda7835da11 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format: nova.exception.BuildAbortException: Build of instance 2b2124a4-f262-4159-b9f5-1dda7835da11 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.847 183087 DEBUG nova.compute.manager [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.849 183087 DEBUG nova.virt.libvirt.vif [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-26T09:17:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1142738773',display_name='tempest-server-test-1142738773',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host=None,hostname='tempest-server-test-1142738773',id=56,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPpFf0qswi/VCFRIa6QnKc2ECGN0LM1X+btbj7Q4M2snXnr1jxKCia//L8WN4QMCtIngnB8FvAS/euq4SUdAfsTq+9zI0s7HInZdK4Z/wZbbuDnpOG9Tt31zG8ule08jYA==',key_name='tempest-keypair-test-822795036',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node=None,numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0f0089a5d1bf4983b0705d2900121c15',ramdisk_id='',reservation_id='r-b2kiqmsz',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-GatewayMtuTestIcmp-1750100388',owner_user_name='tempest-GatewayMtuTestIcmp-1750100388-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T09:17:46Z,user_data=None,user_id='f89e2cd0e78744b7b6aa4d7777f0da0d',uuid=2b2124a4-f262-4159-b9f5-1dda7835da11,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3386a4e4-9830-44d2-bf07-c0e98af1c824", "address": "fa:16:3e:24:fe:11", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.196", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3386a4e4-98", "ovs_interfaceid": "3386a4e4-9830-44d2-bf07-c0e98af1c824", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.850 183087 DEBUG nova.network.os_vif_util [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] Converting VIF {"id": "3386a4e4-9830-44d2-bf07-c0e98af1c824", "address": "fa:16:3e:24:fe:11", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.196", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3386a4e4-98", "ovs_interfaceid": "3386a4e4-9830-44d2-bf07-c0e98af1c824", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.851 183087 DEBUG nova.network.os_vif_util [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:fe:11,bridge_name='br-int',has_traffic_filtering=True,id=3386a4e4-9830-44d2-bf07-c0e98af1c824,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3386a4e4-98') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.851 183087 DEBUG os_vif [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:fe:11,bridge_name='br-int',has_traffic_filtering=True,id=3386a4e4-9830-44d2-bf07-c0e98af1c824,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3386a4e4-98') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.853 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.854 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3386a4e4-98, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.854 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.859 183087 INFO os_vif [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:fe:11,bridge_name='br-int',has_traffic_filtering=True,id=3386a4e4-9830-44d2-bf07-c0e98af1c824,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3386a4e4-98')
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.860 183087 DEBUG nova.compute.manager [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.861 183087 DEBUG nova.compute.manager [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 09:17:46 compute-1 nova_compute[183083]: 2026-01-26 09:17:46.861 183087 DEBUG nova.network.neutron [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 09:17:47 compute-1 nova_compute[183083]: 2026-01-26 09:17:47.874 183087 DEBUG nova.network.neutron [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:17:47 compute-1 nova_compute[183083]: 2026-01-26 09:17:47.894 183087 INFO nova.compute.manager [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Took 1.03 seconds to deallocate network for instance.
Jan 26 09:17:48 compute-1 nova_compute[183083]: 2026-01-26 09:17:48.482 183087 INFO nova.scheduler.client.report [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] Deleted allocations for instance 2b2124a4-f262-4159-b9f5-1dda7835da11
Jan 26 09:17:48 compute-1 nova_compute[183083]: 2026-01-26 09:17:48.482 183087 DEBUG oslo_concurrency.lockutils [None req-8ee11902-44b0-4d19-b59b-18d3f84471b3 f89e2cd0e78744b7b6aa4d7777f0da0d 0f0089a5d1bf4983b0705d2900121c15 - - default default] Lock "2b2124a4-f262-4159-b9f5-1dda7835da11" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.452s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:17:48 compute-1 nova_compute[183083]: 2026-01-26 09:17:48.795 183087 DEBUG nova.network.neutron [req-ec2dd4ce-d8ee-4add-82fd-342b1ac28ad4 req-b6094a02-ff40-43d4-a286-4b1d818a1496 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Updated VIF entry in instance network info cache for port 3386a4e4-9830-44d2-bf07-c0e98af1c824. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 09:17:48 compute-1 nova_compute[183083]: 2026-01-26 09:17:48.796 183087 DEBUG nova.network.neutron [req-ec2dd4ce-d8ee-4add-82fd-342b1ac28ad4 req-b6094a02-ff40-43d4-a286-4b1d818a1496 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 2b2124a4-f262-4159-b9f5-1dda7835da11] Updating instance_info_cache with network_info: [{"id": "3386a4e4-9830-44d2-bf07-c0e98af1c824", "address": "fa:16:3e:24:fe:11", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.196", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3386a4e4-98", "ovs_interfaceid": "3386a4e4-9830-44d2-bf07-c0e98af1c824", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:17:48 compute-1 nova_compute[183083]: 2026-01-26 09:17:48.824 183087 DEBUG oslo_concurrency.lockutils [req-ec2dd4ce-d8ee-4add-82fd-342b1ac28ad4 req-b6094a02-ff40-43d4-a286-4b1d818a1496 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-2b2124a4-f262-4159-b9f5-1dda7835da11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:17:48 compute-1 nova_compute[183083]: 2026-01-26 09:17:48.964 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:17:50 compute-1 nova_compute[183083]: 2026-01-26 09:17:50.571 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:17:52 compute-1 nova_compute[183083]: 2026-01-26 09:17:52.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:17:53 compute-1 nova_compute[183083]: 2026-01-26 09:17:53.965 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:17:55 compute-1 nova_compute[183083]: 2026-01-26 09:17:55.574 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:17:57 compute-1 podman[228570]: 2026-01-26 09:17:57.777994344 +0000 UTC m=+0.047204449 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 26 09:17:58 compute-1 nova_compute[183083]: 2026-01-26 09:17:58.968 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:18:00 compute-1 nova_compute[183083]: 2026-01-26 09:18:00.655 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:18:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:18:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:18:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:18:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:18:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:18:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:18:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:18:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:18:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:18:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:18:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:18:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:18:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:18:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:18:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:18:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:18:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:18:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:18:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:18:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:18:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:18:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:18:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:18:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:18:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:18:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:18:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:18:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:18:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:18:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:18:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:18:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:18:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:18:03.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:18:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:18:03.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:18:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:18:03.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:18:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:18:03.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:18:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:18:03.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:18:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:18:03.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:18:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:18:03.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:18:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:18:03.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:18:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:18:03.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:18:03 compute-1 nova_compute[183083]: 2026-01-26 09:18:03.969 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:18:04 compute-1 ovn_controller[95352]: 2026-01-26T09:18:04Z|00346|pinctrl|WARN|Dropped 297 log messages in last 60 seconds (most recently, 1 seconds ago) due to excessive rate
Jan 26 09:18:04 compute-1 ovn_controller[95352]: 2026-01-26T09:18:04Z|00347|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:18:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:18:05.336 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:18:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:18:05.337 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:18:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:18:05.337 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:18:05 compute-1 nova_compute[183083]: 2026-01-26 09:18:05.658 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:18:06 compute-1 sshd-session[228594]: Connection closed by 178.62.249.31 port 35288
Jan 26 09:18:08 compute-1 nova_compute[183083]: 2026-01-26 09:18:08.971 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:18:10 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:18:10.073 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:18:10 compute-1 nova_compute[183083]: 2026-01-26 09:18:10.074 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:18:10 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:18:10.074 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:18:10 compute-1 nova_compute[183083]: 2026-01-26 09:18:10.661 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:18:13 compute-1 nova_compute[183083]: 2026-01-26 09:18:13.975 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:18:14 compute-1 ovn_controller[95352]: 2026-01-26T09:18:14Z|00348|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Jan 26 09:18:14 compute-1 podman[228596]: 2026-01-26 09:18:14.833129498 +0000 UTC m=+0.090390102 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:18:14 compute-1 podman[228599]: 2026-01-26 09:18:14.837934688 +0000 UTC m=+0.080058333 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 09:18:14 compute-1 podman[228597]: 2026-01-26 09:18:14.842329775 +0000 UTC m=+0.089893397 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, maintainer=Red Hat, Inc., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., name=ubi9-minimal, container_name=openstack_network_exporter, io.openshift.expose-services=, version=9.6, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 26 09:18:14 compute-1 podman[228598]: 2026-01-26 09:18:14.843259232 +0000 UTC m=+0.095556372 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:18:14 compute-1 podman[228595]: 2026-01-26 09:18:14.858667289 +0000 UTC m=+0.117454417 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Jan 26 09:18:15 compute-1 nova_compute[183083]: 2026-01-26 09:18:15.663 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:18:18 compute-1 nova_compute[183083]: 2026-01-26 09:18:18.976 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:18:19 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:18:19.077 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:18:19 compute-1 sshd-session[228697]: Accepted publickey for zuul from 38.102.83.66 port 34552 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:18:19 compute-1 systemd-logind[788]: New session 138 of user zuul.
Jan 26 09:18:19 compute-1 systemd[1]: Started Session 138 of User zuul.
Jan 26 09:18:19 compute-1 sshd-session[228697]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:18:19 compute-1 sshd-session[228700]: Connection closed by 38.102.83.66 port 34552
Jan 26 09:18:19 compute-1 sshd-session[228697]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:18:19 compute-1 systemd[1]: session-138.scope: Deactivated successfully.
Jan 26 09:18:19 compute-1 systemd-logind[788]: Session 138 logged out. Waiting for processes to exit.
Jan 26 09:18:19 compute-1 systemd-logind[788]: Removed session 138.
Jan 26 09:18:20 compute-1 nova_compute[183083]: 2026-01-26 09:18:20.665 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:18:23 compute-1 nova_compute[183083]: 2026-01-26 09:18:23.977 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:18:25 compute-1 nova_compute[183083]: 2026-01-26 09:18:25.708 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:18:27 compute-1 nova_compute[183083]: 2026-01-26 09:18:27.985 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:18:28 compute-1 podman[228724]: 2026-01-26 09:18:28.795572675 +0000 UTC m=+0.056663364 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 26 09:18:28 compute-1 nova_compute[183083]: 2026-01-26 09:18:28.979 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:18:29 compute-1 nova_compute[183083]: 2026-01-26 09:18:29.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:18:29 compute-1 nova_compute[183083]: 2026-01-26 09:18:29.951 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:18:29 compute-1 nova_compute[183083]: 2026-01-26 09:18:29.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:18:29 compute-1 nova_compute[183083]: 2026-01-26 09:18:29.967 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 09:18:29 compute-1 nova_compute[183083]: 2026-01-26 09:18:29.968 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:18:30 compute-1 nova_compute[183083]: 2026-01-26 09:18:30.709 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:18:30 compute-1 nova_compute[183083]: 2026-01-26 09:18:30.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:18:31 compute-1 nova_compute[183083]: 2026-01-26 09:18:31.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:18:32 compute-1 nova_compute[183083]: 2026-01-26 09:18:32.946 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:18:32 compute-1 nova_compute[183083]: 2026-01-26 09:18:32.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:18:33 compute-1 nova_compute[183083]: 2026-01-26 09:18:33.981 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:18:35 compute-1 nova_compute[183083]: 2026-01-26 09:18:35.712 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:18:36 compute-1 nova_compute[183083]: 2026-01-26 09:18:36.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:18:36 compute-1 nova_compute[183083]: 2026-01-26 09:18:36.951 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:18:38 compute-1 nova_compute[183083]: 2026-01-26 09:18:38.982 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:18:40 compute-1 nova_compute[183083]: 2026-01-26 09:18:40.714 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:18:42 compute-1 nova_compute[183083]: 2026-01-26 09:18:42.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:18:42 compute-1 nova_compute[183083]: 2026-01-26 09:18:42.980 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:18:42 compute-1 nova_compute[183083]: 2026-01-26 09:18:42.981 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:18:42 compute-1 nova_compute[183083]: 2026-01-26 09:18:42.981 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:18:42 compute-1 nova_compute[183083]: 2026-01-26 09:18:42.981 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:18:43 compute-1 nova_compute[183083]: 2026-01-26 09:18:43.193 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:18:43 compute-1 nova_compute[183083]: 2026-01-26 09:18:43.195 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13648MB free_disk=113.08383178710938GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:18:43 compute-1 nova_compute[183083]: 2026-01-26 09:18:43.195 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:18:43 compute-1 nova_compute[183083]: 2026-01-26 09:18:43.195 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:18:43 compute-1 nova_compute[183083]: 2026-01-26 09:18:43.252 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:18:43 compute-1 nova_compute[183083]: 2026-01-26 09:18:43.253 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:18:43 compute-1 nova_compute[183083]: 2026-01-26 09:18:43.276 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:18:43 compute-1 nova_compute[183083]: 2026-01-26 09:18:43.289 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:18:43 compute-1 nova_compute[183083]: 2026-01-26 09:18:43.309 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:18:43 compute-1 nova_compute[183083]: 2026-01-26 09:18:43.309 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:18:43 compute-1 nova_compute[183083]: 2026-01-26 09:18:43.450 183087 DEBUG oslo_concurrency.lockutils [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] Acquiring lock "e51af20f-0823-4be7-abc0-7ced1570b63a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:18:43 compute-1 nova_compute[183083]: 2026-01-26 09:18:43.450 183087 DEBUG oslo_concurrency.lockutils [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] Lock "e51af20f-0823-4be7-abc0-7ced1570b63a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:18:43 compute-1 nova_compute[183083]: 2026-01-26 09:18:43.465 183087 DEBUG nova.compute.manager [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 09:18:43 compute-1 nova_compute[183083]: 2026-01-26 09:18:43.530 183087 DEBUG oslo_concurrency.lockutils [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:18:43 compute-1 nova_compute[183083]: 2026-01-26 09:18:43.531 183087 DEBUG oslo_concurrency.lockutils [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:18:43 compute-1 nova_compute[183083]: 2026-01-26 09:18:43.575 183087 DEBUG nova.virt.hardware [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 09:18:43 compute-1 nova_compute[183083]: 2026-01-26 09:18:43.575 183087 INFO nova.compute.claims [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Claim successful on node compute-1.ctlplane.example.com
Jan 26 09:18:43 compute-1 nova_compute[183083]: 2026-01-26 09:18:43.681 183087 DEBUG nova.compute.provider_tree [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:18:43 compute-1 nova_compute[183083]: 2026-01-26 09:18:43.695 183087 DEBUG nova.scheduler.client.report [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:18:43 compute-1 nova_compute[183083]: 2026-01-26 09:18:43.716 183087 DEBUG oslo_concurrency.lockutils [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:18:43 compute-1 nova_compute[183083]: 2026-01-26 09:18:43.717 183087 DEBUG nova.compute.manager [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 09:18:43 compute-1 nova_compute[183083]: 2026-01-26 09:18:43.759 183087 DEBUG nova.compute.manager [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 09:18:43 compute-1 nova_compute[183083]: 2026-01-26 09:18:43.759 183087 DEBUG nova.network.neutron [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 09:18:43 compute-1 nova_compute[183083]: 2026-01-26 09:18:43.781 183087 INFO nova.virt.libvirt.driver [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 09:18:43 compute-1 nova_compute[183083]: 2026-01-26 09:18:43.799 183087 DEBUG nova.compute.manager [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 09:18:43 compute-1 nova_compute[183083]: 2026-01-26 09:18:43.901 183087 DEBUG nova.compute.manager [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 09:18:43 compute-1 nova_compute[183083]: 2026-01-26 09:18:43.902 183087 DEBUG nova.virt.libvirt.driver [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 09:18:43 compute-1 nova_compute[183083]: 2026-01-26 09:18:43.903 183087 INFO nova.virt.libvirt.driver [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Creating image(s)
Jan 26 09:18:43 compute-1 nova_compute[183083]: 2026-01-26 09:18:43.904 183087 DEBUG oslo_concurrency.lockutils [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] Acquiring lock "/var/lib/nova/instances/e51af20f-0823-4be7-abc0-7ced1570b63a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:18:43 compute-1 nova_compute[183083]: 2026-01-26 09:18:43.905 183087 DEBUG oslo_concurrency.lockutils [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] Lock "/var/lib/nova/instances/e51af20f-0823-4be7-abc0-7ced1570b63a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:18:43 compute-1 nova_compute[183083]: 2026-01-26 09:18:43.906 183087 DEBUG oslo_concurrency.lockutils [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] Lock "/var/lib/nova/instances/e51af20f-0823-4be7-abc0-7ced1570b63a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:18:43 compute-1 nova_compute[183083]: 2026-01-26 09:18:43.907 183087 DEBUG oslo_concurrency.lockutils [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:18:43 compute-1 nova_compute[183083]: 2026-01-26 09:18:43.908 183087 DEBUG oslo_concurrency.lockutils [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:18:43 compute-1 nova_compute[183083]: 2026-01-26 09:18:43.984 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.139 183087 DEBUG nova.policy [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ee9591232a0947c4bdc2f7aa67416c26', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5c8af329d40f415aba570c04085cb311', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.926 183087 DEBUG oslo_concurrency.lockutils [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.018s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Instance failed to spawn: nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Traceback (most recent call last):
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 170, in do_image_deep_inspection
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a]     raise exception.ImageUnacceptable(
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image content does not match disk_format
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] 
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] During handling of the above exception, another exception occurred:
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] 
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Traceback (most recent call last):
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a]     yield resources
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a]     self.driver.spawn(context, instance, image_meta,
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4384, in spawn
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a]     created_instance_dir, created_disks = self._create_image(
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4786, in _create_image
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a]     created_disks = self._create_and_inject_local_root(
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4917, in _create_and_inject_local_root
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a]     self._try_fetch_image_cache(backend, fetch_func, context,
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 10963, in _try_fetch_image_cache
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a]     image.cache(fetch_func=fetch_func,
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 288, in cache
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a]     self.create_image(fetch_func_sync, base, size,
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 663, in create_image
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a]     prepare_template(target=base, *args, **kwargs)
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a]   File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a]     return f(*args, **kwargs)
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 285, in fetch_func_sync
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a]     fetch_func(target=target, *args, **kwargs)
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/utils.py", line 482, in fetch_image
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a]     images.fetch_to_raw(context, image_id, target, trusted_certs)
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 199, in fetch_to_raw
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a]     force_format = do_image_deep_inspection(img, image_href, path_tmp)
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 180, in do_image_deep_inspection
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a]     raise exception.ImageUnacceptable(
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 09:18:44 compute-1 nova_compute[183083]: 2026-01-26 09:18:44.927 183087 ERROR nova.compute.manager [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] 
Jan 26 09:18:45 compute-1 nova_compute[183083]: 2026-01-26 09:18:45.138 183087 DEBUG nova.network.neutron [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Successfully created port: 8b5661b7-6299-4eb2-beee-b95a7484db4b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 09:18:45 compute-1 nova_compute[183083]: 2026-01-26 09:18:45.717 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:18:45 compute-1 podman[228757]: 2026-01-26 09:18:45.825854941 +0000 UTC m=+0.067504739 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 09:18:45 compute-1 podman[228750]: 2026-01-26 09:18:45.825854931 +0000 UTC m=+0.076926742 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.tags=minimal rhel9, release=1755695350, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 26 09:18:45 compute-1 podman[228749]: 2026-01-26 09:18:45.835873871 +0000 UTC m=+0.086584661 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 26 09:18:45 compute-1 podman[228751]: 2026-01-26 09:18:45.863104291 +0000 UTC m=+0.108729914 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 26 09:18:45 compute-1 podman[228748]: 2026-01-26 09:18:45.880825935 +0000 UTC m=+0.136761937 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:18:46 compute-1 nova_compute[183083]: 2026-01-26 09:18:46.104 183087 DEBUG nova.network.neutron [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Successfully updated port: 8b5661b7-6299-4eb2-beee-b95a7484db4b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 09:18:46 compute-1 nova_compute[183083]: 2026-01-26 09:18:46.123 183087 DEBUG oslo_concurrency.lockutils [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] Acquiring lock "refresh_cache-e51af20f-0823-4be7-abc0-7ced1570b63a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:18:46 compute-1 nova_compute[183083]: 2026-01-26 09:18:46.123 183087 DEBUG oslo_concurrency.lockutils [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] Acquired lock "refresh_cache-e51af20f-0823-4be7-abc0-7ced1570b63a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:18:46 compute-1 nova_compute[183083]: 2026-01-26 09:18:46.123 183087 DEBUG nova.network.neutron [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 09:18:46 compute-1 nova_compute[183083]: 2026-01-26 09:18:46.208 183087 DEBUG nova.compute.manager [req-cda38754-f542-4567-8d6a-4b0f76993ad2 req-6799e0bd-bf07-4c5c-bae2-9b1f17cbeaea 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Received event network-changed-8b5661b7-6299-4eb2-beee-b95a7484db4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:18:46 compute-1 nova_compute[183083]: 2026-01-26 09:18:46.208 183087 DEBUG nova.compute.manager [req-cda38754-f542-4567-8d6a-4b0f76993ad2 req-6799e0bd-bf07-4c5c-bae2-9b1f17cbeaea 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Refreshing instance network info cache due to event network-changed-8b5661b7-6299-4eb2-beee-b95a7484db4b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 09:18:46 compute-1 nova_compute[183083]: 2026-01-26 09:18:46.209 183087 DEBUG oslo_concurrency.lockutils [req-cda38754-f542-4567-8d6a-4b0f76993ad2 req-6799e0bd-bf07-4c5c-bae2-9b1f17cbeaea 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-e51af20f-0823-4be7-abc0-7ced1570b63a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:18:46 compute-1 nova_compute[183083]: 2026-01-26 09:18:46.263 183087 DEBUG nova.network.neutron [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 09:18:46 compute-1 nova_compute[183083]: 2026-01-26 09:18:46.919 183087 DEBUG nova.network.neutron [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Updating instance_info_cache with network_info: [{"id": "8b5661b7-6299-4eb2-beee-b95a7484db4b", "address": "fa:16:3e:5c:38:37", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b5661b7-62", "ovs_interfaceid": "8b5661b7-6299-4eb2-beee-b95a7484db4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:18:46 compute-1 nova_compute[183083]: 2026-01-26 09:18:46.952 183087 DEBUG oslo_concurrency.lockutils [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] Releasing lock "refresh_cache-e51af20f-0823-4be7-abc0-7ced1570b63a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:18:46 compute-1 nova_compute[183083]: 2026-01-26 09:18:46.953 183087 DEBUG nova.compute.manager [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Instance network_info: |[{"id": "8b5661b7-6299-4eb2-beee-b95a7484db4b", "address": "fa:16:3e:5c:38:37", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b5661b7-62", "ovs_interfaceid": "8b5661b7-6299-4eb2-beee-b95a7484db4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 09:18:46 compute-1 nova_compute[183083]: 2026-01-26 09:18:46.953 183087 DEBUG oslo_concurrency.lockutils [req-cda38754-f542-4567-8d6a-4b0f76993ad2 req-6799e0bd-bf07-4c5c-bae2-9b1f17cbeaea 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-e51af20f-0823-4be7-abc0-7ced1570b63a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:18:46 compute-1 nova_compute[183083]: 2026-01-26 09:18:46.953 183087 DEBUG nova.network.neutron [req-cda38754-f542-4567-8d6a-4b0f76993ad2 req-6799e0bd-bf07-4c5c-bae2-9b1f17cbeaea 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Refreshing network info cache for port 8b5661b7-6299-4eb2-beee-b95a7484db4b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 09:18:46 compute-1 nova_compute[183083]: 2026-01-26 09:18:46.955 183087 INFO nova.compute.manager [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Terminating instance
Jan 26 09:18:46 compute-1 nova_compute[183083]: 2026-01-26 09:18:46.956 183087 DEBUG nova.compute.manager [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 09:18:46 compute-1 nova_compute[183083]: 2026-01-26 09:18:46.960 183087 DEBUG nova.virt.libvirt.driver [-] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527
Jan 26 09:18:46 compute-1 nova_compute[183083]: 2026-01-26 09:18:46.961 183087 INFO nova.virt.libvirt.driver [-] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Instance destroyed successfully.
Jan 26 09:18:46 compute-1 nova_compute[183083]: 2026-01-26 09:18:46.962 183087 DEBUG nova.virt.libvirt.vif [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T09:18:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-750636305',display_name='tempest-server-test-750636305',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-750636305',id=57,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNvicCA36wNZJdXDdcWDimGRZiBucHSj11NZzmfY4dBd+DbjP7Vxcu5dlO2V2k+QrjI0u19rWpFj9Mz1WB/i1o24cUFCFJd+8LOcZyTDqzTjXRENYv0zKNQFBE4DOSWzRA==',key_name='tempest-keypair-test-1269631225',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5c8af329d40f415aba570c04085cb311',ramdisk_id='',reservation_id='r-71ub3d0b',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_gl
ance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-GatewayMtuTestUdp-254629387',owner_user_name='tempest-GatewayMtuTestUdp-254629387-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T09:18:43Z,user_data=None,user_id='ee9591232a0947c4bdc2f7aa67416c26',uuid=e51af20f-0823-4be7-abc0-7ced1570b63a,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8b5661b7-6299-4eb2-beee-b95a7484db4b", "address": "fa:16:3e:5c:38:37", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b5661b7-62", "ovs_interfaceid": "8b5661b7-6299-4eb2-beee-b95a7484db4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 09:18:46 compute-1 nova_compute[183083]: 2026-01-26 09:18:46.962 183087 DEBUG nova.network.os_vif_util [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] Converting VIF {"id": "8b5661b7-6299-4eb2-beee-b95a7484db4b", "address": "fa:16:3e:5c:38:37", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b5661b7-62", "ovs_interfaceid": "8b5661b7-6299-4eb2-beee-b95a7484db4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 09:18:46 compute-1 nova_compute[183083]: 2026-01-26 09:18:46.963 183087 DEBUG nova.network.os_vif_util [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5c:38:37,bridge_name='br-int',has_traffic_filtering=True,id=8b5661b7-6299-4eb2-beee-b95a7484db4b,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b5661b7-62') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 09:18:46 compute-1 nova_compute[183083]: 2026-01-26 09:18:46.963 183087 DEBUG os_vif [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5c:38:37,bridge_name='br-int',has_traffic_filtering=True,id=8b5661b7-6299-4eb2-beee-b95a7484db4b,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b5661b7-62') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 09:18:46 compute-1 nova_compute[183083]: 2026-01-26 09:18:46.964 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:18:46 compute-1 nova_compute[183083]: 2026-01-26 09:18:46.965 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8b5661b7-62, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:18:46 compute-1 nova_compute[183083]: 2026-01-26 09:18:46.965 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 09:18:46 compute-1 nova_compute[183083]: 2026-01-26 09:18:46.970 183087 INFO os_vif [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5c:38:37,bridge_name='br-int',has_traffic_filtering=True,id=8b5661b7-6299-4eb2-beee-b95a7484db4b,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b5661b7-62')
Jan 26 09:18:46 compute-1 nova_compute[183083]: 2026-01-26 09:18:46.971 183087 INFO nova.virt.libvirt.driver [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Deleting instance files /var/lib/nova/instances/e51af20f-0823-4be7-abc0-7ced1570b63a_del
Jan 26 09:18:46 compute-1 nova_compute[183083]: 2026-01-26 09:18:46.971 183087 INFO nova.virt.libvirt.driver [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Deletion of /var/lib/nova/instances/e51af20f-0823-4be7-abc0-7ced1570b63a_del complete
Jan 26 09:18:47 compute-1 nova_compute[183083]: 2026-01-26 09:18:47.032 183087 INFO nova.compute.manager [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Took 0.08 seconds to destroy the instance on the hypervisor.
Jan 26 09:18:47 compute-1 nova_compute[183083]: 2026-01-26 09:18:47.034 183087 DEBUG nova.compute.claims [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Aborting claim: <nova.compute.claims.Claim object at 0x7f6cb8755b50> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85
Jan 26 09:18:47 compute-1 nova_compute[183083]: 2026-01-26 09:18:47.034 183087 DEBUG oslo_concurrency.lockutils [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:18:47 compute-1 nova_compute[183083]: 2026-01-26 09:18:47.034 183087 DEBUG oslo_concurrency.lockutils [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:18:47 compute-1 nova_compute[183083]: 2026-01-26 09:18:47.127 183087 DEBUG nova.compute.provider_tree [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:18:47 compute-1 nova_compute[183083]: 2026-01-26 09:18:47.142 183087 DEBUG nova.scheduler.client.report [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:18:47 compute-1 nova_compute[183083]: 2026-01-26 09:18:47.160 183087 DEBUG oslo_concurrency.lockutils [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:18:47 compute-1 nova_compute[183083]: 2026-01-26 09:18:47.161 183087 DEBUG nova.compute.utils [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430
Jan 26 09:18:47 compute-1 nova_compute[183083]: 2026-01-26 09:18:47.162 183087 ERROR nova.compute.manager [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Build of instance e51af20f-0823-4be7-abc0-7ced1570b63a aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format: nova.exception.BuildAbortException: Build of instance e51af20f-0823-4be7-abc0-7ced1570b63a aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 09:18:47 compute-1 nova_compute[183083]: 2026-01-26 09:18:47.163 183087 DEBUG nova.compute.manager [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976
Jan 26 09:18:47 compute-1 nova_compute[183083]: 2026-01-26 09:18:47.163 183087 DEBUG nova.virt.libvirt.vif [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-26T09:18:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-750636305',display_name='tempest-server-test-750636305',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host=None,hostname='tempest-server-test-750636305',id=57,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNvicCA36wNZJdXDdcWDimGRZiBucHSj11NZzmfY4dBd+DbjP7Vxcu5dlO2V2k+QrjI0u19rWpFj9Mz1WB/i1o24cUFCFJd+8LOcZyTDqzTjXRENYv0zKNQFBE4DOSWzRA==',key_name='tempest-keypair-test-1269631225',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node=None,numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5c8af329d40f415aba570c04085cb311',ramdisk_id='',reservation_id='r-71ub3d0b',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-GatewayMtuTestUdp-254629387',owner_user_name='tempest-GatewayMtuTestUdp-254629387-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T09:18:47Z,user_data=None,user_id='ee9591232a0947c4bdc2f7aa67416c26',uuid=e51af20f-0823-4be7-abc0-7ced1570b63a,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8b5661b7-6299-4eb2-beee-b95a7484db4b", "address": "fa:16:3e:5c:38:37", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b5661b7-62", "ovs_interfaceid": "8b5661b7-6299-4eb2-beee-b95a7484db4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 09:18:47 compute-1 nova_compute[183083]: 2026-01-26 09:18:47.163 183087 DEBUG nova.network.os_vif_util [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] Converting VIF {"id": "8b5661b7-6299-4eb2-beee-b95a7484db4b", "address": "fa:16:3e:5c:38:37", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b5661b7-62", "ovs_interfaceid": "8b5661b7-6299-4eb2-beee-b95a7484db4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 09:18:47 compute-1 nova_compute[183083]: 2026-01-26 09:18:47.164 183087 DEBUG nova.network.os_vif_util [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5c:38:37,bridge_name='br-int',has_traffic_filtering=True,id=8b5661b7-6299-4eb2-beee-b95a7484db4b,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b5661b7-62') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 09:18:47 compute-1 nova_compute[183083]: 2026-01-26 09:18:47.164 183087 DEBUG os_vif [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5c:38:37,bridge_name='br-int',has_traffic_filtering=True,id=8b5661b7-6299-4eb2-beee-b95a7484db4b,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b5661b7-62') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 09:18:47 compute-1 nova_compute[183083]: 2026-01-26 09:18:47.166 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:18:47 compute-1 nova_compute[183083]: 2026-01-26 09:18:47.166 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8b5661b7-62, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:18:47 compute-1 nova_compute[183083]: 2026-01-26 09:18:47.166 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 09:18:47 compute-1 nova_compute[183083]: 2026-01-26 09:18:47.169 183087 INFO os_vif [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5c:38:37,bridge_name='br-int',has_traffic_filtering=True,id=8b5661b7-6299-4eb2-beee-b95a7484db4b,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b5661b7-62')
Jan 26 09:18:47 compute-1 nova_compute[183083]: 2026-01-26 09:18:47.169 183087 DEBUG nova.compute.manager [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012
Jan 26 09:18:47 compute-1 nova_compute[183083]: 2026-01-26 09:18:47.170 183087 DEBUG nova.compute.manager [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 09:18:47 compute-1 nova_compute[183083]: 2026-01-26 09:18:47.170 183087 DEBUG nova.network.neutron [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 09:18:48 compute-1 nova_compute[183083]: 2026-01-26 09:18:48.297 183087 DEBUG nova.network.neutron [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:18:48 compute-1 nova_compute[183083]: 2026-01-26 09:18:48.314 183087 INFO nova.compute.manager [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Took 1.14 seconds to deallocate network for instance.
Jan 26 09:18:48 compute-1 nova_compute[183083]: 2026-01-26 09:18:48.504 183087 INFO nova.scheduler.client.report [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] Deleted allocations for instance e51af20f-0823-4be7-abc0-7ced1570b63a
Jan 26 09:18:48 compute-1 nova_compute[183083]: 2026-01-26 09:18:48.505 183087 DEBUG oslo_concurrency.lockutils [None req-3aa9571e-eae3-4060-ac14-8b6e34a23ea0 ee9591232a0947c4bdc2f7aa67416c26 5c8af329d40f415aba570c04085cb311 - - default default] Lock "e51af20f-0823-4be7-abc0-7ced1570b63a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.055s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:18:48 compute-1 nova_compute[183083]: 2026-01-26 09:18:48.910 183087 DEBUG nova.network.neutron [req-cda38754-f542-4567-8d6a-4b0f76993ad2 req-6799e0bd-bf07-4c5c-bae2-9b1f17cbeaea 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Updated VIF entry in instance network info cache for port 8b5661b7-6299-4eb2-beee-b95a7484db4b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 09:18:48 compute-1 nova_compute[183083]: 2026-01-26 09:18:48.911 183087 DEBUG nova.network.neutron [req-cda38754-f542-4567-8d6a-4b0f76993ad2 req-6799e0bd-bf07-4c5c-bae2-9b1f17cbeaea 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: e51af20f-0823-4be7-abc0-7ced1570b63a] Updating instance_info_cache with network_info: [{"id": "8b5661b7-6299-4eb2-beee-b95a7484db4b", "address": "fa:16:3e:5c:38:37", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b5661b7-62", "ovs_interfaceid": "8b5661b7-6299-4eb2-beee-b95a7484db4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:18:48 compute-1 nova_compute[183083]: 2026-01-26 09:18:48.932 183087 DEBUG oslo_concurrency.lockutils [req-cda38754-f542-4567-8d6a-4b0f76993ad2 req-6799e0bd-bf07-4c5c-bae2-9b1f17cbeaea 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-e51af20f-0823-4be7-abc0-7ced1570b63a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:18:48 compute-1 nova_compute[183083]: 2026-01-26 09:18:48.985 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:18:50 compute-1 nova_compute[183083]: 2026-01-26 09:18:50.719 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:18:54 compute-1 nova_compute[183083]: 2026-01-26 09:18:54.007 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:18:55 compute-1 nova_compute[183083]: 2026-01-26 09:18:55.722 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:18:59 compute-1 nova_compute[183083]: 2026-01-26 09:18:59.009 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:18:59 compute-1 podman[228847]: 2026-01-26 09:18:59.811604934 +0000 UTC m=+0.068826517 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 26 09:19:00 compute-1 nova_compute[183083]: 2026-01-26 09:19:00.725 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:19:04 compute-1 nova_compute[183083]: 2026-01-26 09:19:04.010 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:19:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:19:05.337 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:19:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:19:05.338 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:19:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:19:05.338 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:19:05 compute-1 nova_compute[183083]: 2026-01-26 09:19:05.727 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:19:09 compute-1 nova_compute[183083]: 2026-01-26 09:19:09.011 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:19:09 compute-1 ovn_controller[95352]: 2026-01-26T09:19:09Z|00349|pinctrl|WARN|Dropped 369 log messages in last 66 seconds (most recently, 8 seconds ago) due to excessive rate
Jan 26 09:19:09 compute-1 ovn_controller[95352]: 2026-01-26T09:19:09Z|00350|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:19:10 compute-1 nova_compute[183083]: 2026-01-26 09:19:10.729 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:19:14 compute-1 nova_compute[183083]: 2026-01-26 09:19:14.013 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:19:15 compute-1 nova_compute[183083]: 2026-01-26 09:19:15.731 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:19:16 compute-1 podman[228880]: 2026-01-26 09:19:16.875788583 +0000 UTC m=+0.111415161 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 09:19:16 compute-1 podman[228873]: 2026-01-26 09:19:16.876435302 +0000 UTC m=+0.120408613 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, vendor=Red Hat, Inc., config_id=openstack_network_exporter, container_name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.33.7, name=ubi9-minimal, io.openshift.tags=minimal rhel9)
Jan 26 09:19:16 compute-1 podman[228878]: 2026-01-26 09:19:16.881808098 +0000 UTC m=+0.127543380 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 26 09:19:16 compute-1 podman[228872]: 2026-01-26 09:19:16.884419883 +0000 UTC m=+0.131988438 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 09:19:16 compute-1 podman[228871]: 2026-01-26 09:19:16.897883684 +0000 UTC m=+0.159900458 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 09:19:19 compute-1 nova_compute[183083]: 2026-01-26 09:19:19.016 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:19:20 compute-1 nova_compute[183083]: 2026-01-26 09:19:20.733 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:19:24 compute-1 nova_compute[183083]: 2026-01-26 09:19:24.017 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:19:25 compute-1 nova_compute[183083]: 2026-01-26 09:19:25.735 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:19:26 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:19:26.128 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:19:26 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:19:26.129 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:19:26 compute-1 nova_compute[183083]: 2026-01-26 09:19:26.172 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:19:26 compute-1 sshd-session[228979]: Connection closed by authenticating user root 178.62.249.31 port 34748 [preauth]
Jan 26 09:19:29 compute-1 nova_compute[183083]: 2026-01-26 09:19:29.019 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:19:29 compute-1 nova_compute[183083]: 2026-01-26 09:19:29.310 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:19:29 compute-1 sshd-session[228981]: Invalid user sol from 2.57.122.238 port 46208
Jan 26 09:19:29 compute-1 sshd-session[228981]: Connection closed by invalid user sol 2.57.122.238 port 46208 [preauth]
Jan 26 09:19:29 compute-1 nova_compute[183083]: 2026-01-26 09:19:29.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:19:29 compute-1 nova_compute[183083]: 2026-01-26 09:19:29.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:19:29 compute-1 nova_compute[183083]: 2026-01-26 09:19:29.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:19:29 compute-1 nova_compute[183083]: 2026-01-26 09:19:29.967 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 09:19:30 compute-1 nova_compute[183083]: 2026-01-26 09:19:30.738 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:19:30 compute-1 podman[228983]: 2026-01-26 09:19:30.789209237 +0000 UTC m=+0.052828472 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 26 09:19:30 compute-1 nova_compute[183083]: 2026-01-26 09:19:30.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:19:31 compute-1 nova_compute[183083]: 2026-01-26 09:19:31.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:19:31 compute-1 nova_compute[183083]: 2026-01-26 09:19:31.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:19:32 compute-1 nova_compute[183083]: 2026-01-26 09:19:32.946 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:19:32 compute-1 nova_compute[183083]: 2026-01-26 09:19:32.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:19:34 compute-1 nova_compute[183083]: 2026-01-26 09:19:34.021 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:19:34 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:19:34.131 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:19:35 compute-1 nova_compute[183083]: 2026-01-26 09:19:35.741 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:19:35 compute-1 nova_compute[183083]: 2026-01-26 09:19:35.946 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:19:38 compute-1 nova_compute[183083]: 2026-01-26 09:19:38.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:19:38 compute-1 nova_compute[183083]: 2026-01-26 09:19:38.951 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:19:39 compute-1 nova_compute[183083]: 2026-01-26 09:19:39.023 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:19:40 compute-1 nova_compute[183083]: 2026-01-26 09:19:40.743 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:19:43 compute-1 nova_compute[183083]: 2026-01-26 09:19:43.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:19:43 compute-1 nova_compute[183083]: 2026-01-26 09:19:43.986 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:19:43 compute-1 nova_compute[183083]: 2026-01-26 09:19:43.986 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:19:43 compute-1 nova_compute[183083]: 2026-01-26 09:19:43.987 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:19:43 compute-1 nova_compute[183083]: 2026-01-26 09:19:43.987 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:19:44 compute-1 nova_compute[183083]: 2026-01-26 09:19:44.024 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:19:44 compute-1 nova_compute[183083]: 2026-01-26 09:19:44.274 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:19:44 compute-1 nova_compute[183083]: 2026-01-26 09:19:44.276 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13644MB free_disk=113.08383178710938GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:19:44 compute-1 nova_compute[183083]: 2026-01-26 09:19:44.276 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:19:44 compute-1 nova_compute[183083]: 2026-01-26 09:19:44.277 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:19:44 compute-1 nova_compute[183083]: 2026-01-26 09:19:44.355 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:19:44 compute-1 nova_compute[183083]: 2026-01-26 09:19:44.356 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:19:44 compute-1 nova_compute[183083]: 2026-01-26 09:19:44.387 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:19:44 compute-1 nova_compute[183083]: 2026-01-26 09:19:44.409 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:19:44 compute-1 nova_compute[183083]: 2026-01-26 09:19:44.444 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:19:44 compute-1 nova_compute[183083]: 2026-01-26 09:19:44.445 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:19:45 compute-1 nova_compute[183083]: 2026-01-26 09:19:45.745 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:19:46 compute-1 sshd-session[229007]: Accepted publickey for zuul from 38.102.83.66 port 55044 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:19:46 compute-1 systemd-logind[788]: New session 139 of user zuul.
Jan 26 09:19:46 compute-1 systemd[1]: Started Session 139 of User zuul.
Jan 26 09:19:46 compute-1 sshd-session[229007]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:19:46 compute-1 sshd-session[229010]: Connection closed by 38.102.83.66 port 55044
Jan 26 09:19:46 compute-1 sshd-session[229007]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:19:46 compute-1 systemd[1]: session-139.scope: Deactivated successfully.
Jan 26 09:19:46 compute-1 systemd-logind[788]: Session 139 logged out. Waiting for processes to exit.
Jan 26 09:19:46 compute-1 systemd-logind[788]: Removed session 139.
Jan 26 09:19:47 compute-1 podman[229043]: 2026-01-26 09:19:47.836262559 +0000 UTC m=+0.073430851 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 09:19:47 compute-1 podman[229035]: 2026-01-26 09:19:47.848418341 +0000 UTC m=+0.097556310 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute)
Jan 26 09:19:47 compute-1 podman[229037]: 2026-01-26 09:19:47.85733915 +0000 UTC m=+0.096410917 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 26 09:19:47 compute-1 podman[229036]: 2026-01-26 09:19:47.857819284 +0000 UTC m=+0.100380902 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, architecture=x86_64, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, version=9.6, managed_by=edpm_ansible, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc.)
Jan 26 09:19:47 compute-1 podman[229034]: 2026-01-26 09:19:47.878852334 +0000 UTC m=+0.133163713 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, config_id=ovn_controller, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 26 09:19:49 compute-1 nova_compute[183083]: 2026-01-26 09:19:49.026 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:19:50 compute-1 nova_compute[183083]: 2026-01-26 09:19:50.748 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:19:54 compute-1 nova_compute[183083]: 2026-01-26 09:19:54.029 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:19:55 compute-1 nova_compute[183083]: 2026-01-26 09:19:55.749 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:19:59 compute-1 nova_compute[183083]: 2026-01-26 09:19:59.031 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:20:00 compute-1 nova_compute[183083]: 2026-01-26 09:20:00.752 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:20:01 compute-1 podman[229137]: 2026-01-26 09:20:01.819544379 +0000 UTC m=+0.076345725 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 26 09:20:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:20:03.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:20:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:20:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:20:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:20:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:20:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:20:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:20:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:20:03.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:20:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:20:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:20:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:20:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:20:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:20:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:20:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:20:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:20:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:20:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:20:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:20:03.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:20:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:20:03.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:20:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:20:03.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:20:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:20:03.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:20:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:20:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:20:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:20:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:20:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:20:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:20:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:20:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:20:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:20:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:20:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:20:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:20:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:20:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:20:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:20:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:20:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:20:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:20:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:20:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:20:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:20:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:20:04 compute-1 nova_compute[183083]: 2026-01-26 09:20:04.033 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:20:04 compute-1 ovn_controller[95352]: 2026-01-26T09:20:04Z|00351|pinctrl|WARN|Dropped 369 log messages in last 55 seconds (most recently, 6 seconds ago) due to excessive rate
Jan 26 09:20:04 compute-1 ovn_controller[95352]: 2026-01-26T09:20:04Z|00352|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:20:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:20:05.338 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:20:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:20:05.339 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:20:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:20:05.339 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:20:05 compute-1 nova_compute[183083]: 2026-01-26 09:20:05.755 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:20:09 compute-1 nova_compute[183083]: 2026-01-26 09:20:09.036 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:20:10 compute-1 nova_compute[183083]: 2026-01-26 09:20:10.757 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:20:14 compute-1 nova_compute[183083]: 2026-01-26 09:20:14.038 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:20:15 compute-1 nova_compute[183083]: 2026-01-26 09:20:15.758 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:20:18 compute-1 podman[229167]: 2026-01-26 09:20:18.842016199 +0000 UTC m=+0.087682683 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Jan 26 09:20:18 compute-1 podman[229168]: 2026-01-26 09:20:18.844531142 +0000 UTC m=+0.083827632 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 09:20:18 compute-1 podman[229165]: 2026-01-26 09:20:18.846946882 +0000 UTC m=+0.099436384 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 26 09:20:18 compute-1 podman[229166]: 2026-01-26 09:20:18.866771867 +0000 UTC m=+0.117747945 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 
9., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, version=9.6, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350, vcs-type=git, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc.)
Jan 26 09:20:18 compute-1 podman[229164]: 2026-01-26 09:20:18.866953612 +0000 UTC m=+0.123340817 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 09:20:19 compute-1 nova_compute[183083]: 2026-01-26 09:20:19.040 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:20:19 compute-1 sshd-session[229264]: Connection closed by authenticating user root 178.62.249.31 port 35500 [preauth]
Jan 26 09:20:20 compute-1 nova_compute[183083]: 2026-01-26 09:20:20.761 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:20:24 compute-1 nova_compute[183083]: 2026-01-26 09:20:24.042 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:20:25 compute-1 nova_compute[183083]: 2026-01-26 09:20:25.763 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:20:26 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:20:26.304 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:20:26 compute-1 nova_compute[183083]: 2026-01-26 09:20:26.305 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:20:26 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:20:26.306 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:20:29 compute-1 nova_compute[183083]: 2026-01-26 09:20:29.089 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:20:30 compute-1 nova_compute[183083]: 2026-01-26 09:20:30.764 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:20:31 compute-1 nova_compute[183083]: 2026-01-26 09:20:31.446 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:20:31 compute-1 nova_compute[183083]: 2026-01-26 09:20:31.446 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:20:31 compute-1 nova_compute[183083]: 2026-01-26 09:20:31.447 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:20:31 compute-1 nova_compute[183083]: 2026-01-26 09:20:31.465 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 09:20:31 compute-1 nova_compute[183083]: 2026-01-26 09:20:31.465 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:20:32 compute-1 podman[229266]: 2026-01-26 09:20:32.834987773 +0000 UTC m=+0.086257603 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 26 09:20:32 compute-1 nova_compute[183083]: 2026-01-26 09:20:32.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:20:32 compute-1 nova_compute[183083]: 2026-01-26 09:20:32.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:20:32 compute-1 nova_compute[183083]: 2026-01-26 09:20:32.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:20:32 compute-1 nova_compute[183083]: 2026-01-26 09:20:32.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:20:32 compute-1 nova_compute[183083]: 2026-01-26 09:20:32.953 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:20:34 compute-1 nova_compute[183083]: 2026-01-26 09:20:34.090 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:20:35 compute-1 nova_compute[183083]: 2026-01-26 09:20:35.768 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:20:36 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:20:36.308 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:20:38 compute-1 nova_compute[183083]: 2026-01-26 09:20:38.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:20:38 compute-1 nova_compute[183083]: 2026-01-26 09:20:38.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:20:39 compute-1 nova_compute[183083]: 2026-01-26 09:20:39.091 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:20:39 compute-1 sshd-session[229289]: Accepted publickey for zuul from 38.102.83.66 port 38876 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:20:39 compute-1 systemd-logind[788]: New session 140 of user zuul.
Jan 26 09:20:39 compute-1 systemd[1]: Started Session 140 of User zuul.
Jan 26 09:20:39 compute-1 sshd-session[229289]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:20:39 compute-1 sshd-session[229292]: Connection closed by 38.102.83.66 port 38876
Jan 26 09:20:39 compute-1 sshd-session[229289]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:20:39 compute-1 systemd[1]: session-140.scope: Deactivated successfully.
Jan 26 09:20:39 compute-1 systemd-logind[788]: Session 140 logged out. Waiting for processes to exit.
Jan 26 09:20:39 compute-1 systemd-logind[788]: Removed session 140.
Jan 26 09:20:40 compute-1 nova_compute[183083]: 2026-01-26 09:20:40.769 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:20:43 compute-1 nova_compute[183083]: 2026-01-26 09:20:43.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:20:43 compute-1 nova_compute[183083]: 2026-01-26 09:20:43.979 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:20:43 compute-1 nova_compute[183083]: 2026-01-26 09:20:43.980 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:20:43 compute-1 nova_compute[183083]: 2026-01-26 09:20:43.981 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:20:43 compute-1 nova_compute[183083]: 2026-01-26 09:20:43.981 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:20:44 compute-1 nova_compute[183083]: 2026-01-26 09:20:44.092 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:20:44 compute-1 nova_compute[183083]: 2026-01-26 09:20:44.138 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:20:44 compute-1 nova_compute[183083]: 2026-01-26 09:20:44.139 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13640MB free_disk=113.08383178710938GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:20:44 compute-1 nova_compute[183083]: 2026-01-26 09:20:44.139 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:20:44 compute-1 nova_compute[183083]: 2026-01-26 09:20:44.139 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:20:44 compute-1 nova_compute[183083]: 2026-01-26 09:20:44.303 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:20:44 compute-1 nova_compute[183083]: 2026-01-26 09:20:44.304 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:20:44 compute-1 nova_compute[183083]: 2026-01-26 09:20:44.318 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Refreshing inventories for resource provider 5203935e-446c-4e03-93fa-4c60d651e045 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 26 09:20:44 compute-1 nova_compute[183083]: 2026-01-26 09:20:44.341 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Updating ProviderTree inventory for provider 5203935e-446c-4e03-93fa-4c60d651e045 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 26 09:20:44 compute-1 nova_compute[183083]: 2026-01-26 09:20:44.342 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Updating inventory in ProviderTree for provider 5203935e-446c-4e03-93fa-4c60d651e045 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 26 09:20:44 compute-1 nova_compute[183083]: 2026-01-26 09:20:44.371 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Refreshing aggregate associations for resource provider 5203935e-446c-4e03-93fa-4c60d651e045, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 26 09:20:44 compute-1 nova_compute[183083]: 2026-01-26 09:20:44.412 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Refreshing trait associations for resource provider 5203935e-446c-4e03-93fa-4c60d651e045, traits: COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE4A,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SVM,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AVX,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX2,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_FMA3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,HW_CPU_X86_SSE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE41,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_BMI,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AESNI,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 26 09:20:44 compute-1 nova_compute[183083]: 2026-01-26 09:20:44.433 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:20:44 compute-1 nova_compute[183083]: 2026-01-26 09:20:44.456 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:20:44 compute-1 nova_compute[183083]: 2026-01-26 09:20:44.457 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:20:44 compute-1 nova_compute[183083]: 2026-01-26 09:20:44.458 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.318s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:20:45 compute-1 nova_compute[183083]: 2026-01-26 09:20:45.772 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:20:49 compute-1 nova_compute[183083]: 2026-01-26 09:20:49.094 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:20:49 compute-1 podman[229317]: 2026-01-26 09:20:49.830090956 +0000 UTC m=+0.083608075 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 26 09:20:49 compute-1 podman[229324]: 2026-01-26 09:20:49.852430164 +0000 UTC m=+0.096885640 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 26 09:20:49 compute-1 podman[229316]: 2026-01-26 09:20:49.857175742 +0000 UTC m=+0.116911652 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 26 09:20:49 compute-1 podman[229318]: 2026-01-26 09:20:49.857301375 +0000 UTC m=+0.096923991 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, distribution-scope=public, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, version=9.6)
Jan 26 09:20:49 compute-1 podman[229330]: 2026-01-26 09:20:49.867922903 +0000 UTC m=+0.105766478 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 09:20:50 compute-1 nova_compute[183083]: 2026-01-26 09:20:50.811 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:20:54 compute-1 nova_compute[183083]: 2026-01-26 09:20:54.097 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:20:55 compute-1 nova_compute[183083]: 2026-01-26 09:20:55.850 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:20:59 compute-1 nova_compute[183083]: 2026-01-26 09:20:59.099 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:21:00 compute-1 nova_compute[183083]: 2026-01-26 09:21:00.965 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:21:03 compute-1 podman[229419]: 2026-01-26 09:21:03.797052436 +0000 UTC m=+0.052774051 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 26 09:21:04 compute-1 nova_compute[183083]: 2026-01-26 09:21:04.100 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:21:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:21:05.339 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:21:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:21:05.340 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:21:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:21:05.340 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:21:05 compute-1 nova_compute[183083]: 2026-01-26 09:21:05.967 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:21:06 compute-1 ovn_controller[95352]: 2026-01-26T09:21:06Z|00353|pinctrl|WARN|Dropped 345 log messages in last 61 seconds (most recently, 4 seconds ago) due to excessive rate
Jan 26 09:21:06 compute-1 ovn_controller[95352]: 2026-01-26T09:21:06Z|00354|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:21:06 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:21:06.238 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=38, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=37) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:21:06 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:21:06.239 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:21:06 compute-1 nova_compute[183083]: 2026-01-26 09:21:06.271 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:21:09 compute-1 nova_compute[183083]: 2026-01-26 09:21:09.102 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:21:10 compute-1 nova_compute[183083]: 2026-01-26 09:21:10.969 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:21:11 compute-1 sshd-session[229444]: Connection closed by authenticating user root 178.62.249.31 port 36222 [preauth]
Jan 26 09:21:11 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:21:11.241 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '38'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:21:14 compute-1 nova_compute[183083]: 2026-01-26 09:21:14.103 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:21:15 compute-1 nova_compute[183083]: 2026-01-26 09:21:15.971 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:21:16 compute-1 sshd-session[229446]: Accepted publickey for zuul from 38.102.83.66 port 48936 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:21:16 compute-1 systemd-logind[788]: New session 141 of user zuul.
Jan 26 09:21:16 compute-1 systemd[1]: Started Session 141 of User zuul.
Jan 26 09:21:16 compute-1 sshd-session[229446]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:21:16 compute-1 sshd-session[229449]: Connection closed by 38.102.83.66 port 48936
Jan 26 09:21:16 compute-1 sshd-session[229446]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:21:16 compute-1 systemd[1]: session-141.scope: Deactivated successfully.
Jan 26 09:21:16 compute-1 systemd-logind[788]: Session 141 logged out. Waiting for processes to exit.
Jan 26 09:21:16 compute-1 systemd-logind[788]: Removed session 141.
Jan 26 09:21:19 compute-1 nova_compute[183083]: 2026-01-26 09:21:19.105 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:21:20 compute-1 podman[229474]: 2026-01-26 09:21:20.801003314 +0000 UTC m=+0.063402699 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 26 09:21:20 compute-1 podman[229475]: 2026-01-26 09:21:20.806954017 +0000 UTC m=+0.067126128 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base 
Image 9 Minimal, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container)
Jan 26 09:21:20 compute-1 podman[229476]: 2026-01-26 09:21:20.810968473 +0000 UTC m=+0.064859782 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 09:21:20 compute-1 podman[229482]: 2026-01-26 09:21:20.811295403 +0000 UTC m=+0.063146462 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 09:21:20 compute-1 podman[229473]: 2026-01-26 09:21:20.827973616 +0000 UTC m=+0.093015578 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 26 09:21:20 compute-1 nova_compute[183083]: 2026-01-26 09:21:20.972 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:21:24 compute-1 nova_compute[183083]: 2026-01-26 09:21:24.106 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:21:25 compute-1 nova_compute[183083]: 2026-01-26 09:21:25.973 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:21:29 compute-1 nova_compute[183083]: 2026-01-26 09:21:29.108 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:21:30 compute-1 nova_compute[183083]: 2026-01-26 09:21:30.976 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:21:31 compute-1 nova_compute[183083]: 2026-01-26 09:21:31.458 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:21:31 compute-1 nova_compute[183083]: 2026-01-26 09:21:31.458 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:21:31 compute-1 nova_compute[183083]: 2026-01-26 09:21:31.459 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:21:31 compute-1 nova_compute[183083]: 2026-01-26 09:21:31.475 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 09:21:32 compute-1 nova_compute[183083]: 2026-01-26 09:21:32.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:21:32 compute-1 nova_compute[183083]: 2026-01-26 09:21:32.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:21:32 compute-1 nova_compute[183083]: 2026-01-26 09:21:32.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:21:32 compute-1 nova_compute[183083]: 2026-01-26 09:21:32.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:21:33 compute-1 nova_compute[183083]: 2026-01-26 09:21:33.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:21:34 compute-1 nova_compute[183083]: 2026-01-26 09:21:34.111 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:21:34 compute-1 podman[229578]: 2026-01-26 09:21:34.84920787 +0000 UTC m=+0.105549163 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 09:21:34 compute-1 nova_compute[183083]: 2026-01-26 09:21:34.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:21:35 compute-1 nova_compute[183083]: 2026-01-26 09:21:35.977 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:21:38 compute-1 sshd-session[229605]: Invalid user sol from 2.57.122.238 port 43972
Jan 26 09:21:38 compute-1 sshd-session[229605]: Connection closed by invalid user sol 2.57.122.238 port 43972 [preauth]
Jan 26 09:21:39 compute-1 nova_compute[183083]: 2026-01-26 09:21:39.113 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:21:39 compute-1 nova_compute[183083]: 2026-01-26 09:21:39.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:21:39 compute-1 nova_compute[183083]: 2026-01-26 09:21:39.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:21:40 compute-1 nova_compute[183083]: 2026-01-26 09:21:40.948 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:21:40 compute-1 nova_compute[183083]: 2026-01-26 09:21:40.979 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:21:44 compute-1 nova_compute[183083]: 2026-01-26 09:21:44.115 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:21:45 compute-1 nova_compute[183083]: 2026-01-26 09:21:45.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:21:45 compute-1 nova_compute[183083]: 2026-01-26 09:21:45.978 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:21:45 compute-1 nova_compute[183083]: 2026-01-26 09:21:45.978 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:21:45 compute-1 nova_compute[183083]: 2026-01-26 09:21:45.978 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:21:45 compute-1 nova_compute[183083]: 2026-01-26 09:21:45.979 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:21:45 compute-1 nova_compute[183083]: 2026-01-26 09:21:45.981 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:21:46 compute-1 nova_compute[183083]: 2026-01-26 09:21:46.113 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:21:46 compute-1 nova_compute[183083]: 2026-01-26 09:21:46.114 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13649MB free_disk=113.08382797241211GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:21:46 compute-1 nova_compute[183083]: 2026-01-26 09:21:46.114 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:21:46 compute-1 nova_compute[183083]: 2026-01-26 09:21:46.114 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:21:46 compute-1 nova_compute[183083]: 2026-01-26 09:21:46.201 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:21:46 compute-1 nova_compute[183083]: 2026-01-26 09:21:46.201 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:21:46 compute-1 nova_compute[183083]: 2026-01-26 09:21:46.220 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:21:46 compute-1 nova_compute[183083]: 2026-01-26 09:21:46.233 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:21:46 compute-1 nova_compute[183083]: 2026-01-26 09:21:46.234 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:21:46 compute-1 nova_compute[183083]: 2026-01-26 09:21:46.234 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:21:46 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:21:46.677 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=39, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=38) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:21:46 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:21:46.677 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:21:46 compute-1 nova_compute[183083]: 2026-01-26 09:21:46.708 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:21:49 compute-1 nova_compute[183083]: 2026-01-26 09:21:49.117 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:21:50 compute-1 nova_compute[183083]: 2026-01-26 09:21:50.983 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:21:51 compute-1 podman[229609]: 2026-01-26 09:21:51.819752082 +0000 UTC m=+0.072901400 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.buildah.version=1.33.7, managed_by=edpm_ansible, vendor=Red Hat, Inc., distribution-scope=public, version=9.6)
Jan 26 09:21:51 compute-1 podman[229616]: 2026-01-26 09:21:51.819973939 +0000 UTC m=+0.065914673 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 09:21:51 compute-1 podman[229610]: 2026-01-26 09:21:51.819957968 +0000 UTC m=+0.066592672 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 09:21:51 compute-1 podman[229607]: 2026-01-26 09:21:51.834106238 +0000 UTC m=+0.095488788 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, 
container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 26 09:21:51 compute-1 podman[229608]: 2026-01-26 09:21:51.849994377 +0000 UTC m=+0.103750712 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 26 09:21:52 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:21:52.678 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '39'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:21:54 compute-1 nova_compute[183083]: 2026-01-26 09:21:54.119 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:21:55 compute-1 nova_compute[183083]: 2026-01-26 09:21:55.984 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:21:59 compute-1 nova_compute[183083]: 2026-01-26 09:21:59.123 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:00 compute-1 nova_compute[183083]: 2026-01-26 09:22:00.986 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:01 compute-1 sshd-session[229711]: Connection closed by authenticating user root 178.62.249.31 port 38254 [preauth]
Jan 26 09:22:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:22:03.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:22:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:22:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:22:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:22:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:22:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:22:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:22:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:22:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:22:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:22:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:22:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:22:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:22:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:22:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:22:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:22:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:22:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:22:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:22:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:22:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:22:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:22:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:22:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:22:03.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:22:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:22:03.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:22:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:22:03.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:22:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:22:03.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:22:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:22:03.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:22:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:22:03.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:22:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:22:03.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:22:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:22:03.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:22:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:22:03.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:22:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:22:03.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:22:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:22:03.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:22:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:22:03.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:22:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:22:03.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:22:04 compute-1 nova_compute[183083]: 2026-01-26 09:22:04.169 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:04 compute-1 ovn_controller[95352]: 2026-01-26T09:22:04Z|00355|pinctrl|WARN|Dropped 441 log messages in last 59 seconds (most recently, 2 seconds ago) due to excessive rate
Jan 26 09:22:04 compute-1 ovn_controller[95352]: 2026-01-26T09:22:04Z|00356|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:22:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:22:05.340 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:22:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:22:05.341 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:22:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:22:05.341 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:22:05 compute-1 podman[229713]: 2026-01-26 09:22:05.789328466 +0000 UTC m=+0.054897041 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 26 09:22:05 compute-1 nova_compute[183083]: 2026-01-26 09:22:05.988 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:09 compute-1 nova_compute[183083]: 2026-01-26 09:22:09.171 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:10 compute-1 nova_compute[183083]: 2026-01-26 09:22:10.990 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:14 compute-1 nova_compute[183083]: 2026-01-26 09:22:14.173 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:15 compute-1 nova_compute[183083]: 2026-01-26 09:22:15.992 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:19 compute-1 nova_compute[183083]: 2026-01-26 09:22:19.175 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:20 compute-1 nova_compute[183083]: 2026-01-26 09:22:20.993 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:22 compute-1 podman[229738]: 2026-01-26 09:22:22.812008012 +0000 UTC m=+0.070810241 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 26 09:22:22 compute-1 podman[229745]: 2026-01-26 09:22:22.818390242 +0000 UTC m=+0.064958975 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:22:22 compute-1 podman[229751]: 2026-01-26 09:22:22.818389972 +0000 UTC m=+0.061683293 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 09:22:22 compute-1 podman[229739]: 2026-01-26 09:22:22.866784139 +0000 UTC m=+0.122106990 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, name=ubi9-minimal, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 09:22:22 compute-1 podman[229737]: 2026-01-26 09:22:22.883944564 +0000 UTC m=+0.149002120 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 26 09:22:24 compute-1 nova_compute[183083]: 2026-01-26 09:22:24.178 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:25 compute-1 nova_compute[183083]: 2026-01-26 09:22:25.995 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:29 compute-1 nova_compute[183083]: 2026-01-26 09:22:29.215 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:30 compute-1 nova_compute[183083]: 2026-01-26 09:22:30.997 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:31 compute-1 nova_compute[183083]: 2026-01-26 09:22:31.235 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:22:31 compute-1 nova_compute[183083]: 2026-01-26 09:22:31.236 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:22:31 compute-1 nova_compute[183083]: 2026-01-26 09:22:31.236 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:22:31 compute-1 nova_compute[183083]: 2026-01-26 09:22:31.248 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 09:22:32 compute-1 nova_compute[183083]: 2026-01-26 09:22:32.400 183087 DEBUG oslo_concurrency.lockutils [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Acquiring lock "b77705ce-8f65-4ffd-8131-b8526e3f84be" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:22:32 compute-1 nova_compute[183083]: 2026-01-26 09:22:32.401 183087 DEBUG oslo_concurrency.lockutils [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Lock "b77705ce-8f65-4ffd-8131-b8526e3f84be" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:22:32 compute-1 nova_compute[183083]: 2026-01-26 09:22:32.523 183087 DEBUG nova.compute.manager [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 09:22:32 compute-1 nova_compute[183083]: 2026-01-26 09:22:32.623 183087 DEBUG oslo_concurrency.lockutils [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:22:32 compute-1 nova_compute[183083]: 2026-01-26 09:22:32.623 183087 DEBUG oslo_concurrency.lockutils [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:22:32 compute-1 nova_compute[183083]: 2026-01-26 09:22:32.630 183087 DEBUG nova.virt.hardware [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 09:22:32 compute-1 nova_compute[183083]: 2026-01-26 09:22:32.631 183087 INFO nova.compute.claims [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Claim successful on node compute-1.ctlplane.example.com
Jan 26 09:22:32 compute-1 nova_compute[183083]: 2026-01-26 09:22:32.800 183087 DEBUG nova.compute.provider_tree [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:22:32 compute-1 nova_compute[183083]: 2026-01-26 09:22:32.817 183087 DEBUG nova.scheduler.client.report [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:22:32 compute-1 nova_compute[183083]: 2026-01-26 09:22:32.862 183087 DEBUG oslo_concurrency.lockutils [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.238s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:22:32 compute-1 nova_compute[183083]: 2026-01-26 09:22:32.863 183087 DEBUG nova.compute.manager [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 09:22:32 compute-1 nova_compute[183083]: 2026-01-26 09:22:32.905 183087 DEBUG nova.compute.manager [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 09:22:32 compute-1 nova_compute[183083]: 2026-01-26 09:22:32.905 183087 DEBUG nova.network.neutron [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 09:22:32 compute-1 nova_compute[183083]: 2026-01-26 09:22:32.928 183087 INFO nova.virt.libvirt.driver [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 09:22:32 compute-1 nova_compute[183083]: 2026-01-26 09:22:32.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:22:32 compute-1 nova_compute[183083]: 2026-01-26 09:22:32.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:22:32 compute-1 nova_compute[183083]: 2026-01-26 09:22:32.953 183087 DEBUG nova.compute.manager [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 09:22:33 compute-1 nova_compute[183083]: 2026-01-26 09:22:33.072 183087 DEBUG nova.compute.manager [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 09:22:33 compute-1 nova_compute[183083]: 2026-01-26 09:22:33.073 183087 DEBUG nova.virt.libvirt.driver [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 09:22:33 compute-1 nova_compute[183083]: 2026-01-26 09:22:33.074 183087 INFO nova.virt.libvirt.driver [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Creating image(s)
Jan 26 09:22:33 compute-1 nova_compute[183083]: 2026-01-26 09:22:33.074 183087 DEBUG oslo_concurrency.lockutils [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Acquiring lock "/var/lib/nova/instances/b77705ce-8f65-4ffd-8131-b8526e3f84be/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:22:33 compute-1 nova_compute[183083]: 2026-01-26 09:22:33.074 183087 DEBUG oslo_concurrency.lockutils [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Lock "/var/lib/nova/instances/b77705ce-8f65-4ffd-8131-b8526e3f84be/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:22:33 compute-1 nova_compute[183083]: 2026-01-26 09:22:33.075 183087 DEBUG oslo_concurrency.lockutils [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Lock "/var/lib/nova/instances/b77705ce-8f65-4ffd-8131-b8526e3f84be/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:22:33 compute-1 nova_compute[183083]: 2026-01-26 09:22:33.087 183087 DEBUG oslo_concurrency.processutils [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:22:33 compute-1 nova_compute[183083]: 2026-01-26 09:22:33.153 183087 DEBUG oslo_concurrency.processutils [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:22:33 compute-1 nova_compute[183083]: 2026-01-26 09:22:33.154 183087 DEBUG oslo_concurrency.lockutils [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Acquiring lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:22:33 compute-1 nova_compute[183083]: 2026-01-26 09:22:33.155 183087 DEBUG oslo_concurrency.lockutils [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:22:33 compute-1 nova_compute[183083]: 2026-01-26 09:22:33.166 183087 DEBUG oslo_concurrency.processutils [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:22:33 compute-1 nova_compute[183083]: 2026-01-26 09:22:33.231 183087 DEBUG oslo_concurrency.processutils [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:22:33 compute-1 nova_compute[183083]: 2026-01-26 09:22:33.233 183087 DEBUG oslo_concurrency.processutils [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/b77705ce-8f65-4ffd-8131-b8526e3f84be/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:22:33 compute-1 nova_compute[183083]: 2026-01-26 09:22:33.298 183087 DEBUG oslo_concurrency.processutils [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/b77705ce-8f65-4ffd-8131-b8526e3f84be/disk 1073741824" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:22:33 compute-1 nova_compute[183083]: 2026-01-26 09:22:33.299 183087 DEBUG oslo_concurrency.lockutils [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:22:33 compute-1 nova_compute[183083]: 2026-01-26 09:22:33.300 183087 DEBUG oslo_concurrency.processutils [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:22:33 compute-1 nova_compute[183083]: 2026-01-26 09:22:33.346 183087 DEBUG nova.policy [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '984261c8d6c2480eb7dcce1e8474cecf', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '930885dc59f549af8c627aca841fd798', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 09:22:33 compute-1 nova_compute[183083]: 2026-01-26 09:22:33.367 183087 DEBUG oslo_concurrency.processutils [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:22:33 compute-1 nova_compute[183083]: 2026-01-26 09:22:33.368 183087 DEBUG nova.virt.disk.api [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Checking if we can resize image /var/lib/nova/instances/b77705ce-8f65-4ffd-8131-b8526e3f84be/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 26 09:22:33 compute-1 nova_compute[183083]: 2026-01-26 09:22:33.368 183087 DEBUG oslo_concurrency.processutils [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b77705ce-8f65-4ffd-8131-b8526e3f84be/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:22:33 compute-1 nova_compute[183083]: 2026-01-26 09:22:33.432 183087 DEBUG oslo_concurrency.processutils [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b77705ce-8f65-4ffd-8131-b8526e3f84be/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:22:33 compute-1 nova_compute[183083]: 2026-01-26 09:22:33.434 183087 DEBUG nova.virt.disk.api [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Cannot resize image /var/lib/nova/instances/b77705ce-8f65-4ffd-8131-b8526e3f84be/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 26 09:22:33 compute-1 nova_compute[183083]: 2026-01-26 09:22:33.434 183087 DEBUG nova.objects.instance [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Lazy-loading 'migration_context' on Instance uuid b77705ce-8f65-4ffd-8131-b8526e3f84be obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 09:22:33 compute-1 nova_compute[183083]: 2026-01-26 09:22:33.478 183087 DEBUG nova.virt.libvirt.driver [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 26 09:22:33 compute-1 nova_compute[183083]: 2026-01-26 09:22:33.479 183087 DEBUG nova.virt.libvirt.driver [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Ensure instance console log exists: /var/lib/nova/instances/b77705ce-8f65-4ffd-8131-b8526e3f84be/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 26 09:22:33 compute-1 nova_compute[183083]: 2026-01-26 09:22:33.479 183087 DEBUG oslo_concurrency.lockutils [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:22:33 compute-1 nova_compute[183083]: 2026-01-26 09:22:33.480 183087 DEBUG oslo_concurrency.lockutils [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:22:33 compute-1 nova_compute[183083]: 2026-01-26 09:22:33.480 183087 DEBUG oslo_concurrency.lockutils [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:22:34 compute-1 nova_compute[183083]: 2026-01-26 09:22:34.260 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:34 compute-1 nova_compute[183083]: 2026-01-26 09:22:34.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:22:34 compute-1 nova_compute[183083]: 2026-01-26 09:22:34.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:22:34 compute-1 nova_compute[183083]: 2026-01-26 09:22:34.953 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:22:34 compute-1 nova_compute[183083]: 2026-01-26 09:22:34.953 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 26 09:22:34 compute-1 nova_compute[183083]: 2026-01-26 09:22:34.979 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 26 09:22:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:22:35.482 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=40, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=39) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:22:35 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:22:35.483 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:22:35 compute-1 nova_compute[183083]: 2026-01-26 09:22:35.533 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:35 compute-1 nova_compute[183083]: 2026-01-26 09:22:35.978 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:22:35 compute-1 nova_compute[183083]: 2026-01-26 09:22:35.978 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:22:35 compute-1 nova_compute[183083]: 2026-01-26 09:22:35.999 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:36 compute-1 nova_compute[183083]: 2026-01-26 09:22:36.418 183087 DEBUG nova.network.neutron [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Successfully created port: aeb66cb7-241c-4d27-ba90-6748efb274ac _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 09:22:36 compute-1 podman[229854]: 2026-01-26 09:22:36.795630611 +0000 UTC m=+0.058933045 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 26 09:22:37 compute-1 nova_compute[183083]: 2026-01-26 09:22:37.982 183087 DEBUG nova.network.neutron [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Successfully updated port: aeb66cb7-241c-4d27-ba90-6748efb274ac _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 09:22:38 compute-1 nova_compute[183083]: 2026-01-26 09:22:38.005 183087 DEBUG oslo_concurrency.lockutils [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Acquiring lock "refresh_cache-b77705ce-8f65-4ffd-8131-b8526e3f84be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:22:38 compute-1 nova_compute[183083]: 2026-01-26 09:22:38.005 183087 DEBUG oslo_concurrency.lockutils [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Acquired lock "refresh_cache-b77705ce-8f65-4ffd-8131-b8526e3f84be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:22:38 compute-1 nova_compute[183083]: 2026-01-26 09:22:38.006 183087 DEBUG nova.network.neutron [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 09:22:38 compute-1 nova_compute[183083]: 2026-01-26 09:22:38.119 183087 DEBUG nova.compute.manager [req-6c300faa-da1d-4ec6-a11c-787bf99eb4e6 req-65d99f6e-226b-4329-b5c0-41c3d1563ce4 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Received event network-changed-aeb66cb7-241c-4d27-ba90-6748efb274ac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:22:38 compute-1 nova_compute[183083]: 2026-01-26 09:22:38.120 183087 DEBUG nova.compute.manager [req-6c300faa-da1d-4ec6-a11c-787bf99eb4e6 req-65d99f6e-226b-4329-b5c0-41c3d1563ce4 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Refreshing instance network info cache due to event network-changed-aeb66cb7-241c-4d27-ba90-6748efb274ac. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 09:22:38 compute-1 nova_compute[183083]: 2026-01-26 09:22:38.120 183087 DEBUG oslo_concurrency.lockutils [req-6c300faa-da1d-4ec6-a11c-787bf99eb4e6 req-65d99f6e-226b-4329-b5c0-41c3d1563ce4 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-b77705ce-8f65-4ffd-8131-b8526e3f84be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:22:38 compute-1 nova_compute[183083]: 2026-01-26 09:22:38.187 183087 DEBUG nova.network.neutron [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 09:22:39 compute-1 nova_compute[183083]: 2026-01-26 09:22:39.308 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:39 compute-1 nova_compute[183083]: 2026-01-26 09:22:39.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:22:39 compute-1 nova_compute[183083]: 2026-01-26 09:22:39.951 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.341 183087 DEBUG nova.network.neutron [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Updating instance_info_cache with network_info: [{"id": "aeb66cb7-241c-4d27-ba90-6748efb274ac", "address": "fa:16:3e:1a:f9:ee", "network": {"id": "310f9139-4565-46ad-b618-cb4acd1f9f3c", "bridge": "br-int", "label": "tempest-test-network--1642810949", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "930885dc59f549af8c627aca841fd798", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaeb66cb7-24", "ovs_interfaceid": "aeb66cb7-241c-4d27-ba90-6748efb274ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:22:40 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:22:40.485 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '40'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.587 183087 DEBUG oslo_concurrency.lockutils [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Releasing lock "refresh_cache-b77705ce-8f65-4ffd-8131-b8526e3f84be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.587 183087 DEBUG nova.compute.manager [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Instance network_info: |[{"id": "aeb66cb7-241c-4d27-ba90-6748efb274ac", "address": "fa:16:3e:1a:f9:ee", "network": {"id": "310f9139-4565-46ad-b618-cb4acd1f9f3c", "bridge": "br-int", "label": "tempest-test-network--1642810949", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "930885dc59f549af8c627aca841fd798", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaeb66cb7-24", "ovs_interfaceid": "aeb66cb7-241c-4d27-ba90-6748efb274ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.588 183087 DEBUG oslo_concurrency.lockutils [req-6c300faa-da1d-4ec6-a11c-787bf99eb4e6 req-65d99f6e-226b-4329-b5c0-41c3d1563ce4 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-b77705ce-8f65-4ffd-8131-b8526e3f84be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.589 183087 DEBUG nova.network.neutron [req-6c300faa-da1d-4ec6-a11c-787bf99eb4e6 req-65d99f6e-226b-4329-b5c0-41c3d1563ce4 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Refreshing network info cache for port aeb66cb7-241c-4d27-ba90-6748efb274ac _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.595 183087 DEBUG nova.virt.libvirt.driver [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Start _get_guest_xml network_info=[{"id": "aeb66cb7-241c-4d27-ba90-6748efb274ac", "address": "fa:16:3e:1a:f9:ee", "network": {"id": "310f9139-4565-46ad-b618-cb4acd1f9f3c", "bridge": "br-int", "label": "tempest-test-network--1642810949", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "930885dc59f549af8c627aca841fd798", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaeb66cb7-24", "ovs_interfaceid": "aeb66cb7-241c-4d27-ba90-6748efb274ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T08:43:05Z,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aa88264c487a43b2855a58eb0dd042c9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T08:43:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encryption_options': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.602 183087 WARNING nova.virt.libvirt.driver [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.607 183087 DEBUG nova.virt.libvirt.host [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.607 183087 DEBUG nova.virt.libvirt.host [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.610 183087 DEBUG nova.virt.libvirt.host [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.611 183087 DEBUG nova.virt.libvirt.host [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.611 183087 DEBUG nova.virt.libvirt.driver [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.611 183087 DEBUG nova.virt.hardware [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T08:43:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a7017323-cd35-470d-a718-faa6c6e97277',id=2,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T08:43:05Z,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aa88264c487a43b2855a58eb0dd042c9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T08:43:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.612 183087 DEBUG nova.virt.hardware [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.612 183087 DEBUG nova.virt.hardware [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.612 183087 DEBUG nova.virt.hardware [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.612 183087 DEBUG nova.virt.hardware [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.613 183087 DEBUG nova.virt.hardware [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.613 183087 DEBUG nova.virt.hardware [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.613 183087 DEBUG nova.virt.hardware [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.613 183087 DEBUG nova.virt.hardware [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.613 183087 DEBUG nova.virt.hardware [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.614 183087 DEBUG nova.virt.hardware [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.617 183087 DEBUG nova.virt.libvirt.vif [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T09:22:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1442720338',display_name='tempest-server-test-1442720338',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1442720338',id=61,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ+Um6JtlVrQsgbR1BDAUGP54MrmURZbG6XORR8dNN8m2jKCrMlVoDJjy1qom71CGwRnSEvfSwUYH2J/WE90y+AM7LbCl2siZ6JwO9g4XvPTE7ptzSM3k34WQsNo2WW+Eg==',key_name='tempest-keypair-test-1271434047',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='930885dc59f549af8c627aca841fd798',ramdisk_id='',reservation_id='r-0g1lyhfl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-OvnDbsMonitoringTest-1125627309',owner_user_name='tempest-OvnDbsMonitoringTest-1125627309-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T09:22:33Z,user_data=None,user_id='984261c8d6c2480eb7dcce1e8474cecf',uuid=b77705ce-8f65-4ffd-8131-b8526e3f84be,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "aeb66cb7-241c-4d27-ba90-6748efb274ac", "address": "fa:16:3e:1a:f9:ee", "network": {"id": "310f9139-4565-46ad-b618-cb4acd1f9f3c", "bridge": "br-int", "label": "tempest-test-network--1642810949", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "930885dc59f549af8c627aca841fd798", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaeb66cb7-24", "ovs_interfaceid": "aeb66cb7-241c-4d27-ba90-6748efb274ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.618 183087 DEBUG nova.network.os_vif_util [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Converting VIF {"id": "aeb66cb7-241c-4d27-ba90-6748efb274ac", "address": "fa:16:3e:1a:f9:ee", "network": {"id": "310f9139-4565-46ad-b618-cb4acd1f9f3c", "bridge": "br-int", "label": "tempest-test-network--1642810949", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "930885dc59f549af8c627aca841fd798", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaeb66cb7-24", "ovs_interfaceid": "aeb66cb7-241c-4d27-ba90-6748efb274ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.618 183087 DEBUG nova.network.os_vif_util [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1a:f9:ee,bridge_name='br-int',has_traffic_filtering=True,id=aeb66cb7-241c-4d27-ba90-6748efb274ac,network=Network(310f9139-4565-46ad-b618-cb4acd1f9f3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaeb66cb7-24') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.619 183087 DEBUG nova.objects.instance [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Lazy-loading 'pci_devices' on Instance uuid b77705ce-8f65-4ffd-8131-b8526e3f84be obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.679 183087 DEBUG nova.virt.libvirt.driver [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] End _get_guest_xml xml=<domain type="kvm">
Jan 26 09:22:40 compute-1 nova_compute[183083]:   <uuid>b77705ce-8f65-4ffd-8131-b8526e3f84be</uuid>
Jan 26 09:22:40 compute-1 nova_compute[183083]:   <name>instance-0000003d</name>
Jan 26 09:22:40 compute-1 nova_compute[183083]:   <memory>131072</memory>
Jan 26 09:22:40 compute-1 nova_compute[183083]:   <vcpu>1</vcpu>
Jan 26 09:22:40 compute-1 nova_compute[183083]:   <metadata>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 09:22:40 compute-1 nova_compute[183083]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:       <nova:name>tempest-server-test-1442720338</nova:name>
Jan 26 09:22:40 compute-1 nova_compute[183083]:       <nova:creationTime>2026-01-26 09:22:40</nova:creationTime>
Jan 26 09:22:40 compute-1 nova_compute[183083]:       <nova:flavor name="m1.nano">
Jan 26 09:22:40 compute-1 nova_compute[183083]:         <nova:memory>128</nova:memory>
Jan 26 09:22:40 compute-1 nova_compute[183083]:         <nova:disk>1</nova:disk>
Jan 26 09:22:40 compute-1 nova_compute[183083]:         <nova:swap>0</nova:swap>
Jan 26 09:22:40 compute-1 nova_compute[183083]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 09:22:40 compute-1 nova_compute[183083]:         <nova:vcpus>1</nova:vcpus>
Jan 26 09:22:40 compute-1 nova_compute[183083]:       </nova:flavor>
Jan 26 09:22:40 compute-1 nova_compute[183083]:       <nova:owner>
Jan 26 09:22:40 compute-1 nova_compute[183083]:         <nova:user uuid="984261c8d6c2480eb7dcce1e8474cecf">tempest-OvnDbsMonitoringTest-1125627309-project-member</nova:user>
Jan 26 09:22:40 compute-1 nova_compute[183083]:         <nova:project uuid="930885dc59f549af8c627aca841fd798">tempest-OvnDbsMonitoringTest-1125627309</nova:project>
Jan 26 09:22:40 compute-1 nova_compute[183083]:       </nova:owner>
Jan 26 09:22:40 compute-1 nova_compute[183083]:       <nova:root type="image" uuid="13d1a20a-8003-4f19-aba7-ccbd9eff9b82"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:       <nova:ports>
Jan 26 09:22:40 compute-1 nova_compute[183083]:         <nova:port uuid="aeb66cb7-241c-4d27-ba90-6748efb274ac">
Jan 26 09:22:40 compute-1 nova_compute[183083]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:         </nova:port>
Jan 26 09:22:40 compute-1 nova_compute[183083]:       </nova:ports>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     </nova:instance>
Jan 26 09:22:40 compute-1 nova_compute[183083]:   </metadata>
Jan 26 09:22:40 compute-1 nova_compute[183083]:   <sysinfo type="smbios">
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <system>
Jan 26 09:22:40 compute-1 nova_compute[183083]:       <entry name="manufacturer">RDO</entry>
Jan 26 09:22:40 compute-1 nova_compute[183083]:       <entry name="product">OpenStack Compute</entry>
Jan 26 09:22:40 compute-1 nova_compute[183083]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 09:22:40 compute-1 nova_compute[183083]:       <entry name="serial">b77705ce-8f65-4ffd-8131-b8526e3f84be</entry>
Jan 26 09:22:40 compute-1 nova_compute[183083]:       <entry name="uuid">b77705ce-8f65-4ffd-8131-b8526e3f84be</entry>
Jan 26 09:22:40 compute-1 nova_compute[183083]:       <entry name="family">Virtual Machine</entry>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     </system>
Jan 26 09:22:40 compute-1 nova_compute[183083]:   </sysinfo>
Jan 26 09:22:40 compute-1 nova_compute[183083]:   <os>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <boot dev="hd"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <smbios mode="sysinfo"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:   </os>
Jan 26 09:22:40 compute-1 nova_compute[183083]:   <features>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <acpi/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <apic/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <vmcoreinfo/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:   </features>
Jan 26 09:22:40 compute-1 nova_compute[183083]:   <clock offset="utc">
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <timer name="hpet" present="no"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:   </clock>
Jan 26 09:22:40 compute-1 nova_compute[183083]:   <cpu mode="host-model" match="exact">
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:   </cpu>
Jan 26 09:22:40 compute-1 nova_compute[183083]:   <devices>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <disk type="file" device="disk">
Jan 26 09:22:40 compute-1 nova_compute[183083]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/b77705ce-8f65-4ffd-8131-b8526e3f84be/disk"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:       <target dev="vda" bus="virtio"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     </disk>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <disk type="file" device="cdrom">
Jan 26 09:22:40 compute-1 nova_compute[183083]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/b77705ce-8f65-4ffd-8131-b8526e3f84be/disk.config"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:       <target dev="sda" bus="sata"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     </disk>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <interface type="ethernet">
Jan 26 09:22:40 compute-1 nova_compute[183083]:       <mac address="fa:16:3e:1a:f9:ee"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:       <mtu size="1342"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:       <target dev="tapaeb66cb7-24"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     </interface>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <serial type="pty">
Jan 26 09:22:40 compute-1 nova_compute[183083]:       <log file="/var/lib/nova/instances/b77705ce-8f65-4ffd-8131-b8526e3f84be/console.log" append="off"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     </serial>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <video>
Jan 26 09:22:40 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     </video>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <input type="tablet" bus="usb"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <rng model="virtio">
Jan 26 09:22:40 compute-1 nova_compute[183083]:       <backend model="random">/dev/urandom</backend>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     </rng>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <controller type="usb" index="0"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     <memballoon model="virtio">
Jan 26 09:22:40 compute-1 nova_compute[183083]:       <stats period="10"/>
Jan 26 09:22:40 compute-1 nova_compute[183083]:     </memballoon>
Jan 26 09:22:40 compute-1 nova_compute[183083]:   </devices>
Jan 26 09:22:40 compute-1 nova_compute[183083]: </domain>
Jan 26 09:22:40 compute-1 nova_compute[183083]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.680 183087 DEBUG nova.compute.manager [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Preparing to wait for external event network-vif-plugged-aeb66cb7-241c-4d27-ba90-6748efb274ac prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.681 183087 DEBUG oslo_concurrency.lockutils [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Acquiring lock "b77705ce-8f65-4ffd-8131-b8526e3f84be-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.681 183087 DEBUG oslo_concurrency.lockutils [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Lock "b77705ce-8f65-4ffd-8131-b8526e3f84be-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.681 183087 DEBUG oslo_concurrency.lockutils [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Lock "b77705ce-8f65-4ffd-8131-b8526e3f84be-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.682 183087 DEBUG nova.virt.libvirt.vif [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T09:22:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1442720338',display_name='tempest-server-test-1442720338',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1442720338',id=61,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ+Um6JtlVrQsgbR1BDAUGP54MrmURZbG6XORR8dNN8m2jKCrMlVoDJjy1qom71CGwRnSEvfSwUYH2J/WE90y+AM7LbCl2siZ6JwO9g4XvPTE7ptzSM3k34WQsNo2WW+Eg==',key_name='tempest-keypair-test-1271434047',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='930885dc59f549af8c627aca841fd798',ramdisk_id='',reservation_id='r-0g1lyhfl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_
rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-OvnDbsMonitoringTest-1125627309',owner_user_name='tempest-OvnDbsMonitoringTest-1125627309-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T09:22:33Z,user_data=None,user_id='984261c8d6c2480eb7dcce1e8474cecf',uuid=b77705ce-8f65-4ffd-8131-b8526e3f84be,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "aeb66cb7-241c-4d27-ba90-6748efb274ac", "address": "fa:16:3e:1a:f9:ee", "network": {"id": "310f9139-4565-46ad-b618-cb4acd1f9f3c", "bridge": "br-int", "label": "tempest-test-network--1642810949", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "930885dc59f549af8c627aca841fd798", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaeb66cb7-24", "ovs_interfaceid": "aeb66cb7-241c-4d27-ba90-6748efb274ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.683 183087 DEBUG nova.network.os_vif_util [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Converting VIF {"id": "aeb66cb7-241c-4d27-ba90-6748efb274ac", "address": "fa:16:3e:1a:f9:ee", "network": {"id": "310f9139-4565-46ad-b618-cb4acd1f9f3c", "bridge": "br-int", "label": "tempest-test-network--1642810949", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "930885dc59f549af8c627aca841fd798", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaeb66cb7-24", "ovs_interfaceid": "aeb66cb7-241c-4d27-ba90-6748efb274ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.683 183087 DEBUG nova.network.os_vif_util [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1a:f9:ee,bridge_name='br-int',has_traffic_filtering=True,id=aeb66cb7-241c-4d27-ba90-6748efb274ac,network=Network(310f9139-4565-46ad-b618-cb4acd1f9f3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaeb66cb7-24') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.684 183087 DEBUG os_vif [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1a:f9:ee,bridge_name='br-int',has_traffic_filtering=True,id=aeb66cb7-241c-4d27-ba90-6748efb274ac,network=Network(310f9139-4565-46ad-b618-cb4acd1f9f3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaeb66cb7-24') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.684 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.685 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.685 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.687 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.688 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaeb66cb7-24, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.688 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapaeb66cb7-24, col_values=(('external_ids', {'iface-id': 'aeb66cb7-241c-4d27-ba90-6748efb274ac', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1a:f9:ee', 'vm-uuid': 'b77705ce-8f65-4ffd-8131-b8526e3f84be'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.741 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:40 compute-1 NetworkManager[55451]: <info>  [1769419360.7420] manager: (tapaeb66cb7-24): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/113)
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.744 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.750 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.752 183087 INFO os_vif [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1a:f9:ee,bridge_name='br-int',has_traffic_filtering=True,id=aeb66cb7-241c-4d27-ba90-6748efb274ac,network=Network(310f9139-4565-46ad-b618-cb4acd1f9f3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaeb66cb7-24')
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.807 183087 DEBUG nova.virt.libvirt.driver [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.807 183087 DEBUG nova.virt.libvirt.driver [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.808 183087 DEBUG nova.virt.libvirt.driver [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] No VIF found with MAC fa:16:3e:1a:f9:ee, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.808 183087 INFO nova.virt.libvirt.driver [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Using config drive
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:22:40 compute-1 nova_compute[183083]: 2026-01-26 09:22:40.951 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 26 09:22:41 compute-1 nova_compute[183083]: 2026-01-26 09:22:41.478 183087 INFO nova.virt.libvirt.driver [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Creating config drive at /var/lib/nova/instances/b77705ce-8f65-4ffd-8131-b8526e3f84be/disk.config
Jan 26 09:22:41 compute-1 nova_compute[183083]: 2026-01-26 09:22:41.483 183087 DEBUG oslo_concurrency.processutils [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b77705ce-8f65-4ffd-8131-b8526e3f84be/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmf9x0y7g execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:22:41 compute-1 nova_compute[183083]: 2026-01-26 09:22:41.614 183087 DEBUG oslo_concurrency.processutils [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b77705ce-8f65-4ffd-8131-b8526e3f84be/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmf9x0y7g" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:22:41 compute-1 kernel: tapaeb66cb7-24: entered promiscuous mode
Jan 26 09:22:41 compute-1 NetworkManager[55451]: <info>  [1769419361.6955] manager: (tapaeb66cb7-24): new Tun device (/org/freedesktop/NetworkManager/Devices/114)
Jan 26 09:22:41 compute-1 ovn_controller[95352]: 2026-01-26T09:22:41Z|00357|binding|INFO|Claiming lport aeb66cb7-241c-4d27-ba90-6748efb274ac for this chassis.
Jan 26 09:22:41 compute-1 nova_compute[183083]: 2026-01-26 09:22:41.697 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:41 compute-1 ovn_controller[95352]: 2026-01-26T09:22:41Z|00358|binding|INFO|aeb66cb7-241c-4d27-ba90-6748efb274ac: Claiming fa:16:3e:1a:f9:ee 10.100.0.14
Jan 26 09:22:41 compute-1 nova_compute[183083]: 2026-01-26 09:22:41.701 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:22:41.709 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1a:f9:ee 10.100.0.14'], port_security=['fa:16:3e:1a:f9:ee 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'b77705ce-8f65-4ffd-8131-b8526e3f84be', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-310f9139-4565-46ad-b618-cb4acd1f9f3c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '930885dc59f549af8c627aca841fd798', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ff8e018e-900b-48af-95b1-b6806534a763', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=39936771-7c60-4fc8-9420-d512af5f578a, chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=aeb66cb7-241c-4d27-ba90-6748efb274ac) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:22:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:22:41.710 104632 INFO neutron.agent.ovn.metadata.agent [-] Port aeb66cb7-241c-4d27-ba90-6748efb274ac in datapath 310f9139-4565-46ad-b618-cb4acd1f9f3c bound to our chassis
Jan 26 09:22:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:22:41.710 104632 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 310f9139-4565-46ad-b618-cb4acd1f9f3c
Jan 26 09:22:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:22:41.730 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[15071b3c-fee5-409f-ab5b-18d653fdbbe5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:22:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:22:41.731 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap310f9139-41 in ovnmeta-310f9139-4565-46ad-b618-cb4acd1f9f3c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 26 09:22:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:22:41.735 212483 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap310f9139-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 26 09:22:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:22:41.735 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[08d232b1-ee28-4f45-aac8-32a04e9badf2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:22:41 compute-1 systemd-udevd[229899]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 09:22:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:22:41.736 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[a8985c44-4a7d-419c-b3ce-a5e82e3ee48f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:22:41 compute-1 nova_compute[183083]: 2026-01-26 09:22:41.740 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:41 compute-1 systemd-machined[154360]: New machine qemu-21-instance-0000003d.
Jan 26 09:22:41 compute-1 ovn_controller[95352]: 2026-01-26T09:22:41Z|00359|binding|INFO|Setting lport aeb66cb7-241c-4d27-ba90-6748efb274ac ovn-installed in OVS
Jan 26 09:22:41 compute-1 ovn_controller[95352]: 2026-01-26T09:22:41Z|00360|binding|INFO|Setting lport aeb66cb7-241c-4d27-ba90-6748efb274ac up in Southbound
Jan 26 09:22:41 compute-1 nova_compute[183083]: 2026-01-26 09:22:41.745 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:41 compute-1 NetworkManager[55451]: <info>  [1769419361.7552] device (tapaeb66cb7-24): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 09:22:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:22:41.754 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[e07af32b-58f8-44b6-8cef-d89f5c669513]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:22:41 compute-1 NetworkManager[55451]: <info>  [1769419361.7569] device (tapaeb66cb7-24): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 09:22:41 compute-1 systemd[1]: Started Virtual Machine qemu-21-instance-0000003d.
Jan 26 09:22:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:22:41.770 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[38609274-8660-4c35-b4c2-4cfbc5122b2f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:22:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:22:41.803 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[3be5869b-e3db-4e8f-999a-1c4911d369fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:22:41 compute-1 NetworkManager[55451]: <info>  [1769419361.8101] manager: (tap310f9139-40): new Veth device (/org/freedesktop/NetworkManager/Devices/115)
Jan 26 09:22:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:22:41.809 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[69a14336-11d4-43eb-8e5f-048774cad98d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:22:41 compute-1 systemd-udevd[229902]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 09:22:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:22:41.843 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[5adc8bf6-a186-4027-a598-76899a634215]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:22:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:22:41.848 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[d9090851-89a6-4975-afc7-d28affee5d0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:22:41 compute-1 NetworkManager[55451]: <info>  [1769419361.8731] device (tap310f9139-40): carrier: link connected
Jan 26 09:22:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:22:41.877 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[9bb06fa9-cf32-4891-91bf-2917f37fe1d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:22:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:22:41.894 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[91146b34-8bca-4b31-b7d3-7b46636b5745]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap310f9139-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9c:72:61'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 570247, 'reachable_time': 29727, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229931, 'error': None, 'target': 'ovnmeta-310f9139-4565-46ad-b618-cb4acd1f9f3c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:22:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:22:41.912 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[6d59d6c2-0a70-4176-b6fc-f245db3ff358]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9c:7261'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 570247, 'tstamp': 570247}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229932, 'error': None, 'target': 'ovnmeta-310f9139-4565-46ad-b618-cb4acd1f9f3c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:22:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:22:41.932 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[5f595ccc-1f1c-45c2-9f62-3c0cfe12e563]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap310f9139-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9c:72:61'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 570247, 'reachable_time': 29727, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229933, 'error': None, 'target': 'ovnmeta-310f9139-4565-46ad-b618-cb4acd1f9f3c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:22:41 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:22:41.973 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[359493cd-8880-422b-9fbc-76699df88b54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:22:42.050 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[457ecb32-0959-4dc7-a770-3b008df2563d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:22:42.052 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap310f9139-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:22:42.053 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:22:42.054 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap310f9139-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:22:42 compute-1 nova_compute[183083]: 2026-01-26 09:22:42.056 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:42 compute-1 NetworkManager[55451]: <info>  [1769419362.0569] manager: (tap310f9139-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/116)
Jan 26 09:22:42 compute-1 kernel: tap310f9139-40: entered promiscuous mode
Jan 26 09:22:42 compute-1 nova_compute[183083]: 2026-01-26 09:22:42.062 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:22:42.062 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap310f9139-40, col_values=(('external_ids', {'iface-id': '0cfd3107-192b-4bf1-bd6b-b6c39ad8e38a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:22:42 compute-1 ovn_controller[95352]: 2026-01-26T09:22:42Z|00361|binding|INFO|Releasing lport 0cfd3107-192b-4bf1-bd6b-b6c39ad8e38a from this chassis (sb_readonly=0)
Jan 26 09:22:42 compute-1 nova_compute[183083]: 2026-01-26 09:22:42.076 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:22:42.077 104632 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/310f9139-4565-46ad-b618-cb4acd1f9f3c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/310f9139-4565-46ad-b618-cb4acd1f9f3c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:22:42.078 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[b12d6308-2747-4a64-a015-89e0474c51b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:22:42.080 104632 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]: global
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]:     log         /dev/log local0 debug
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]:     log-tag     haproxy-metadata-proxy-310f9139-4565-46ad-b618-cb4acd1f9f3c
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]:     user        root
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]:     group       root
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]:     maxconn     1024
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]:     pidfile     /var/lib/neutron/external/pids/310f9139-4565-46ad-b618-cb4acd1f9f3c.pid.haproxy
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]:     daemon
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]: 
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]: defaults
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]:     log global
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]:     mode http
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]:     option httplog
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]:     option dontlognull
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]:     option http-server-close
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]:     option forwardfor
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]:     retries                 3
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]:     timeout http-request    30s
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]:     timeout connect         30s
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]:     timeout client          32s
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]:     timeout server          32s
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]:     timeout http-keep-alive 30s
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]: 
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]: 
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]: listen listener
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]:     bind 169.254.169.254:80
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]:     http-request add-header X-OVN-Network-ID 310f9139-4565-46ad-b618-cb4acd1f9f3c
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 26 09:22:42 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:22:42.081 104632 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-310f9139-4565-46ad-b618-cb4acd1f9f3c', 'env', 'PROCESS_TAG=haproxy-310f9139-4565-46ad-b618-cb4acd1f9f3c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/310f9139-4565-46ad-b618-cb4acd1f9f3c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 26 09:22:42 compute-1 podman[229965]: 2026-01-26 09:22:42.475968849 +0000 UTC m=+0.063434143 container create 8a7456bc70b58a65f6d5185da7574be902787e56d6b0f0c07f35b4708dcde5f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-310f9139-4565-46ad-b618-cb4acd1f9f3c, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:22:42 compute-1 systemd[1]: Started libpod-conmon-8a7456bc70b58a65f6d5185da7574be902787e56d6b0f0c07f35b4708dcde5f3.scope.
Jan 26 09:22:42 compute-1 podman[229965]: 2026-01-26 09:22:42.441124695 +0000 UTC m=+0.028590009 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 09:22:42 compute-1 nova_compute[183083]: 2026-01-26 09:22:42.540 183087 DEBUG nova.compute.manager [req-b708dd34-3654-4afe-b531-250856c1bc94 req-ae4c57d7-46ed-49bd-995b-d1ddf0536f23 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Received event network-vif-plugged-aeb66cb7-241c-4d27-ba90-6748efb274ac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:22:42 compute-1 nova_compute[183083]: 2026-01-26 09:22:42.541 183087 DEBUG oslo_concurrency.lockutils [req-b708dd34-3654-4afe-b531-250856c1bc94 req-ae4c57d7-46ed-49bd-995b-d1ddf0536f23 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "b77705ce-8f65-4ffd-8131-b8526e3f84be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:22:42 compute-1 nova_compute[183083]: 2026-01-26 09:22:42.541 183087 DEBUG oslo_concurrency.lockutils [req-b708dd34-3654-4afe-b531-250856c1bc94 req-ae4c57d7-46ed-49bd-995b-d1ddf0536f23 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "b77705ce-8f65-4ffd-8131-b8526e3f84be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:22:42 compute-1 nova_compute[183083]: 2026-01-26 09:22:42.541 183087 DEBUG oslo_concurrency.lockutils [req-b708dd34-3654-4afe-b531-250856c1bc94 req-ae4c57d7-46ed-49bd-995b-d1ddf0536f23 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "b77705ce-8f65-4ffd-8131-b8526e3f84be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:22:42 compute-1 nova_compute[183083]: 2026-01-26 09:22:42.542 183087 DEBUG nova.compute.manager [req-b708dd34-3654-4afe-b531-250856c1bc94 req-ae4c57d7-46ed-49bd-995b-d1ddf0536f23 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Processing event network-vif-plugged-aeb66cb7-241c-4d27-ba90-6748efb274ac _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 26 09:22:42 compute-1 systemd[1]: Started libcrun container.
Jan 26 09:22:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c4951b92474dea6adfad017fd07d8a71f94cd918815599455f358d9f1a7981a3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 09:22:42 compute-1 podman[229965]: 2026-01-26 09:22:42.588946129 +0000 UTC m=+0.176411433 container init 8a7456bc70b58a65f6d5185da7574be902787e56d6b0f0c07f35b4708dcde5f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-310f9139-4565-46ad-b618-cb4acd1f9f3c, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 26 09:22:42 compute-1 podman[229965]: 2026-01-26 09:22:42.595557956 +0000 UTC m=+0.183023240 container start 8a7456bc70b58a65f6d5185da7574be902787e56d6b0f0c07f35b4708dcde5f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-310f9139-4565-46ad-b618-cb4acd1f9f3c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 26 09:22:42 compute-1 neutron-haproxy-ovnmeta-310f9139-4565-46ad-b618-cb4acd1f9f3c[229980]: [NOTICE]   (229984) : New worker (229986) forked
Jan 26 09:22:42 compute-1 neutron-haproxy-ovnmeta-310f9139-4565-46ad-b618-cb4acd1f9f3c[229980]: [NOTICE]   (229984) : Loading success.
Jan 26 09:22:43 compute-1 nova_compute[183083]: 2026-01-26 09:22:43.053 183087 DEBUG nova.compute.manager [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 26 09:22:43 compute-1 nova_compute[183083]: 2026-01-26 09:22:43.054 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769419363.0525389, b77705ce-8f65-4ffd-8131-b8526e3f84be => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 09:22:43 compute-1 nova_compute[183083]: 2026-01-26 09:22:43.055 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] VM Started (Lifecycle Event)
Jan 26 09:22:43 compute-1 nova_compute[183083]: 2026-01-26 09:22:43.061 183087 DEBUG nova.virt.libvirt.driver [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 26 09:22:43 compute-1 nova_compute[183083]: 2026-01-26 09:22:43.066 183087 INFO nova.virt.libvirt.driver [-] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Instance spawned successfully.
Jan 26 09:22:43 compute-1 nova_compute[183083]: 2026-01-26 09:22:43.067 183087 DEBUG nova.virt.libvirt.driver [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 26 09:22:43 compute-1 nova_compute[183083]: 2026-01-26 09:22:43.075 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:22:43 compute-1 nova_compute[183083]: 2026-01-26 09:22:43.079 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 09:22:43 compute-1 nova_compute[183083]: 2026-01-26 09:22:43.089 183087 DEBUG nova.virt.libvirt.driver [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 09:22:43 compute-1 nova_compute[183083]: 2026-01-26 09:22:43.089 183087 DEBUG nova.virt.libvirt.driver [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 09:22:43 compute-1 nova_compute[183083]: 2026-01-26 09:22:43.089 183087 DEBUG nova.virt.libvirt.driver [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 09:22:43 compute-1 nova_compute[183083]: 2026-01-26 09:22:43.090 183087 DEBUG nova.virt.libvirt.driver [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 09:22:43 compute-1 nova_compute[183083]: 2026-01-26 09:22:43.090 183087 DEBUG nova.virt.libvirt.driver [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 09:22:43 compute-1 nova_compute[183083]: 2026-01-26 09:22:43.090 183087 DEBUG nova.virt.libvirt.driver [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 09:22:43 compute-1 nova_compute[183083]: 2026-01-26 09:22:43.096 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 09:22:43 compute-1 nova_compute[183083]: 2026-01-26 09:22:43.096 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769419363.0527525, b77705ce-8f65-4ffd-8131-b8526e3f84be => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 09:22:43 compute-1 nova_compute[183083]: 2026-01-26 09:22:43.096 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] VM Paused (Lifecycle Event)
Jan 26 09:22:43 compute-1 nova_compute[183083]: 2026-01-26 09:22:43.120 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:22:43 compute-1 nova_compute[183083]: 2026-01-26 09:22:43.122 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769419363.0599265, b77705ce-8f65-4ffd-8131-b8526e3f84be => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 09:22:43 compute-1 nova_compute[183083]: 2026-01-26 09:22:43.123 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] VM Resumed (Lifecycle Event)
Jan 26 09:22:43 compute-1 nova_compute[183083]: 2026-01-26 09:22:43.152 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:22:43 compute-1 nova_compute[183083]: 2026-01-26 09:22:43.157 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 09:22:43 compute-1 nova_compute[183083]: 2026-01-26 09:22:43.163 183087 INFO nova.compute.manager [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Took 10.09 seconds to spawn the instance on the hypervisor.
Jan 26 09:22:43 compute-1 nova_compute[183083]: 2026-01-26 09:22:43.163 183087 DEBUG nova.compute.manager [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:22:43 compute-1 nova_compute[183083]: 2026-01-26 09:22:43.186 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 09:22:43 compute-1 nova_compute[183083]: 2026-01-26 09:22:43.225 183087 INFO nova.compute.manager [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Took 10.63 seconds to build instance.
Jan 26 09:22:43 compute-1 nova_compute[183083]: 2026-01-26 09:22:43.251 183087 DEBUG oslo_concurrency.lockutils [None req-6c293b79-2daf-4452-b611-65fcfce73f7a 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Lock "b77705ce-8f65-4ffd-8131-b8526e3f84be" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.850s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:22:43 compute-1 nova_compute[183083]: 2026-01-26 09:22:43.507 183087 DEBUG nova.network.neutron [req-6c300faa-da1d-4ec6-a11c-787bf99eb4e6 req-65d99f6e-226b-4329-b5c0-41c3d1563ce4 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Updated VIF entry in instance network info cache for port aeb66cb7-241c-4d27-ba90-6748efb274ac. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 09:22:43 compute-1 nova_compute[183083]: 2026-01-26 09:22:43.508 183087 DEBUG nova.network.neutron [req-6c300faa-da1d-4ec6-a11c-787bf99eb4e6 req-65d99f6e-226b-4329-b5c0-41c3d1563ce4 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Updating instance_info_cache with network_info: [{"id": "aeb66cb7-241c-4d27-ba90-6748efb274ac", "address": "fa:16:3e:1a:f9:ee", "network": {"id": "310f9139-4565-46ad-b618-cb4acd1f9f3c", "bridge": "br-int", "label": "tempest-test-network--1642810949", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "930885dc59f549af8c627aca841fd798", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaeb66cb7-24", "ovs_interfaceid": "aeb66cb7-241c-4d27-ba90-6748efb274ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:22:43 compute-1 nova_compute[183083]: 2026-01-26 09:22:43.532 183087 DEBUG oslo_concurrency.lockutils [req-6c300faa-da1d-4ec6-a11c-787bf99eb4e6 req-65d99f6e-226b-4329-b5c0-41c3d1563ce4 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-b77705ce-8f65-4ffd-8131-b8526e3f84be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:22:43 compute-1 nova_compute[183083]: 2026-01-26 09:22:43.890 183087 INFO nova.compute.manager [None req-5aaa5e9b-bd9a-40bb-baca-d130c683afd3 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Get console output
Jan 26 09:22:43 compute-1 nova_compute[183083]: 2026-01-26 09:22:43.896 212375 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 26 09:22:44 compute-1 nova_compute[183083]: 2026-01-26 09:22:44.311 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:44 compute-1 nova_compute[183083]: 2026-01-26 09:22:44.610 183087 DEBUG nova.compute.manager [req-4cb588b1-a2b8-44cc-b2e7-44cbd3ce2431 req-82651a8e-8b51-4e09-9159-e5ca06ac7e91 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Received event network-vif-plugged-aeb66cb7-241c-4d27-ba90-6748efb274ac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:22:44 compute-1 nova_compute[183083]: 2026-01-26 09:22:44.611 183087 DEBUG oslo_concurrency.lockutils [req-4cb588b1-a2b8-44cc-b2e7-44cbd3ce2431 req-82651a8e-8b51-4e09-9159-e5ca06ac7e91 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "b77705ce-8f65-4ffd-8131-b8526e3f84be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:22:44 compute-1 nova_compute[183083]: 2026-01-26 09:22:44.611 183087 DEBUG oslo_concurrency.lockutils [req-4cb588b1-a2b8-44cc-b2e7-44cbd3ce2431 req-82651a8e-8b51-4e09-9159-e5ca06ac7e91 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "b77705ce-8f65-4ffd-8131-b8526e3f84be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:22:44 compute-1 nova_compute[183083]: 2026-01-26 09:22:44.612 183087 DEBUG oslo_concurrency.lockutils [req-4cb588b1-a2b8-44cc-b2e7-44cbd3ce2431 req-82651a8e-8b51-4e09-9159-e5ca06ac7e91 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "b77705ce-8f65-4ffd-8131-b8526e3f84be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:22:44 compute-1 nova_compute[183083]: 2026-01-26 09:22:44.612 183087 DEBUG nova.compute.manager [req-4cb588b1-a2b8-44cc-b2e7-44cbd3ce2431 req-82651a8e-8b51-4e09-9159-e5ca06ac7e91 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] No waiting events found dispatching network-vif-plugged-aeb66cb7-241c-4d27-ba90-6748efb274ac pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 09:22:44 compute-1 nova_compute[183083]: 2026-01-26 09:22:44.612 183087 WARNING nova.compute.manager [req-4cb588b1-a2b8-44cc-b2e7-44cbd3ce2431 req-82651a8e-8b51-4e09-9159-e5ca06ac7e91 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Received unexpected event network-vif-plugged-aeb66cb7-241c-4d27-ba90-6748efb274ac for instance with vm_state active and task_state None.
Jan 26 09:22:45 compute-1 nova_compute[183083]: 2026-01-26 09:22:45.742 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:46 compute-1 nova_compute[183083]: 2026-01-26 09:22:46.979 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:22:47 compute-1 nova_compute[183083]: 2026-01-26 09:22:47.003 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:22:47 compute-1 nova_compute[183083]: 2026-01-26 09:22:47.003 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:22:47 compute-1 nova_compute[183083]: 2026-01-26 09:22:47.003 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:22:47 compute-1 nova_compute[183083]: 2026-01-26 09:22:47.004 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:22:47 compute-1 nova_compute[183083]: 2026-01-26 09:22:47.085 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b77705ce-8f65-4ffd-8131-b8526e3f84be/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:22:47 compute-1 nova_compute[183083]: 2026-01-26 09:22:47.148 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b77705ce-8f65-4ffd-8131-b8526e3f84be/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:22:47 compute-1 nova_compute[183083]: 2026-01-26 09:22:47.150 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b77705ce-8f65-4ffd-8131-b8526e3f84be/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:22:47 compute-1 nova_compute[183083]: 2026-01-26 09:22:47.247 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b77705ce-8f65-4ffd-8131-b8526e3f84be/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:22:47 compute-1 nova_compute[183083]: 2026-01-26 09:22:47.425 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:22:47 compute-1 nova_compute[183083]: 2026-01-26 09:22:47.427 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13479MB free_disk=113.0828971862793GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:22:47 compute-1 nova_compute[183083]: 2026-01-26 09:22:47.427 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:22:47 compute-1 nova_compute[183083]: 2026-01-26 09:22:47.428 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:22:47 compute-1 nova_compute[183083]: 2026-01-26 09:22:47.583 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Instance b77705ce-8f65-4ffd-8131-b8526e3f84be actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 09:22:47 compute-1 nova_compute[183083]: 2026-01-26 09:22:47.584 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:22:47 compute-1 nova_compute[183083]: 2026-01-26 09:22:47.585 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=640MB phys_disk=119GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:22:47 compute-1 nova_compute[183083]: 2026-01-26 09:22:47.782 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:22:47 compute-1 nova_compute[183083]: 2026-01-26 09:22:47.804 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:22:47 compute-1 nova_compute[183083]: 2026-01-26 09:22:47.838 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:22:47 compute-1 nova_compute[183083]: 2026-01-26 09:22:47.839 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.411s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:22:49 compute-1 nova_compute[183083]: 2026-01-26 09:22:49.314 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:49 compute-1 nova_compute[183083]: 2026-01-26 09:22:49.358 183087 INFO nova.compute.manager [None req-197823f7-f718-4ed3-8012-a0813776938b 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Get console output
Jan 26 09:22:50 compute-1 nova_compute[183083]: 2026-01-26 09:22:50.792 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:51 compute-1 sshd-session[230009]: Connection closed by authenticating user root 178.62.249.31 port 58736 [preauth]
Jan 26 09:22:53 compute-1 podman[230015]: 2026-01-26 09:22:53.82199576 +0000 UTC m=+0.073215628 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 26 09:22:53 compute-1 podman[230016]: 2026-01-26 09:22:53.829952665 +0000 UTC m=+0.072081896 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 09:22:53 compute-1 podman[230013]: 2026-01-26 09:22:53.829986846 +0000 UTC m=+0.087404939 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 26 09:22:53 compute-1 podman[230014]: 2026-01-26 09:22:53.837525219 +0000 UTC m=+0.082713227 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base 
Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, architecture=x86_64, io.openshift.tags=minimal rhel9)
Jan 26 09:22:53 compute-1 podman[230012]: 2026-01-26 09:22:53.86447269 +0000 UTC m=+0.123924741 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 09:22:54 compute-1 nova_compute[183083]: 2026-01-26 09:22:54.318 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:54 compute-1 nova_compute[183083]: 2026-01-26 09:22:54.570 183087 INFO nova.compute.manager [None req-7d7bc9ac-e6a4-4abe-93df-4014c1881c6b 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Get console output
Jan 26 09:22:54 compute-1 ovn_controller[95352]: 2026-01-26T09:22:54Z|00049|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1a:f9:ee 10.100.0.14
Jan 26 09:22:54 compute-1 ovn_controller[95352]: 2026-01-26T09:22:54Z|00050|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1a:f9:ee 10.100.0.14
Jan 26 09:22:55 compute-1 nova_compute[183083]: 2026-01-26 09:22:55.794 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:55 compute-1 nova_compute[183083]: 2026-01-26 09:22:55.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:22:59 compute-1 nova_compute[183083]: 2026-01-26 09:22:59.320 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:22:59 compute-1 nova_compute[183083]: 2026-01-26 09:22:59.706 183087 INFO nova.compute.manager [None req-b6eb94ce-3a06-4375-b041-cab377542729 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Get console output
Jan 26 09:23:00 compute-1 nova_compute[183083]: 2026-01-26 09:23:00.141 183087 DEBUG oslo_concurrency.lockutils [None req-a4e407bc-7e26-4e13-bf91-b277b513e92d 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Acquiring lock "b77705ce-8f65-4ffd-8131-b8526e3f84be" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:23:00 compute-1 nova_compute[183083]: 2026-01-26 09:23:00.142 183087 DEBUG oslo_concurrency.lockutils [None req-a4e407bc-7e26-4e13-bf91-b277b513e92d 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Lock "b77705ce-8f65-4ffd-8131-b8526e3f84be" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:23:00 compute-1 nova_compute[183083]: 2026-01-26 09:23:00.142 183087 DEBUG oslo_concurrency.lockutils [None req-a4e407bc-7e26-4e13-bf91-b277b513e92d 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Acquiring lock "b77705ce-8f65-4ffd-8131-b8526e3f84be-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:23:00 compute-1 nova_compute[183083]: 2026-01-26 09:23:00.143 183087 DEBUG oslo_concurrency.lockutils [None req-a4e407bc-7e26-4e13-bf91-b277b513e92d 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Lock "b77705ce-8f65-4ffd-8131-b8526e3f84be-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:23:00 compute-1 nova_compute[183083]: 2026-01-26 09:23:00.144 183087 DEBUG oslo_concurrency.lockutils [None req-a4e407bc-7e26-4e13-bf91-b277b513e92d 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Lock "b77705ce-8f65-4ffd-8131-b8526e3f84be-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:23:00 compute-1 nova_compute[183083]: 2026-01-26 09:23:00.146 183087 INFO nova.compute.manager [None req-a4e407bc-7e26-4e13-bf91-b277b513e92d 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Terminating instance
Jan 26 09:23:00 compute-1 nova_compute[183083]: 2026-01-26 09:23:00.148 183087 DEBUG nova.compute.manager [None req-a4e407bc-7e26-4e13-bf91-b277b513e92d 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 09:23:00 compute-1 kernel: tapaeb66cb7-24 (unregistering): left promiscuous mode
Jan 26 09:23:00 compute-1 NetworkManager[55451]: <info>  [1769419380.1734] device (tapaeb66cb7-24): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 09:23:00 compute-1 nova_compute[183083]: 2026-01-26 09:23:00.184 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:23:00 compute-1 ovn_controller[95352]: 2026-01-26T09:23:00Z|00362|binding|INFO|Releasing lport aeb66cb7-241c-4d27-ba90-6748efb274ac from this chassis (sb_readonly=0)
Jan 26 09:23:00 compute-1 ovn_controller[95352]: 2026-01-26T09:23:00Z|00363|binding|INFO|Setting lport aeb66cb7-241c-4d27-ba90-6748efb274ac down in Southbound
Jan 26 09:23:00 compute-1 ovn_controller[95352]: 2026-01-26T09:23:00Z|00364|binding|INFO|Removing iface tapaeb66cb7-24 ovn-installed in OVS
Jan 26 09:23:00 compute-1 nova_compute[183083]: 2026-01-26 09:23:00.186 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:23:00 compute-1 nova_compute[183083]: 2026-01-26 09:23:00.208 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:23:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:23:00.213 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1a:f9:ee 10.100.0.14'], port_security=['fa:16:3e:1a:f9:ee 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'b77705ce-8f65-4ffd-8131-b8526e3f84be', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-310f9139-4565-46ad-b618-cb4acd1f9f3c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '930885dc59f549af8c627aca841fd798', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ff8e018e-900b-48af-95b1-b6806534a763', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=39936771-7c60-4fc8-9420-d512af5f578a, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=aeb66cb7-241c-4d27-ba90-6748efb274ac) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:23:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:23:00.214 104632 INFO neutron.agent.ovn.metadata.agent [-] Port aeb66cb7-241c-4d27-ba90-6748efb274ac in datapath 310f9139-4565-46ad-b618-cb4acd1f9f3c unbound from our chassis
Jan 26 09:23:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:23:00.215 104632 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 310f9139-4565-46ad-b618-cb4acd1f9f3c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 26 09:23:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:23:00.216 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[72e0c288-792f-4932-b8b5-a79948135492]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:23:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:23:00.217 104632 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-310f9139-4565-46ad-b618-cb4acd1f9f3c namespace which is not needed anymore
Jan 26 09:23:00 compute-1 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000003d.scope: Deactivated successfully.
Jan 26 09:23:00 compute-1 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000003d.scope: Consumed 13.288s CPU time.
Jan 26 09:23:00 compute-1 systemd-machined[154360]: Machine qemu-21-instance-0000003d terminated.
Jan 26 09:23:00 compute-1 nova_compute[183083]: 2026-01-26 09:23:00.412 183087 INFO nova.virt.libvirt.driver [-] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Instance destroyed successfully.
Jan 26 09:23:00 compute-1 nova_compute[183083]: 2026-01-26 09:23:00.412 183087 DEBUG nova.objects.instance [None req-a4e407bc-7e26-4e13-bf91-b277b513e92d 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Lazy-loading 'resources' on Instance uuid b77705ce-8f65-4ffd-8131-b8526e3f84be obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 09:23:00 compute-1 neutron-haproxy-ovnmeta-310f9139-4565-46ad-b618-cb4acd1f9f3c[229980]: [NOTICE]   (229984) : haproxy version is 2.8.14-c23fe91
Jan 26 09:23:00 compute-1 neutron-haproxy-ovnmeta-310f9139-4565-46ad-b618-cb4acd1f9f3c[229980]: [NOTICE]   (229984) : path to executable is /usr/sbin/haproxy
Jan 26 09:23:00 compute-1 neutron-haproxy-ovnmeta-310f9139-4565-46ad-b618-cb4acd1f9f3c[229980]: [WARNING]  (229984) : Exiting Master process...
Jan 26 09:23:00 compute-1 neutron-haproxy-ovnmeta-310f9139-4565-46ad-b618-cb4acd1f9f3c[229980]: [WARNING]  (229984) : Exiting Master process...
Jan 26 09:23:00 compute-1 neutron-haproxy-ovnmeta-310f9139-4565-46ad-b618-cb4acd1f9f3c[229980]: [ALERT]    (229984) : Current worker (229986) exited with code 143 (Terminated)
Jan 26 09:23:00 compute-1 neutron-haproxy-ovnmeta-310f9139-4565-46ad-b618-cb4acd1f9f3c[229980]: [WARNING]  (229984) : All workers exited. Exiting... (0)
Jan 26 09:23:00 compute-1 systemd[1]: libpod-8a7456bc70b58a65f6d5185da7574be902787e56d6b0f0c07f35b4708dcde5f3.scope: Deactivated successfully.
Jan 26 09:23:00 compute-1 nova_compute[183083]: 2026-01-26 09:23:00.431 183087 DEBUG nova.virt.libvirt.vif [None req-a4e407bc-7e26-4e13-bf91-b277b513e92d 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T09:22:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1442720338',display_name='tempest-server-test-1442720338',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1442720338',id=61,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ+Um6JtlVrQsgbR1BDAUGP54MrmURZbG6XORR8dNN8m2jKCrMlVoDJjy1qom71CGwRnSEvfSwUYH2J/WE90y+AM7LbCl2siZ6JwO9g4XvPTE7ptzSM3k34WQsNo2WW+Eg==',key_name='tempest-keypair-test-1271434047',keypairs=<?>,launch_index=0,launched_at=2026-01-26T09:22:43Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='930885dc59f549af8c627aca841fd798',ramdisk_id='',reservation_id='r-0g1lyhfl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-OvnDbsMonitoringTest-1125627309',owner_user_name='tempest-OvnDbsMonitoringTest-1125627309-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T09:22:43Z,user_data=None,user_id='984261c8d6c2480eb7dcce1e8474cecf',uuid=b77705ce-8f65-4ffd-8131-b8526e3f84be,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "aeb66cb7-241c-4d27-ba90-6748efb274ac", "address": "fa:16:3e:1a:f9:ee", "network": {"id": "310f9139-4565-46ad-b618-cb4acd1f9f3c", "bridge": "br-int", "label": "tempest-test-network--1642810949", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "930885dc59f549af8c627aca841fd798", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaeb66cb7-24", "ovs_interfaceid": "aeb66cb7-241c-4d27-ba90-6748efb274ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 09:23:00 compute-1 nova_compute[183083]: 2026-01-26 09:23:00.431 183087 DEBUG nova.network.os_vif_util [None req-a4e407bc-7e26-4e13-bf91-b277b513e92d 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Converting VIF {"id": "aeb66cb7-241c-4d27-ba90-6748efb274ac", "address": "fa:16:3e:1a:f9:ee", "network": {"id": "310f9139-4565-46ad-b618-cb4acd1f9f3c", "bridge": "br-int", "label": "tempest-test-network--1642810949", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "930885dc59f549af8c627aca841fd798", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaeb66cb7-24", "ovs_interfaceid": "aeb66cb7-241c-4d27-ba90-6748efb274ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 09:23:00 compute-1 nova_compute[183083]: 2026-01-26 09:23:00.432 183087 DEBUG nova.network.os_vif_util [None req-a4e407bc-7e26-4e13-bf91-b277b513e92d 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1a:f9:ee,bridge_name='br-int',has_traffic_filtering=True,id=aeb66cb7-241c-4d27-ba90-6748efb274ac,network=Network(310f9139-4565-46ad-b618-cb4acd1f9f3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaeb66cb7-24') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 09:23:00 compute-1 nova_compute[183083]: 2026-01-26 09:23:00.432 183087 DEBUG os_vif [None req-a4e407bc-7e26-4e13-bf91-b277b513e92d 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1a:f9:ee,bridge_name='br-int',has_traffic_filtering=True,id=aeb66cb7-241c-4d27-ba90-6748efb274ac,network=Network(310f9139-4565-46ad-b618-cb4acd1f9f3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaeb66cb7-24') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 09:23:00 compute-1 nova_compute[183083]: 2026-01-26 09:23:00.434 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:23:00 compute-1 nova_compute[183083]: 2026-01-26 09:23:00.434 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaeb66cb7-24, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:23:00 compute-1 nova_compute[183083]: 2026-01-26 09:23:00.435 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:23:00 compute-1 podman[230149]: 2026-01-26 09:23:00.43689778 +0000 UTC m=+0.117203210 container died 8a7456bc70b58a65f6d5185da7574be902787e56d6b0f0c07f35b4708dcde5f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-310f9139-4565-46ad-b618-cb4acd1f9f3c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 26 09:23:00 compute-1 nova_compute[183083]: 2026-01-26 09:23:00.436 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:23:00 compute-1 nova_compute[183083]: 2026-01-26 09:23:00.438 183087 INFO os_vif [None req-a4e407bc-7e26-4e13-bf91-b277b513e92d 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1a:f9:ee,bridge_name='br-int',has_traffic_filtering=True,id=aeb66cb7-241c-4d27-ba90-6748efb274ac,network=Network(310f9139-4565-46ad-b618-cb4acd1f9f3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaeb66cb7-24')
Jan 26 09:23:00 compute-1 nova_compute[183083]: 2026-01-26 09:23:00.439 183087 INFO nova.virt.libvirt.driver [None req-a4e407bc-7e26-4e13-bf91-b277b513e92d 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Deleting instance files /var/lib/nova/instances/b77705ce-8f65-4ffd-8131-b8526e3f84be_del
Jan 26 09:23:00 compute-1 nova_compute[183083]: 2026-01-26 09:23:00.439 183087 INFO nova.virt.libvirt.driver [None req-a4e407bc-7e26-4e13-bf91-b277b513e92d 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Deletion of /var/lib/nova/instances/b77705ce-8f65-4ffd-8131-b8526e3f84be_del complete
Jan 26 09:23:00 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8a7456bc70b58a65f6d5185da7574be902787e56d6b0f0c07f35b4708dcde5f3-userdata-shm.mount: Deactivated successfully.
Jan 26 09:23:00 compute-1 systemd[1]: var-lib-containers-storage-overlay-c4951b92474dea6adfad017fd07d8a71f94cd918815599455f358d9f1a7981a3-merged.mount: Deactivated successfully.
Jan 26 09:23:00 compute-1 podman[230149]: 2026-01-26 09:23:00.56576122 +0000 UTC m=+0.246066640 container cleanup 8a7456bc70b58a65f6d5185da7574be902787e56d6b0f0c07f35b4708dcde5f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-310f9139-4565-46ad-b618-cb4acd1f9f3c, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 26 09:23:00 compute-1 nova_compute[183083]: 2026-01-26 09:23:00.566 183087 INFO nova.compute.manager [None req-a4e407bc-7e26-4e13-bf91-b277b513e92d 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Took 0.42 seconds to destroy the instance on the hypervisor.
Jan 26 09:23:00 compute-1 nova_compute[183083]: 2026-01-26 09:23:00.566 183087 DEBUG oslo.service.loopingcall [None req-a4e407bc-7e26-4e13-bf91-b277b513e92d 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 26 09:23:00 compute-1 nova_compute[183083]: 2026-01-26 09:23:00.566 183087 DEBUG nova.compute.manager [-] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 09:23:00 compute-1 nova_compute[183083]: 2026-01-26 09:23:00.567 183087 DEBUG nova.network.neutron [-] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 09:23:00 compute-1 systemd[1]: libpod-conmon-8a7456bc70b58a65f6d5185da7574be902787e56d6b0f0c07f35b4708dcde5f3.scope: Deactivated successfully.
Jan 26 09:23:00 compute-1 nova_compute[183083]: 2026-01-26 09:23:00.702 183087 DEBUG nova.compute.manager [req-1de8ab85-26ef-4077-a3d2-554db0c1da5a req-220e3dcc-c8c1-40a1-adc6-466a0a3688ca 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Received event network-vif-unplugged-aeb66cb7-241c-4d27-ba90-6748efb274ac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:23:00 compute-1 nova_compute[183083]: 2026-01-26 09:23:00.702 183087 DEBUG oslo_concurrency.lockutils [req-1de8ab85-26ef-4077-a3d2-554db0c1da5a req-220e3dcc-c8c1-40a1-adc6-466a0a3688ca 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "b77705ce-8f65-4ffd-8131-b8526e3f84be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:23:00 compute-1 nova_compute[183083]: 2026-01-26 09:23:00.703 183087 DEBUG oslo_concurrency.lockutils [req-1de8ab85-26ef-4077-a3d2-554db0c1da5a req-220e3dcc-c8c1-40a1-adc6-466a0a3688ca 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "b77705ce-8f65-4ffd-8131-b8526e3f84be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:23:00 compute-1 nova_compute[183083]: 2026-01-26 09:23:00.703 183087 DEBUG oslo_concurrency.lockutils [req-1de8ab85-26ef-4077-a3d2-554db0c1da5a req-220e3dcc-c8c1-40a1-adc6-466a0a3688ca 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "b77705ce-8f65-4ffd-8131-b8526e3f84be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:23:00 compute-1 nova_compute[183083]: 2026-01-26 09:23:00.703 183087 DEBUG nova.compute.manager [req-1de8ab85-26ef-4077-a3d2-554db0c1da5a req-220e3dcc-c8c1-40a1-adc6-466a0a3688ca 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] No waiting events found dispatching network-vif-unplugged-aeb66cb7-241c-4d27-ba90-6748efb274ac pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 09:23:00 compute-1 nova_compute[183083]: 2026-01-26 09:23:00.704 183087 DEBUG nova.compute.manager [req-1de8ab85-26ef-4077-a3d2-554db0c1da5a req-220e3dcc-c8c1-40a1-adc6-466a0a3688ca 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Received event network-vif-unplugged-aeb66cb7-241c-4d27-ba90-6748efb274ac for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 26 09:23:00 compute-1 podman[230196]: 2026-01-26 09:23:00.721907519 +0000 UTC m=+0.134177830 container remove 8a7456bc70b58a65f6d5185da7574be902787e56d6b0f0c07f35b4708dcde5f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-310f9139-4565-46ad-b618-cb4acd1f9f3c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 26 09:23:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:23:00.727 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[aa308ddb-7842-4c52-bc08-656b906c7918]: (4, ('Mon Jan 26 09:23:00 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-310f9139-4565-46ad-b618-cb4acd1f9f3c (8a7456bc70b58a65f6d5185da7574be902787e56d6b0f0c07f35b4708dcde5f3)\n8a7456bc70b58a65f6d5185da7574be902787e56d6b0f0c07f35b4708dcde5f3\nMon Jan 26 09:23:00 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-310f9139-4565-46ad-b618-cb4acd1f9f3c (8a7456bc70b58a65f6d5185da7574be902787e56d6b0f0c07f35b4708dcde5f3)\n8a7456bc70b58a65f6d5185da7574be902787e56d6b0f0c07f35b4708dcde5f3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:23:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:23:00.729 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[f287294a-b265-436b-b592-17dc81f4210a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:23:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:23:00.731 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap310f9139-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:23:00 compute-1 kernel: tap310f9139-40: left promiscuous mode
Jan 26 09:23:00 compute-1 nova_compute[183083]: 2026-01-26 09:23:00.736 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:23:00 compute-1 nova_compute[183083]: 2026-01-26 09:23:00.747 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:23:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:23:00.749 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[1d7cde9b-4984-4654-b1ac-9454e341ce77]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:23:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:23:00.765 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[6fe1438d-b7fc-4f6d-8ac5-55682c032474]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:23:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:23:00.767 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[90f0dd65-9b07-484d-ae22-fee7025e798e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:23:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:23:00.785 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[a2167f11-9b98-4f03-b9d4-7d6d1ee30354]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 570240, 'reachable_time': 42733, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230211, 'error': None, 'target': 'ovnmeta-310f9139-4565-46ad-b618-cb4acd1f9f3c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:23:00 compute-1 systemd[1]: run-netns-ovnmeta\x2d310f9139\x2d4565\x2d46ad\x2db618\x2dcb4acd1f9f3c.mount: Deactivated successfully.
Jan 26 09:23:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:23:00.789 105024 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-310f9139-4565-46ad-b618-cb4acd1f9f3c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 26 09:23:00 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:23:00.790 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[35d25a7c-66b9-459e-b594-7dad67175f8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:23:01 compute-1 nova_compute[183083]: 2026-01-26 09:23:01.456 183087 DEBUG nova.network.neutron [-] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:23:01 compute-1 nova_compute[183083]: 2026-01-26 09:23:01.481 183087 INFO nova.compute.manager [-] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Took 0.91 seconds to deallocate network for instance.
Jan 26 09:23:01 compute-1 anacron[29975]: Job `cron.monthly' started
Jan 26 09:23:01 compute-1 anacron[29975]: Job `cron.monthly' terminated
Jan 26 09:23:01 compute-1 anacron[29975]: Normal exit (3 jobs run)
Jan 26 09:23:01 compute-1 nova_compute[183083]: 2026-01-26 09:23:01.639 183087 DEBUG nova.compute.manager [req-677c98ab-c187-4fdc-9f8c-5d38484b70c0 req-4b87ce90-6f39-4de5-9479-937fab8f995c 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Received event network-vif-deleted-aeb66cb7-241c-4d27-ba90-6748efb274ac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:23:01 compute-1 nova_compute[183083]: 2026-01-26 09:23:01.641 183087 DEBUG oslo_concurrency.lockutils [None req-a4e407bc-7e26-4e13-bf91-b277b513e92d 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:23:01 compute-1 nova_compute[183083]: 2026-01-26 09:23:01.641 183087 DEBUG oslo_concurrency.lockutils [None req-a4e407bc-7e26-4e13-bf91-b277b513e92d 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:23:01 compute-1 nova_compute[183083]: 2026-01-26 09:23:01.717 183087 DEBUG nova.compute.provider_tree [None req-a4e407bc-7e26-4e13-bf91-b277b513e92d 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:23:01 compute-1 nova_compute[183083]: 2026-01-26 09:23:01.733 183087 DEBUG nova.scheduler.client.report [None req-a4e407bc-7e26-4e13-bf91-b277b513e92d 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:23:01 compute-1 nova_compute[183083]: 2026-01-26 09:23:01.766 183087 DEBUG oslo_concurrency.lockutils [None req-a4e407bc-7e26-4e13-bf91-b277b513e92d 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:23:01 compute-1 nova_compute[183083]: 2026-01-26 09:23:01.803 183087 INFO nova.scheduler.client.report [None req-a4e407bc-7e26-4e13-bf91-b277b513e92d 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Deleted allocations for instance b77705ce-8f65-4ffd-8131-b8526e3f84be
Jan 26 09:23:01 compute-1 nova_compute[183083]: 2026-01-26 09:23:01.888 183087 DEBUG oslo_concurrency.lockutils [None req-a4e407bc-7e26-4e13-bf91-b277b513e92d 984261c8d6c2480eb7dcce1e8474cecf 930885dc59f549af8c627aca841fd798 - - default default] Lock "b77705ce-8f65-4ffd-8131-b8526e3f84be" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.746s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:23:02 compute-1 nova_compute[183083]: 2026-01-26 09:23:02.796 183087 DEBUG nova.compute.manager [req-d2a8a550-ea4a-4245-8ab8-b3146b6e7b91 req-00f5a51a-d9fd-4be5-b42b-f21a756d691e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Received event network-vif-plugged-aeb66cb7-241c-4d27-ba90-6748efb274ac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:23:02 compute-1 nova_compute[183083]: 2026-01-26 09:23:02.797 183087 DEBUG oslo_concurrency.lockutils [req-d2a8a550-ea4a-4245-8ab8-b3146b6e7b91 req-00f5a51a-d9fd-4be5-b42b-f21a756d691e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "b77705ce-8f65-4ffd-8131-b8526e3f84be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:23:02 compute-1 nova_compute[183083]: 2026-01-26 09:23:02.797 183087 DEBUG oslo_concurrency.lockutils [req-d2a8a550-ea4a-4245-8ab8-b3146b6e7b91 req-00f5a51a-d9fd-4be5-b42b-f21a756d691e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "b77705ce-8f65-4ffd-8131-b8526e3f84be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:23:02 compute-1 nova_compute[183083]: 2026-01-26 09:23:02.797 183087 DEBUG oslo_concurrency.lockutils [req-d2a8a550-ea4a-4245-8ab8-b3146b6e7b91 req-00f5a51a-d9fd-4be5-b42b-f21a756d691e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "b77705ce-8f65-4ffd-8131-b8526e3f84be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:23:02 compute-1 nova_compute[183083]: 2026-01-26 09:23:02.798 183087 DEBUG nova.compute.manager [req-d2a8a550-ea4a-4245-8ab8-b3146b6e7b91 req-00f5a51a-d9fd-4be5-b42b-f21a756d691e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] No waiting events found dispatching network-vif-plugged-aeb66cb7-241c-4d27-ba90-6748efb274ac pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 09:23:02 compute-1 nova_compute[183083]: 2026-01-26 09:23:02.798 183087 WARNING nova.compute.manager [req-d2a8a550-ea4a-4245-8ab8-b3146b6e7b91 req-00f5a51a-d9fd-4be5-b42b-f21a756d691e 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Received unexpected event network-vif-plugged-aeb66cb7-241c-4d27-ba90-6748efb274ac for instance with vm_state deleted and task_state None.
Jan 26 09:23:04 compute-1 ovn_controller[95352]: 2026-01-26T09:23:04Z|00365|pinctrl|WARN|Dropped 299 log messages in last 59 seconds (most recently, 2 seconds ago) due to excessive rate
Jan 26 09:23:04 compute-1 ovn_controller[95352]: 2026-01-26T09:23:04Z|00366|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:23:04 compute-1 nova_compute[183083]: 2026-01-26 09:23:04.322 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:23:04 compute-1 nova_compute[183083]: 2026-01-26 09:23:04.577 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:23:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:23:05.341 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:23:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:23:05.342 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:23:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:23:05.342 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:23:05 compute-1 nova_compute[183083]: 2026-01-26 09:23:05.437 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:23:07 compute-1 podman[230214]: 2026-01-26 09:23:07.782798244 +0000 UTC m=+0.052397050 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 26 09:23:09 compute-1 nova_compute[183083]: 2026-01-26 09:23:09.325 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:23:10 compute-1 nova_compute[183083]: 2026-01-26 09:23:10.481 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:23:14 compute-1 nova_compute[183083]: 2026-01-26 09:23:14.327 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:23:15 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:23:15.404 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=41, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=40) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:23:15 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:23:15.405 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:23:15 compute-1 nova_compute[183083]: 2026-01-26 09:23:15.405 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:23:15 compute-1 nova_compute[183083]: 2026-01-26 09:23:15.411 183087 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769419380.410347, b77705ce-8f65-4ffd-8131-b8526e3f84be => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 09:23:15 compute-1 nova_compute[183083]: 2026-01-26 09:23:15.411 183087 INFO nova.compute.manager [-] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] VM Stopped (Lifecycle Event)
Jan 26 09:23:15 compute-1 nova_compute[183083]: 2026-01-26 09:23:15.439 183087 DEBUG nova.compute.manager [None req-ff82373f-2907-4b92-bc3f-282b9c58e02a - - - - - -] [instance: b77705ce-8f65-4ffd-8131-b8526e3f84be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:23:15 compute-1 nova_compute[183083]: 2026-01-26 09:23:15.483 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:23:19 compute-1 nova_compute[183083]: 2026-01-26 09:23:19.330 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:23:20 compute-1 nova_compute[183083]: 2026-01-26 09:23:20.485 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:23:24 compute-1 nova_compute[183083]: 2026-01-26 09:23:24.331 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:23:24 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:23:24.407 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '41'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:23:24 compute-1 sshd-session[230239]: Accepted publickey for zuul from 38.102.83.66 port 51010 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:23:24 compute-1 systemd-logind[788]: New session 142 of user zuul.
Jan 26 09:23:24 compute-1 systemd[1]: Started Session 142 of User zuul.
Jan 26 09:23:24 compute-1 podman[230244]: 2026-01-26 09:23:24.838904353 +0000 UTC m=+0.071402637 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 26 09:23:24 compute-1 sshd-session[230239]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:23:24 compute-1 podman[230243]: 2026-01-26 09:23:24.870971979 +0000 UTC m=+0.103396481 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.buildah.version=1.33.7, vendor=Red Hat, Inc., name=ubi9-minimal, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, version=9.6, distribution-scope=public, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 26 09:23:24 compute-1 podman[230242]: 2026-01-26 09:23:24.87347952 +0000 UTC m=+0.119987249 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Jan 26 09:23:24 compute-1 podman[230255]: 2026-01-26 09:23:24.878257435 +0000 UTC m=+0.108375002 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 09:23:24 compute-1 podman[230241]: 2026-01-26 09:23:24.879691015 +0000 UTC m=+0.128418847 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 26 09:23:25 compute-1 sshd-session[230335]: Connection closed by 38.102.83.66 port 51010
Jan 26 09:23:25 compute-1 sshd-session[230239]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:23:25 compute-1 systemd[1]: session-142.scope: Deactivated successfully.
Jan 26 09:23:25 compute-1 systemd-logind[788]: Session 142 logged out. Waiting for processes to exit.
Jan 26 09:23:25 compute-1 systemd-logind[788]: Removed session 142.
Jan 26 09:23:25 compute-1 nova_compute[183083]: 2026-01-26 09:23:25.488 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:23:29 compute-1 nova_compute[183083]: 2026-01-26 09:23:29.350 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:23:30 compute-1 nova_compute[183083]: 2026-01-26 09:23:30.491 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:23:30 compute-1 nova_compute[183083]: 2026-01-26 09:23:30.963 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:23:30 compute-1 nova_compute[183083]: 2026-01-26 09:23:30.964 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:23:30 compute-1 nova_compute[183083]: 2026-01-26 09:23:30.964 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:23:30 compute-1 nova_compute[183083]: 2026-01-26 09:23:30.976 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 09:23:32 compute-1 nova_compute[183083]: 2026-01-26 09:23:32.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:23:32 compute-1 nova_compute[183083]: 2026-01-26 09:23:32.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:23:34 compute-1 ovn_controller[95352]: 2026-01-26T09:23:34Z|00367|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Jan 26 09:23:34 compute-1 nova_compute[183083]: 2026-01-26 09:23:34.352 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:23:35 compute-1 nova_compute[183083]: 2026-01-26 09:23:35.554 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:23:35 compute-1 nova_compute[183083]: 2026-01-26 09:23:35.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:23:35 compute-1 nova_compute[183083]: 2026-01-26 09:23:35.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:23:36 compute-1 nova_compute[183083]: 2026-01-26 09:23:36.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:23:36 compute-1 nova_compute[183083]: 2026-01-26 09:23:36.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:23:38 compute-1 podman[230374]: 2026-01-26 09:23:38.817148551 +0000 UTC m=+0.085275010 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 26 09:23:39 compute-1 nova_compute[183083]: 2026-01-26 09:23:39.353 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:23:39 compute-1 sshd-session[230398]: Connection closed by authenticating user root 178.62.249.31 port 50774 [preauth]
Jan 26 09:23:40 compute-1 nova_compute[183083]: 2026-01-26 09:23:40.595 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:23:41 compute-1 nova_compute[183083]: 2026-01-26 09:23:41.947 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:23:41 compute-1 nova_compute[183083]: 2026-01-26 09:23:41.962 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:23:41 compute-1 nova_compute[183083]: 2026-01-26 09:23:41.963 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:23:44 compute-1 nova_compute[183083]: 2026-01-26 09:23:44.356 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:23:45 compute-1 nova_compute[183083]: 2026-01-26 09:23:45.640 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:23:48 compute-1 nova_compute[183083]: 2026-01-26 09:23:48.953 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:23:49 compute-1 sshd-session[230400]: Invalid user sol from 2.57.122.238 port 52876
Jan 26 09:23:49 compute-1 sshd-session[230400]: Connection closed by invalid user sol 2.57.122.238 port 52876 [preauth]
Jan 26 09:23:49 compute-1 nova_compute[183083]: 2026-01-26 09:23:49.242 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:23:49 compute-1 nova_compute[183083]: 2026-01-26 09:23:49.242 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:23:49 compute-1 nova_compute[183083]: 2026-01-26 09:23:49.243 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:23:49 compute-1 nova_compute[183083]: 2026-01-26 09:23:49.243 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:23:49 compute-1 nova_compute[183083]: 2026-01-26 09:23:49.356 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:23:49 compute-1 nova_compute[183083]: 2026-01-26 09:23:49.428 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:23:49 compute-1 nova_compute[183083]: 2026-01-26 09:23:49.430 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13625MB free_disk=113.08377456665039GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:23:49 compute-1 nova_compute[183083]: 2026-01-26 09:23:49.430 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:23:49 compute-1 nova_compute[183083]: 2026-01-26 09:23:49.430 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:23:49 compute-1 nova_compute[183083]: 2026-01-26 09:23:49.734 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:23:49 compute-1 nova_compute[183083]: 2026-01-26 09:23:49.735 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:23:49 compute-1 nova_compute[183083]: 2026-01-26 09:23:49.755 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:23:49 compute-1 nova_compute[183083]: 2026-01-26 09:23:49.826 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:23:49 compute-1 nova_compute[183083]: 2026-01-26 09:23:49.852 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:23:49 compute-1 nova_compute[183083]: 2026-01-26 09:23:49.852 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.422s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:23:50 compute-1 nova_compute[183083]: 2026-01-26 09:23:50.669 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:23:54 compute-1 nova_compute[183083]: 2026-01-26 09:23:54.358 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:23:55 compute-1 nova_compute[183083]: 2026-01-26 09:23:55.672 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:23:55 compute-1 podman[230408]: 2026-01-26 09:23:55.819217084 +0000 UTC m=+0.061533529 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 09:23:55 compute-1 podman[230406]: 2026-01-26 09:23:55.825784469 +0000 UTC m=+0.078579500 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, 
build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, vendor=Red Hat, Inc., container_name=openstack_network_exporter, version=9.6, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-type=git)
Jan 26 09:23:55 compute-1 podman[230405]: 2026-01-26 09:23:55.835545585 +0000 UTC m=+0.092855873 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 09:23:55 compute-1 podman[230407]: 2026-01-26 09:23:55.84173052 +0000 UTC m=+0.091670390 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 26 09:23:55 compute-1 podman[230404]: 2026-01-26 09:23:55.856376713 +0000 UTC m=+0.111908171 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 26 09:23:59 compute-1 nova_compute[183083]: 2026-01-26 09:23:59.408 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:00 compute-1 nova_compute[183083]: 2026-01-26 09:24:00.720 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:24:03.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:24:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:24:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:24:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:24:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:24:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:24:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:24:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:24:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:24:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:24:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:24:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:24:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:24:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:24:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:24:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:24:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:24:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:24:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:24:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:24:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:24:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:24:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:24:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:24:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:24:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:24:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:24:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:24:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:24:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:24:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:24:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:24:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:24:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:24:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:24:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:24:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:24:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:24:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:24:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:24:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:24:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:24:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:24:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:24:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:24:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:24:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:24:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:24:03.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:24:04 compute-1 nova_compute[183083]: 2026-01-26 09:24:04.410 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:04 compute-1 ovn_controller[95352]: 2026-01-26T09:24:04Z|00368|pinctrl|WARN|Dropped 97 log messages in last 61 seconds (most recently, 6 seconds ago) due to excessive rate
Jan 26 09:24:04 compute-1 ovn_controller[95352]: 2026-01-26T09:24:04Z|00369|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:24:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:05.342 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:24:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:05.342 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:24:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:05.343 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:24:05 compute-1 nova_compute[183083]: 2026-01-26 09:24:05.769 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:09 compute-1 nova_compute[183083]: 2026-01-26 09:24:09.458 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:09 compute-1 podman[230511]: 2026-01-26 09:24:09.808207935 +0000 UTC m=+0.059037648 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 26 09:24:10 compute-1 nova_compute[183083]: 2026-01-26 09:24:10.806 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:14 compute-1 nova_compute[183083]: 2026-01-26 09:24:14.459 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:15 compute-1 nova_compute[183083]: 2026-01-26 09:24:15.809 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:15 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:15.934 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=42, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=41) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:24:15 compute-1 nova_compute[183083]: 2026-01-26 09:24:15.934 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:15 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:15.935 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:24:16 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:16.938 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '42'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:24:19 compute-1 nova_compute[183083]: 2026-01-26 09:24:19.461 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:20 compute-1 nova_compute[183083]: 2026-01-26 09:24:20.857 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.156 183087 DEBUG oslo_concurrency.lockutils [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Acquiring lock "8bd1b013-4c5d-417a-ac90-fee845c3d159" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.156 183087 DEBUG oslo_concurrency.lockutils [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Lock "8bd1b013-4c5d-417a-ac90-fee845c3d159" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.177 183087 DEBUG nova.compute.manager [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.243 183087 DEBUG oslo_concurrency.lockutils [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.243 183087 DEBUG oslo_concurrency.lockutils [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.250 183087 DEBUG nova.virt.hardware [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.251 183087 INFO nova.compute.claims [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Claim successful on node compute-1.ctlplane.example.com
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.339 183087 DEBUG nova.compute.provider_tree [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.353 183087 DEBUG nova.scheduler.client.report [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.368 183087 DEBUG oslo_concurrency.lockutils [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.369 183087 DEBUG nova.compute.manager [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.407 183087 DEBUG nova.compute.manager [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.407 183087 DEBUG nova.network.neutron [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.423 183087 INFO nova.virt.libvirt.driver [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.438 183087 DEBUG nova.compute.manager [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.520 183087 DEBUG nova.compute.manager [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.522 183087 DEBUG nova.virt.libvirt.driver [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.522 183087 INFO nova.virt.libvirt.driver [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Creating image(s)
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.523 183087 DEBUG oslo_concurrency.lockutils [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Acquiring lock "/var/lib/nova/instances/8bd1b013-4c5d-417a-ac90-fee845c3d159/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.523 183087 DEBUG oslo_concurrency.lockutils [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Lock "/var/lib/nova/instances/8bd1b013-4c5d-417a-ac90-fee845c3d159/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.524 183087 DEBUG oslo_concurrency.lockutils [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Lock "/var/lib/nova/instances/8bd1b013-4c5d-417a-ac90-fee845c3d159/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.535 183087 DEBUG oslo_concurrency.processutils [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.628 183087 DEBUG oslo_concurrency.processutils [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.630 183087 DEBUG oslo_concurrency.lockutils [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Acquiring lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.630 183087 DEBUG oslo_concurrency.lockutils [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.642 183087 DEBUG oslo_concurrency.processutils [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.735 183087 DEBUG oslo_concurrency.processutils [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.736 183087 DEBUG oslo_concurrency.processutils [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/8bd1b013-4c5d-417a-ac90-fee845c3d159/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.792 183087 DEBUG oslo_concurrency.processutils [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52,backing_fmt=raw /var/lib/nova/instances/8bd1b013-4c5d-417a-ac90-fee845c3d159/disk 1073741824" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.794 183087 DEBUG oslo_concurrency.lockutils [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Lock "ee71f12e6da9d8023ebfba31a1bad603881dbd52" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.794 183087 DEBUG oslo_concurrency.processutils [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.867 183087 DEBUG oslo_concurrency.processutils [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ee71f12e6da9d8023ebfba31a1bad603881dbd52 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.868 183087 DEBUG nova.virt.disk.api [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Checking if we can resize image /var/lib/nova/instances/8bd1b013-4c5d-417a-ac90-fee845c3d159/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.869 183087 DEBUG oslo_concurrency.processutils [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8bd1b013-4c5d-417a-ac90-fee845c3d159/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.933 183087 DEBUG oslo_concurrency.processutils [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8bd1b013-4c5d-417a-ac90-fee845c3d159/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.934 183087 DEBUG nova.virt.disk.api [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Cannot resize image /var/lib/nova/instances/8bd1b013-4c5d-417a-ac90-fee845c3d159/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.935 183087 DEBUG nova.objects.instance [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Lazy-loading 'migration_context' on Instance uuid 8bd1b013-4c5d-417a-ac90-fee845c3d159 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.950 183087 DEBUG nova.virt.libvirt.driver [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.951 183087 DEBUG nova.virt.libvirt.driver [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Ensure instance console log exists: /var/lib/nova/instances/8bd1b013-4c5d-417a-ac90-fee845c3d159/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.951 183087 DEBUG oslo_concurrency.lockutils [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.951 183087 DEBUG oslo_concurrency.lockutils [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:24:21 compute-1 nova_compute[183083]: 2026-01-26 09:24:21.952 183087 DEBUG oslo_concurrency.lockutils [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:24:22 compute-1 nova_compute[183083]: 2026-01-26 09:24:22.477 183087 DEBUG nova.policy [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fea5fdfe49874e138b7b4a69a915e73e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2968fec2ed804be1b674f6246298b67a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 09:24:24 compute-1 nova_compute[183083]: 2026-01-26 09:24:24.127 183087 DEBUG nova.network.neutron [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Successfully created port: 5aa77a60-fcf0-426b-b9e5-2ff2957856e8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 09:24:24 compute-1 nova_compute[183083]: 2026-01-26 09:24:24.463 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:25 compute-1 nova_compute[183083]: 2026-01-26 09:24:25.516 183087 DEBUG nova.network.neutron [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Successfully updated port: 5aa77a60-fcf0-426b-b9e5-2ff2957856e8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 09:24:25 compute-1 nova_compute[183083]: 2026-01-26 09:24:25.531 183087 DEBUG oslo_concurrency.lockutils [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Acquiring lock "refresh_cache-8bd1b013-4c5d-417a-ac90-fee845c3d159" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:24:25 compute-1 nova_compute[183083]: 2026-01-26 09:24:25.531 183087 DEBUG oslo_concurrency.lockutils [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Acquired lock "refresh_cache-8bd1b013-4c5d-417a-ac90-fee845c3d159" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:24:25 compute-1 nova_compute[183083]: 2026-01-26 09:24:25.531 183087 DEBUG nova.network.neutron [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 09:24:25 compute-1 nova_compute[183083]: 2026-01-26 09:24:25.689 183087 DEBUG nova.network.neutron [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 09:24:25 compute-1 nova_compute[183083]: 2026-01-26 09:24:25.910 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:25 compute-1 nova_compute[183083]: 2026-01-26 09:24:25.949 183087 DEBUG nova.compute.manager [req-5c8f2e92-906b-4701-bfd8-cfb93789465d req-667ec792-7aa8-4d4a-9d33-ca256bd66c08 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Received event network-changed-5aa77a60-fcf0-426b-b9e5-2ff2957856e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:24:25 compute-1 nova_compute[183083]: 2026-01-26 09:24:25.950 183087 DEBUG nova.compute.manager [req-5c8f2e92-906b-4701-bfd8-cfb93789465d req-667ec792-7aa8-4d4a-9d33-ca256bd66c08 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Refreshing instance network info cache due to event network-changed-5aa77a60-fcf0-426b-b9e5-2ff2957856e8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 09:24:25 compute-1 nova_compute[183083]: 2026-01-26 09:24:25.950 183087 DEBUG oslo_concurrency.lockutils [req-5c8f2e92-906b-4701-bfd8-cfb93789465d req-667ec792-7aa8-4d4a-9d33-ca256bd66c08 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-8bd1b013-4c5d-417a-ac90-fee845c3d159" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:24:26 compute-1 sshd-session[230550]: Connection closed by authenticating user root 178.62.249.31 port 39624 [preauth]
Jan 26 09:24:26 compute-1 podman[230556]: 2026-01-26 09:24:26.815181137 +0000 UTC m=+0.065531482 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 09:24:26 compute-1 podman[230555]: 2026-01-26 09:24:26.824374706 +0000 UTC m=+0.071966473 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 26 09:24:26 compute-1 podman[230554]: 2026-01-26 09:24:26.831406445 +0000 UTC m=+0.090116666 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 26 09:24:26 compute-1 podman[230552]: 2026-01-26 09:24:26.840562624 +0000 UTC m=+0.100836759 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 26 09:24:26 compute-1 podman[230553]: 2026-01-26 09:24:26.86697647 +0000 UTC m=+0.115461692 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.839 183087 DEBUG nova.network.neutron [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Updating instance_info_cache with network_info: [{"id": "5aa77a60-fcf0-426b-b9e5-2ff2957856e8", "address": "fa:16:3e:92:c3:07", "network": {"id": "df9227db-8d5b-40c3-bc17-cf71e8c5c2d4", "bridge": "br-int", "label": "tempest-test-network--112460421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2968fec2ed804be1b674f6246298b67a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aa77a60-fc", "ovs_interfaceid": "5aa77a60-fcf0-426b-b9e5-2ff2957856e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.859 183087 DEBUG oslo_concurrency.lockutils [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Releasing lock "refresh_cache-8bd1b013-4c5d-417a-ac90-fee845c3d159" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.860 183087 DEBUG nova.compute.manager [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Instance network_info: |[{"id": "5aa77a60-fcf0-426b-b9e5-2ff2957856e8", "address": "fa:16:3e:92:c3:07", "network": {"id": "df9227db-8d5b-40c3-bc17-cf71e8c5c2d4", "bridge": "br-int", "label": "tempest-test-network--112460421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2968fec2ed804be1b674f6246298b67a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aa77a60-fc", "ovs_interfaceid": "5aa77a60-fcf0-426b-b9e5-2ff2957856e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.861 183087 DEBUG oslo_concurrency.lockutils [req-5c8f2e92-906b-4701-bfd8-cfb93789465d req-667ec792-7aa8-4d4a-9d33-ca256bd66c08 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-8bd1b013-4c5d-417a-ac90-fee845c3d159" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.861 183087 DEBUG nova.network.neutron [req-5c8f2e92-906b-4701-bfd8-cfb93789465d req-667ec792-7aa8-4d4a-9d33-ca256bd66c08 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Refreshing network info cache for port 5aa77a60-fcf0-426b-b9e5-2ff2957856e8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.865 183087 DEBUG nova.virt.libvirt.driver [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Start _get_guest_xml network_info=[{"id": "5aa77a60-fcf0-426b-b9e5-2ff2957856e8", "address": "fa:16:3e:92:c3:07", "network": {"id": "df9227db-8d5b-40c3-bc17-cf71e8c5c2d4", "bridge": "br-int", "label": "tempest-test-network--112460421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2968fec2ed804be1b674f6246298b67a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aa77a60-fc", "ovs_interfaceid": "5aa77a60-fcf0-426b-b9e5-2ff2957856e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T08:43:05Z,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aa88264c487a43b2855a58eb0dd042c9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T08:43:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encryption_options': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '13d1a20a-8003-4f19-aba7-ccbd9eff9b82'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.873 183087 WARNING nova.virt.libvirt.driver [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.881 183087 DEBUG nova.virt.libvirt.host [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.882 183087 DEBUG nova.virt.libvirt.host [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.885 183087 DEBUG nova.virt.libvirt.host [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.886 183087 DEBUG nova.virt.libvirt.host [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.886 183087 DEBUG nova.virt.libvirt.driver [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.886 183087 DEBUG nova.virt.hardware [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T08:43:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a7017323-cd35-470d-a718-faa6c6e97277',id=2,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T08:43:05Z,direct_url=<?>,disk_format='qcow2',id=13d1a20a-8003-4f19-aba7-ccbd9eff9b82,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aa88264c487a43b2855a58eb0dd042c9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T08:43:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.887 183087 DEBUG nova.virt.hardware [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.887 183087 DEBUG nova.virt.hardware [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.887 183087 DEBUG nova.virt.hardware [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.887 183087 DEBUG nova.virt.hardware [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.887 183087 DEBUG nova.virt.hardware [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.888 183087 DEBUG nova.virt.hardware [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.888 183087 DEBUG nova.virt.hardware [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.888 183087 DEBUG nova.virt.hardware [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.888 183087 DEBUG nova.virt.hardware [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.888 183087 DEBUG nova.virt.hardware [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.892 183087 DEBUG nova.virt.libvirt.vif [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T09:24:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-591621980',display_name='tempest-server-test-591621980',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-591621980',id=62,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ0In/4xMyfKShlxQSlxjG12nbv2C0HX1XSV6OVVsf7TFR49jvQjBw/toy9vczCc/4qWnGwOBcP7ZM2MCPR8MzRrcEwdBYas24+LrNAh4L4utPPYRBQXISG/+xLpH9Pyrw==',key_name='tempest-keypair-test-314781837',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2968fec2ed804be1b674f6246298b67a',ramdisk_id='',reservation_id='r-xqlu70x3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='vir
tio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-OvnFdbAgingTest-1512062367',owner_user_name='tempest-OvnFdbAgingTest-1512062367-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T09:24:21Z,user_data=None,user_id='fea5fdfe49874e138b7b4a69a915e73e',uuid=8bd1b013-4c5d-417a-ac90-fee845c3d159,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5aa77a60-fcf0-426b-b9e5-2ff2957856e8", "address": "fa:16:3e:92:c3:07", "network": {"id": "df9227db-8d5b-40c3-bc17-cf71e8c5c2d4", "bridge": "br-int", "label": "tempest-test-network--112460421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2968fec2ed804be1b674f6246298b67a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aa77a60-fc", "ovs_interfaceid": "5aa77a60-fcf0-426b-b9e5-2ff2957856e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.893 183087 DEBUG nova.network.os_vif_util [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Converting VIF {"id": "5aa77a60-fcf0-426b-b9e5-2ff2957856e8", "address": "fa:16:3e:92:c3:07", "network": {"id": "df9227db-8d5b-40c3-bc17-cf71e8c5c2d4", "bridge": "br-int", "label": "tempest-test-network--112460421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2968fec2ed804be1b674f6246298b67a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aa77a60-fc", "ovs_interfaceid": "5aa77a60-fcf0-426b-b9e5-2ff2957856e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.893 183087 DEBUG nova.network.os_vif_util [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:92:c3:07,bridge_name='br-int',has_traffic_filtering=True,id=5aa77a60-fcf0-426b-b9e5-2ff2957856e8,network=Network(df9227db-8d5b-40c3-bc17-cf71e8c5c2d4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5aa77a60-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.894 183087 DEBUG nova.objects.instance [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Lazy-loading 'pci_devices' on Instance uuid 8bd1b013-4c5d-417a-ac90-fee845c3d159 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.907 183087 DEBUG nova.virt.libvirt.driver [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] End _get_guest_xml xml=<domain type="kvm">
Jan 26 09:24:27 compute-1 nova_compute[183083]:   <uuid>8bd1b013-4c5d-417a-ac90-fee845c3d159</uuid>
Jan 26 09:24:27 compute-1 nova_compute[183083]:   <name>instance-0000003e</name>
Jan 26 09:24:27 compute-1 nova_compute[183083]:   <memory>131072</memory>
Jan 26 09:24:27 compute-1 nova_compute[183083]:   <vcpu>1</vcpu>
Jan 26 09:24:27 compute-1 nova_compute[183083]:   <metadata>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 09:24:27 compute-1 nova_compute[183083]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:       <nova:name>tempest-server-test-591621980</nova:name>
Jan 26 09:24:27 compute-1 nova_compute[183083]:       <nova:creationTime>2026-01-26 09:24:27</nova:creationTime>
Jan 26 09:24:27 compute-1 nova_compute[183083]:       <nova:flavor name="m1.nano">
Jan 26 09:24:27 compute-1 nova_compute[183083]:         <nova:memory>128</nova:memory>
Jan 26 09:24:27 compute-1 nova_compute[183083]:         <nova:disk>1</nova:disk>
Jan 26 09:24:27 compute-1 nova_compute[183083]:         <nova:swap>0</nova:swap>
Jan 26 09:24:27 compute-1 nova_compute[183083]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 09:24:27 compute-1 nova_compute[183083]:         <nova:vcpus>1</nova:vcpus>
Jan 26 09:24:27 compute-1 nova_compute[183083]:       </nova:flavor>
Jan 26 09:24:27 compute-1 nova_compute[183083]:       <nova:owner>
Jan 26 09:24:27 compute-1 nova_compute[183083]:         <nova:user uuid="fea5fdfe49874e138b7b4a69a915e73e">tempest-OvnFdbAgingTest-1512062367-project-member</nova:user>
Jan 26 09:24:27 compute-1 nova_compute[183083]:         <nova:project uuid="2968fec2ed804be1b674f6246298b67a">tempest-OvnFdbAgingTest-1512062367</nova:project>
Jan 26 09:24:27 compute-1 nova_compute[183083]:       </nova:owner>
Jan 26 09:24:27 compute-1 nova_compute[183083]:       <nova:root type="image" uuid="13d1a20a-8003-4f19-aba7-ccbd9eff9b82"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:       <nova:ports>
Jan 26 09:24:27 compute-1 nova_compute[183083]:         <nova:port uuid="5aa77a60-fcf0-426b-b9e5-2ff2957856e8">
Jan 26 09:24:27 compute-1 nova_compute[183083]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:         </nova:port>
Jan 26 09:24:27 compute-1 nova_compute[183083]:       </nova:ports>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     </nova:instance>
Jan 26 09:24:27 compute-1 nova_compute[183083]:   </metadata>
Jan 26 09:24:27 compute-1 nova_compute[183083]:   <sysinfo type="smbios">
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <system>
Jan 26 09:24:27 compute-1 nova_compute[183083]:       <entry name="manufacturer">RDO</entry>
Jan 26 09:24:27 compute-1 nova_compute[183083]:       <entry name="product">OpenStack Compute</entry>
Jan 26 09:24:27 compute-1 nova_compute[183083]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 09:24:27 compute-1 nova_compute[183083]:       <entry name="serial">8bd1b013-4c5d-417a-ac90-fee845c3d159</entry>
Jan 26 09:24:27 compute-1 nova_compute[183083]:       <entry name="uuid">8bd1b013-4c5d-417a-ac90-fee845c3d159</entry>
Jan 26 09:24:27 compute-1 nova_compute[183083]:       <entry name="family">Virtual Machine</entry>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     </system>
Jan 26 09:24:27 compute-1 nova_compute[183083]:   </sysinfo>
Jan 26 09:24:27 compute-1 nova_compute[183083]:   <os>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <boot dev="hd"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <smbios mode="sysinfo"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:   </os>
Jan 26 09:24:27 compute-1 nova_compute[183083]:   <features>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <acpi/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <apic/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <vmcoreinfo/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:   </features>
Jan 26 09:24:27 compute-1 nova_compute[183083]:   <clock offset="utc">
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <timer name="hpet" present="no"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:   </clock>
Jan 26 09:24:27 compute-1 nova_compute[183083]:   <cpu mode="host-model" match="exact">
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:   </cpu>
Jan 26 09:24:27 compute-1 nova_compute[183083]:   <devices>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <disk type="file" device="disk">
Jan 26 09:24:27 compute-1 nova_compute[183083]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/8bd1b013-4c5d-417a-ac90-fee845c3d159/disk"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:       <target dev="vda" bus="virtio"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     </disk>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <disk type="file" device="cdrom">
Jan 26 09:24:27 compute-1 nova_compute[183083]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:       <source file="/var/lib/nova/instances/8bd1b013-4c5d-417a-ac90-fee845c3d159/disk.config"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:       <target dev="sda" bus="sata"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     </disk>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <interface type="ethernet">
Jan 26 09:24:27 compute-1 nova_compute[183083]:       <mac address="fa:16:3e:92:c3:07"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:       <mtu size="1342"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:       <target dev="tap5aa77a60-fc"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     </interface>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <serial type="pty">
Jan 26 09:24:27 compute-1 nova_compute[183083]:       <log file="/var/lib/nova/instances/8bd1b013-4c5d-417a-ac90-fee845c3d159/console.log" append="off"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     </serial>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <video>
Jan 26 09:24:27 compute-1 nova_compute[183083]:       <model type="virtio"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     </video>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <input type="tablet" bus="usb"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <rng model="virtio">
Jan 26 09:24:27 compute-1 nova_compute[183083]:       <backend model="random">/dev/urandom</backend>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     </rng>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <controller type="usb" index="0"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     <memballoon model="virtio">
Jan 26 09:24:27 compute-1 nova_compute[183083]:       <stats period="10"/>
Jan 26 09:24:27 compute-1 nova_compute[183083]:     </memballoon>
Jan 26 09:24:27 compute-1 nova_compute[183083]:   </devices>
Jan 26 09:24:27 compute-1 nova_compute[183083]: </domain>
Jan 26 09:24:27 compute-1 nova_compute[183083]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.908 183087 DEBUG nova.compute.manager [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Preparing to wait for external event network-vif-plugged-5aa77a60-fcf0-426b-b9e5-2ff2957856e8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.909 183087 DEBUG oslo_concurrency.lockutils [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Acquiring lock "8bd1b013-4c5d-417a-ac90-fee845c3d159-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.909 183087 DEBUG oslo_concurrency.lockutils [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Lock "8bd1b013-4c5d-417a-ac90-fee845c3d159-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.909 183087 DEBUG oslo_concurrency.lockutils [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Lock "8bd1b013-4c5d-417a-ac90-fee845c3d159-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.910 183087 DEBUG nova.virt.libvirt.vif [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T09:24:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-591621980',display_name='tempest-server-test-591621980',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-591621980',id=62,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ0In/4xMyfKShlxQSlxjG12nbv2C0HX1XSV6OVVsf7TFR49jvQjBw/toy9vczCc/4qWnGwOBcP7ZM2MCPR8MzRrcEwdBYas24+LrNAh4L4utPPYRBQXISG/+xLpH9Pyrw==',key_name='tempest-keypair-test-314781837',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2968fec2ed804be1b674f6246298b67a',ramdisk_id='',reservation_id='r-xqlu70x3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_
model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-OvnFdbAgingTest-1512062367',owner_user_name='tempest-OvnFdbAgingTest-1512062367-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T09:24:21Z,user_data=None,user_id='fea5fdfe49874e138b7b4a69a915e73e',uuid=8bd1b013-4c5d-417a-ac90-fee845c3d159,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5aa77a60-fcf0-426b-b9e5-2ff2957856e8", "address": "fa:16:3e:92:c3:07", "network": {"id": "df9227db-8d5b-40c3-bc17-cf71e8c5c2d4", "bridge": "br-int", "label": "tempest-test-network--112460421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2968fec2ed804be1b674f6246298b67a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aa77a60-fc", "ovs_interfaceid": "5aa77a60-fcf0-426b-b9e5-2ff2957856e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.910 183087 DEBUG nova.network.os_vif_util [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Converting VIF {"id": "5aa77a60-fcf0-426b-b9e5-2ff2957856e8", "address": "fa:16:3e:92:c3:07", "network": {"id": "df9227db-8d5b-40c3-bc17-cf71e8c5c2d4", "bridge": "br-int", "label": "tempest-test-network--112460421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2968fec2ed804be1b674f6246298b67a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aa77a60-fc", "ovs_interfaceid": "5aa77a60-fcf0-426b-b9e5-2ff2957856e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.911 183087 DEBUG nova.network.os_vif_util [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:92:c3:07,bridge_name='br-int',has_traffic_filtering=True,id=5aa77a60-fcf0-426b-b9e5-2ff2957856e8,network=Network(df9227db-8d5b-40c3-bc17-cf71e8c5c2d4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5aa77a60-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.911 183087 DEBUG os_vif [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:92:c3:07,bridge_name='br-int',has_traffic_filtering=True,id=5aa77a60-fcf0-426b-b9e5-2ff2957856e8,network=Network(df9227db-8d5b-40c3-bc17-cf71e8c5c2d4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5aa77a60-fc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.912 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.912 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.913 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.916 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.916 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5aa77a60-fc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.917 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5aa77a60-fc, col_values=(('external_ids', {'iface-id': '5aa77a60-fcf0-426b-b9e5-2ff2957856e8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:92:c3:07', 'vm-uuid': '8bd1b013-4c5d-417a-ac90-fee845c3d159'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.919 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:27 compute-1 NetworkManager[55451]: <info>  [1769419467.9205] manager: (tap5aa77a60-fc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/117)
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.921 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.930 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.931 183087 INFO os_vif [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:92:c3:07,bridge_name='br-int',has_traffic_filtering=True,id=5aa77a60-fcf0-426b-b9e5-2ff2957856e8,network=Network(df9227db-8d5b-40c3-bc17-cf71e8c5c2d4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5aa77a60-fc')
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.990 183087 DEBUG nova.virt.libvirt.driver [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.991 183087 DEBUG nova.virt.libvirt.driver [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.991 183087 DEBUG nova.virt.libvirt.driver [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] No VIF found with MAC fa:16:3e:92:c3:07, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 26 09:24:27 compute-1 nova_compute[183083]: 2026-01-26 09:24:27.991 183087 INFO nova.virt.libvirt.driver [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Using config drive
Jan 26 09:24:28 compute-1 nova_compute[183083]: 2026-01-26 09:24:28.558 183087 INFO nova.virt.libvirt.driver [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Creating config drive at /var/lib/nova/instances/8bd1b013-4c5d-417a-ac90-fee845c3d159/disk.config
Jan 26 09:24:28 compute-1 nova_compute[183083]: 2026-01-26 09:24:28.563 183087 DEBUG oslo_concurrency.processutils [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8bd1b013-4c5d-417a-ac90-fee845c3d159/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdx9vaije execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:24:28 compute-1 nova_compute[183083]: 2026-01-26 09:24:28.694 183087 DEBUG oslo_concurrency.processutils [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8bd1b013-4c5d-417a-ac90-fee845c3d159/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdx9vaije" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:24:28 compute-1 kernel: tap5aa77a60-fc: entered promiscuous mode
Jan 26 09:24:28 compute-1 NetworkManager[55451]: <info>  [1769419468.8021] manager: (tap5aa77a60-fc): new Tun device (/org/freedesktop/NetworkManager/Devices/118)
Jan 26 09:24:28 compute-1 ovn_controller[95352]: 2026-01-26T09:24:28Z|00370|binding|INFO|Claiming lport 5aa77a60-fcf0-426b-b9e5-2ff2957856e8 for this chassis.
Jan 26 09:24:28 compute-1 nova_compute[183083]: 2026-01-26 09:24:28.801 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:28 compute-1 ovn_controller[95352]: 2026-01-26T09:24:28Z|00371|binding|INFO|5aa77a60-fcf0-426b-b9e5-2ff2957856e8: Claiming fa:16:3e:92:c3:07 10.100.0.3
Jan 26 09:24:28 compute-1 systemd-udevd[230671]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 09:24:28 compute-1 NetworkManager[55451]: <info>  [1769419468.8588] device (tap5aa77a60-fc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 09:24:28 compute-1 NetworkManager[55451]: <info>  [1769419468.8600] device (tap5aa77a60-fc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 09:24:28 compute-1 systemd-machined[154360]: New machine qemu-22-instance-0000003e.
Jan 26 09:24:28 compute-1 nova_compute[183083]: 2026-01-26 09:24:28.877 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:28 compute-1 ovn_controller[95352]: 2026-01-26T09:24:28Z|00372|binding|INFO|Setting lport 5aa77a60-fcf0-426b-b9e5-2ff2957856e8 ovn-installed in OVS
Jan 26 09:24:28 compute-1 nova_compute[183083]: 2026-01-26 09:24:28.890 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:28 compute-1 systemd[1]: Started Virtual Machine qemu-22-instance-0000003e.
Jan 26 09:24:29 compute-1 ovn_controller[95352]: 2026-01-26T09:24:29Z|00373|binding|INFO|Setting lport 5aa77a60-fcf0-426b-b9e5-2ff2957856e8 up in Southbound
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:29.011 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:92:c3:07 10.100.0.3'], port_security=['fa:16:3e:92:c3:07 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '8bd1b013-4c5d-417a-ac90-fee845c3d159', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-df9227db-8d5b-40c3-bc17-cf71e8c5c2d4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2968fec2ed804be1b674f6246298b67a', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c98f1787-876c-4082-a21c-ef37670072b9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=94f8421d-595b-4d62-815b-fdcc7cbf6286, chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=5aa77a60-fcf0-426b-b9e5-2ff2957856e8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:29.013 104632 INFO neutron.agent.ovn.metadata.agent [-] Port 5aa77a60-fcf0-426b-b9e5-2ff2957856e8 in datapath df9227db-8d5b-40c3-bc17-cf71e8c5c2d4 bound to our chassis
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:29.014 104632 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network df9227db-8d5b-40c3-bc17-cf71e8c5c2d4
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:29.031 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[3f0ce786-41a3-468d-a7fc-4cf5446c7a80]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:29.032 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdf9227db-81 in ovnmeta-df9227db-8d5b-40c3-bc17-cf71e8c5c2d4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:29.036 212483 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdf9227db-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:29.036 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[d2cf2605-59cb-4e1b-847d-9fcf6b49a4ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:29.037 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[06f19954-383f-470f-b3ed-b214d086b341]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:29.055 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[54a6fb91-6382-467d-96ab-6466b1319e2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:29.072 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[c4cbdd38-9165-4051-8f5c-57aae4876964]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:29.103 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[4a0297f6-8ce4-4e16-8baa-1fd5e0b59766]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:24:29 compute-1 NetworkManager[55451]: <info>  [1769419469.1125] manager: (tapdf9227db-80): new Veth device (/org/freedesktop/NetworkManager/Devices/119)
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:29.111 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[8455b6d7-6d5d-44e1-bf68-9a01c7831e9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:29.161 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[21bef348-db0d-4c92-8093-3d93b862f775]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:29.165 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[940575f3-fade-488a-a1df-76fb706a70bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:24:29 compute-1 NetworkManager[55451]: <info>  [1769419469.1897] device (tapdf9227db-80): carrier: link connected
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:29.196 212523 DEBUG oslo.privsep.daemon [-] privsep: reply[15ef6ca4-884d-47cc-b636-bce131c81df8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:29.215 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[b5736c81-6e94-40d8-9196-75067c21273a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdf9227db-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b8:47:6d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 76], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580979, 'reachable_time': 36189, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230706, 'error': None, 'target': 'ovnmeta-df9227db-8d5b-40c3-bc17-cf71e8c5c2d4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.219 183087 DEBUG nova.network.neutron [req-5c8f2e92-906b-4701-bfd8-cfb93789465d req-667ec792-7aa8-4d4a-9d33-ca256bd66c08 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Updated VIF entry in instance network info cache for port 5aa77a60-fcf0-426b-b9e5-2ff2957856e8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.220 183087 DEBUG nova.network.neutron [req-5c8f2e92-906b-4701-bfd8-cfb93789465d req-667ec792-7aa8-4d4a-9d33-ca256bd66c08 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Updating instance_info_cache with network_info: [{"id": "5aa77a60-fcf0-426b-b9e5-2ff2957856e8", "address": "fa:16:3e:92:c3:07", "network": {"id": "df9227db-8d5b-40c3-bc17-cf71e8c5c2d4", "bridge": "br-int", "label": "tempest-test-network--112460421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2968fec2ed804be1b674f6246298b67a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aa77a60-fc", "ovs_interfaceid": "5aa77a60-fcf0-426b-b9e5-2ff2957856e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:29.239 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[149d1a42-0597-4e6d-a32c-7fbc28d2b1fb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb8:476d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 580979, 'tstamp': 580979}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230707, 'error': None, 'target': 'ovnmeta-df9227db-8d5b-40c3-bc17-cf71e8c5c2d4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.254 183087 DEBUG oslo_concurrency.lockutils [req-5c8f2e92-906b-4701-bfd8-cfb93789465d req-667ec792-7aa8-4d4a-9d33-ca256bd66c08 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-8bd1b013-4c5d-417a-ac90-fee845c3d159" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:29.259 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[74772fab-39c3-46af-8e27-b25e2191ba23]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdf9227db-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b8:47:6d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 76], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580979, 'reachable_time': 36189, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230708, 'error': None, 'target': 'ovnmeta-df9227db-8d5b-40c3-bc17-cf71e8c5c2d4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:29.297 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[89ab841a-92c9-4c97-ab4c-1b7e78500714]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.343 183087 DEBUG nova.compute.manager [req-bcb1cb34-3c41-4553-8645-3878e0ef65c2 req-31be41c6-524b-416b-a6dd-bc395cf0c2a2 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Received event network-vif-plugged-5aa77a60-fcf0-426b-b9e5-2ff2957856e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.343 183087 DEBUG oslo_concurrency.lockutils [req-bcb1cb34-3c41-4553-8645-3878e0ef65c2 req-31be41c6-524b-416b-a6dd-bc395cf0c2a2 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "8bd1b013-4c5d-417a-ac90-fee845c3d159-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.344 183087 DEBUG oslo_concurrency.lockutils [req-bcb1cb34-3c41-4553-8645-3878e0ef65c2 req-31be41c6-524b-416b-a6dd-bc395cf0c2a2 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "8bd1b013-4c5d-417a-ac90-fee845c3d159-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.344 183087 DEBUG oslo_concurrency.lockutils [req-bcb1cb34-3c41-4553-8645-3878e0ef65c2 req-31be41c6-524b-416b-a6dd-bc395cf0c2a2 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "8bd1b013-4c5d-417a-ac90-fee845c3d159-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.344 183087 DEBUG nova.compute.manager [req-bcb1cb34-3c41-4553-8645-3878e0ef65c2 req-31be41c6-524b-416b-a6dd-bc395cf0c2a2 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Processing event network-vif-plugged-5aa77a60-fcf0-426b-b9e5-2ff2957856e8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:29.365 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[435ab1fd-1043-4183-90e1-30554fef4952]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:29.368 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf9227db-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:29.368 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:29.369 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdf9227db-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:24:29 compute-1 NetworkManager[55451]: <info>  [1769419469.3717] manager: (tapdf9227db-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/120)
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.371 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:29 compute-1 kernel: tapdf9227db-80: entered promiscuous mode
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:29.375 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdf9227db-80, col_values=(('external_ids', {'iface-id': '996988ab-7aaf-41cc-9cc5-8cd9afda3702'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.377 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:29 compute-1 ovn_controller[95352]: 2026-01-26T09:24:29Z|00374|binding|INFO|Releasing lport 996988ab-7aaf-41cc-9cc5-8cd9afda3702 from this chassis (sb_readonly=0)
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.378 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:29.378 104632 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/df9227db-8d5b-40c3-bc17-cf71e8c5c2d4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/df9227db-8d5b-40c3-bc17-cf71e8c5c2d4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:29.380 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[183c4795-aced-465b-9042-cc90027c6539]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:29.381 104632 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]: global
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]:     log         /dev/log local0 debug
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]:     log-tag     haproxy-metadata-proxy-df9227db-8d5b-40c3-bc17-cf71e8c5c2d4
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]:     user        root
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]:     group       root
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]:     maxconn     1024
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]:     pidfile     /var/lib/neutron/external/pids/df9227db-8d5b-40c3-bc17-cf71e8c5c2d4.pid.haproxy
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]:     daemon
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]: 
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]: defaults
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]:     log global
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]:     mode http
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]:     option httplog
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]:     option dontlognull
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]:     option http-server-close
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]:     option forwardfor
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]:     retries                 3
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]:     timeout http-request    30s
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]:     timeout connect         30s
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]:     timeout client          32s
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]:     timeout server          32s
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]:     timeout http-keep-alive 30s
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]: 
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]: 
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]: listen listener
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]:     bind 169.254.169.254:80
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]:     http-request add-header X-OVN-Network-ID df9227db-8d5b-40c3-bc17-cf71e8c5c2d4
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 26 09:24:29 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:29.382 104632 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-df9227db-8d5b-40c3-bc17-cf71e8c5c2d4', 'env', 'PROCESS_TAG=haproxy-df9227db-8d5b-40c3-bc17-cf71e8c5c2d4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/df9227db-8d5b-40c3-bc17-cf71e8c5c2d4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.382 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769419469.3818135, 8bd1b013-4c5d-417a-ac90-fee845c3d159 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.383 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] VM Started (Lifecycle Event)
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.386 183087 DEBUG nova.compute.manager [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.389 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.392 183087 DEBUG nova.virt.libvirt.driver [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.396 183087 INFO nova.virt.libvirt.driver [-] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Instance spawned successfully.
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.396 183087 DEBUG nova.virt.libvirt.driver [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.414 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.419 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.428 183087 DEBUG nova.virt.libvirt.driver [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.429 183087 DEBUG nova.virt.libvirt.driver [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.430 183087 DEBUG nova.virt.libvirt.driver [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.430 183087 DEBUG nova.virt.libvirt.driver [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.431 183087 DEBUG nova.virt.libvirt.driver [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.431 183087 DEBUG nova.virt.libvirt.driver [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.457 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.458 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769419469.383309, 8bd1b013-4c5d-417a-ac90-fee845c3d159 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.459 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] VM Paused (Lifecycle Event)
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.465 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.501 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.506 183087 DEBUG nova.virt.driver [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] Emitting event <LifecycleEvent: 1769419469.391399, 8bd1b013-4c5d-417a-ac90-fee845c3d159 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.507 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] VM Resumed (Lifecycle Event)
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.513 183087 INFO nova.compute.manager [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Took 7.99 seconds to spawn the instance on the hypervisor.
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.514 183087 DEBUG nova.compute.manager [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.527 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.531 183087 DEBUG nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.558 183087 INFO nova.compute.manager [None req-794212b8-67f1-43ca-9ec6-ad6cfdcb2e69 - - - - - -] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.581 183087 INFO nova.compute.manager [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Took 8.36 seconds to build instance.
Jan 26 09:24:29 compute-1 nova_compute[183083]: 2026-01-26 09:24:29.600 183087 DEBUG oslo_concurrency.lockutils [None req-e0723688-567b-4e9f-bd4f-69349c80c7aa fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Lock "8bd1b013-4c5d-417a-ac90-fee845c3d159" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.444s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:24:29 compute-1 podman[230747]: 2026-01-26 09:24:29.813021449 +0000 UTC m=+0.061351834 container create 3009534bdeabf5a3e608ca03551dcd217dee5a1933ff0c1107b500d2351b5cb2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df9227db-8d5b-40c3-bc17-cf71e8c5c2d4, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 26 09:24:29 compute-1 systemd[1]: Started libpod-conmon-3009534bdeabf5a3e608ca03551dcd217dee5a1933ff0c1107b500d2351b5cb2.scope.
Jan 26 09:24:29 compute-1 podman[230747]: 2026-01-26 09:24:29.781944851 +0000 UTC m=+0.030275236 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 09:24:29 compute-1 systemd[1]: Started libcrun container.
Jan 26 09:24:29 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7de1c7d8da66dd848730ac41207649d81cef879ab2099857f9ce3df7cbaf19ac/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 09:24:29 compute-1 podman[230747]: 2026-01-26 09:24:29.930690912 +0000 UTC m=+0.179021367 container init 3009534bdeabf5a3e608ca03551dcd217dee5a1933ff0c1107b500d2351b5cb2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df9227db-8d5b-40c3-bc17-cf71e8c5c2d4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 09:24:29 compute-1 podman[230747]: 2026-01-26 09:24:29.941833136 +0000 UTC m=+0.190163551 container start 3009534bdeabf5a3e608ca03551dcd217dee5a1933ff0c1107b500d2351b5cb2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df9227db-8d5b-40c3-bc17-cf71e8c5c2d4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 26 09:24:29 compute-1 neutron-haproxy-ovnmeta-df9227db-8d5b-40c3-bc17-cf71e8c5c2d4[230762]: [NOTICE]   (230766) : New worker (230768) forked
Jan 26 09:24:29 compute-1 neutron-haproxy-ovnmeta-df9227db-8d5b-40c3-bc17-cf71e8c5c2d4[230762]: [NOTICE]   (230766) : Loading success.
Jan 26 09:24:30 compute-1 nova_compute[183083]: 2026-01-26 09:24:30.557 183087 INFO nova.compute.manager [None req-50a2c389-8c48-4147-9c1a-fa32652716b6 fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Get console output
Jan 26 09:24:30 compute-1 nova_compute[183083]: 2026-01-26 09:24:30.563 212375 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 26 09:24:31 compute-1 nova_compute[183083]: 2026-01-26 09:24:31.425 183087 DEBUG nova.compute.manager [req-9e68a27a-e6f2-4308-be24-52177c0b8191 req-f4f199a7-4453-4bcc-aaea-45328faccc92 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Received event network-vif-plugged-5aa77a60-fcf0-426b-b9e5-2ff2957856e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:24:31 compute-1 nova_compute[183083]: 2026-01-26 09:24:31.426 183087 DEBUG oslo_concurrency.lockutils [req-9e68a27a-e6f2-4308-be24-52177c0b8191 req-f4f199a7-4453-4bcc-aaea-45328faccc92 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "8bd1b013-4c5d-417a-ac90-fee845c3d159-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:24:31 compute-1 nova_compute[183083]: 2026-01-26 09:24:31.426 183087 DEBUG oslo_concurrency.lockutils [req-9e68a27a-e6f2-4308-be24-52177c0b8191 req-f4f199a7-4453-4bcc-aaea-45328faccc92 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "8bd1b013-4c5d-417a-ac90-fee845c3d159-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:24:31 compute-1 nova_compute[183083]: 2026-01-26 09:24:31.427 183087 DEBUG oslo_concurrency.lockutils [req-9e68a27a-e6f2-4308-be24-52177c0b8191 req-f4f199a7-4453-4bcc-aaea-45328faccc92 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "8bd1b013-4c5d-417a-ac90-fee845c3d159-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:24:31 compute-1 nova_compute[183083]: 2026-01-26 09:24:31.427 183087 DEBUG nova.compute.manager [req-9e68a27a-e6f2-4308-be24-52177c0b8191 req-f4f199a7-4453-4bcc-aaea-45328faccc92 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] No waiting events found dispatching network-vif-plugged-5aa77a60-fcf0-426b-b9e5-2ff2957856e8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 09:24:31 compute-1 nova_compute[183083]: 2026-01-26 09:24:31.428 183087 WARNING nova.compute.manager [req-9e68a27a-e6f2-4308-be24-52177c0b8191 req-f4f199a7-4453-4bcc-aaea-45328faccc92 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Received unexpected event network-vif-plugged-5aa77a60-fcf0-426b-b9e5-2ff2957856e8 for instance with vm_state active and task_state None.
Jan 26 09:24:31 compute-1 nova_compute[183083]: 2026-01-26 09:24:31.850 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:24:31 compute-1 nova_compute[183083]: 2026-01-26 09:24:31.852 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:24:31 compute-1 nova_compute[183083]: 2026-01-26 09:24:31.853 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:24:32 compute-1 nova_compute[183083]: 2026-01-26 09:24:32.459 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "refresh_cache-8bd1b013-4c5d-417a-ac90-fee845c3d159" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:24:32 compute-1 nova_compute[183083]: 2026-01-26 09:24:32.459 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquired lock "refresh_cache-8bd1b013-4c5d-417a-ac90-fee845c3d159" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:24:32 compute-1 nova_compute[183083]: 2026-01-26 09:24:32.459 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 26 09:24:32 compute-1 nova_compute[183083]: 2026-01-26 09:24:32.460 183087 DEBUG nova.objects.instance [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 8bd1b013-4c5d-417a-ac90-fee845c3d159 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 09:24:32 compute-1 nova_compute[183083]: 2026-01-26 09:24:32.950 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:33 compute-1 nova_compute[183083]: 2026-01-26 09:24:33.725 183087 DEBUG nova.network.neutron [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Updating instance_info_cache with network_info: [{"id": "5aa77a60-fcf0-426b-b9e5-2ff2957856e8", "address": "fa:16:3e:92:c3:07", "network": {"id": "df9227db-8d5b-40c3-bc17-cf71e8c5c2d4", "bridge": "br-int", "label": "tempest-test-network--112460421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2968fec2ed804be1b674f6246298b67a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aa77a60-fc", "ovs_interfaceid": "5aa77a60-fcf0-426b-b9e5-2ff2957856e8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:24:33 compute-1 nova_compute[183083]: 2026-01-26 09:24:33.760 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Releasing lock "refresh_cache-8bd1b013-4c5d-417a-ac90-fee845c3d159" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:24:33 compute-1 nova_compute[183083]: 2026-01-26 09:24:33.761 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 26 09:24:33 compute-1 nova_compute[183083]: 2026-01-26 09:24:33.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:24:33 compute-1 nova_compute[183083]: 2026-01-26 09:24:33.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:24:34 compute-1 nova_compute[183083]: 2026-01-26 09:24:34.467 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:35 compute-1 nova_compute[183083]: 2026-01-26 09:24:35.653 183087 INFO nova.compute.manager [None req-e73ff479-3b2e-4d14-a34a-d774afc2c49a fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Get console output
Jan 26 09:24:37 compute-1 nova_compute[183083]: 2026-01-26 09:24:37.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:24:37 compute-1 nova_compute[183083]: 2026-01-26 09:24:37.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:24:37 compute-1 nova_compute[183083]: 2026-01-26 09:24:37.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:24:37 compute-1 nova_compute[183083]: 2026-01-26 09:24:37.953 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:38 compute-1 nova_compute[183083]: 2026-01-26 09:24:38.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:24:39 compute-1 nova_compute[183083]: 2026-01-26 09:24:39.469 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:40 compute-1 podman[230783]: 2026-01-26 09:24:40.826742046 +0000 UTC m=+0.075945066 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 26 09:24:40 compute-1 nova_compute[183083]: 2026-01-26 09:24:40.853 183087 INFO nova.compute.manager [None req-ecda07ba-5e84-4e5f-b28d-74e5c771da1e fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Get console output
Jan 26 09:24:40 compute-1 nova_compute[183083]: 2026-01-26 09:24:40.859 212375 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 26 09:24:42 compute-1 nova_compute[183083]: 2026-01-26 09:24:42.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:24:42 compute-1 nova_compute[183083]: 2026-01-26 09:24:42.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:24:42 compute-1 nova_compute[183083]: 2026-01-26 09:24:42.957 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:43 compute-1 ovn_controller[95352]: 2026-01-26T09:24:43Z|00051|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:92:c3:07 10.100.0.3
Jan 26 09:24:43 compute-1 ovn_controller[95352]: 2026-01-26T09:24:43Z|00052|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:92:c3:07 10.100.0.3
Jan 26 09:24:44 compute-1 nova_compute[183083]: 2026-01-26 09:24:44.471 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:46 compute-1 nova_compute[183083]: 2026-01-26 09:24:46.018 183087 INFO nova.compute.manager [None req-980d955b-73b3-48c3-9068-ceae35b5423b fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Get console output
Jan 26 09:24:46 compute-1 nova_compute[183083]: 2026-01-26 09:24:46.027 212375 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 26 09:24:46 compute-1 nova_compute[183083]: 2026-01-26 09:24:46.991 183087 DEBUG nova.compute.manager [req-c423e828-5c70-4918-b248-ce594713a312 req-bc9ea49f-f5e1-4ee4-b22e-96af15461308 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Received event network-changed-5aa77a60-fcf0-426b-b9e5-2ff2957856e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:24:46 compute-1 nova_compute[183083]: 2026-01-26 09:24:46.992 183087 DEBUG nova.compute.manager [req-c423e828-5c70-4918-b248-ce594713a312 req-bc9ea49f-f5e1-4ee4-b22e-96af15461308 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Refreshing instance network info cache due to event network-changed-5aa77a60-fcf0-426b-b9e5-2ff2957856e8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 09:24:46 compute-1 nova_compute[183083]: 2026-01-26 09:24:46.992 183087 DEBUG oslo_concurrency.lockutils [req-c423e828-5c70-4918-b248-ce594713a312 req-bc9ea49f-f5e1-4ee4-b22e-96af15461308 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-8bd1b013-4c5d-417a-ac90-fee845c3d159" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:24:46 compute-1 nova_compute[183083]: 2026-01-26 09:24:46.993 183087 DEBUG oslo_concurrency.lockutils [req-c423e828-5c70-4918-b248-ce594713a312 req-bc9ea49f-f5e1-4ee4-b22e-96af15461308 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-8bd1b013-4c5d-417a-ac90-fee845c3d159" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:24:46 compute-1 nova_compute[183083]: 2026-01-26 09:24:46.993 183087 DEBUG nova.network.neutron [req-c423e828-5c70-4918-b248-ce594713a312 req-bc9ea49f-f5e1-4ee4-b22e-96af15461308 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Refreshing network info cache for port 5aa77a60-fcf0-426b-b9e5-2ff2957856e8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 09:24:47 compute-1 nova_compute[183083]: 2026-01-26 09:24:47.961 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:48 compute-1 nova_compute[183083]: 2026-01-26 09:24:48.843 183087 DEBUG nova.network.neutron [req-c423e828-5c70-4918-b248-ce594713a312 req-bc9ea49f-f5e1-4ee4-b22e-96af15461308 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Updated VIF entry in instance network info cache for port 5aa77a60-fcf0-426b-b9e5-2ff2957856e8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 09:24:48 compute-1 nova_compute[183083]: 2026-01-26 09:24:48.844 183087 DEBUG nova.network.neutron [req-c423e828-5c70-4918-b248-ce594713a312 req-bc9ea49f-f5e1-4ee4-b22e-96af15461308 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Updating instance_info_cache with network_info: [{"id": "5aa77a60-fcf0-426b-b9e5-2ff2957856e8", "address": "fa:16:3e:92:c3:07", "network": {"id": "df9227db-8d5b-40c3-bc17-cf71e8c5c2d4", "bridge": "br-int", "label": "tempest-test-network--112460421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2968fec2ed804be1b674f6246298b67a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aa77a60-fc", "ovs_interfaceid": "5aa77a60-fcf0-426b-b9e5-2ff2957856e8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:24:48 compute-1 nova_compute[183083]: 2026-01-26 09:24:48.870 183087 DEBUG oslo_concurrency.lockutils [req-c423e828-5c70-4918-b248-ce594713a312 req-bc9ea49f-f5e1-4ee4-b22e-96af15461308 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-8bd1b013-4c5d-417a-ac90-fee845c3d159" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:24:48 compute-1 nova_compute[183083]: 2026-01-26 09:24:48.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:24:49 compute-1 nova_compute[183083]: 2026-01-26 09:24:49.004 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:24:49 compute-1 nova_compute[183083]: 2026-01-26 09:24:49.005 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:24:49 compute-1 nova_compute[183083]: 2026-01-26 09:24:49.006 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:24:49 compute-1 nova_compute[183083]: 2026-01-26 09:24:49.006 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:24:49 compute-1 nova_compute[183083]: 2026-01-26 09:24:49.121 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8bd1b013-4c5d-417a-ac90-fee845c3d159/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:24:49 compute-1 nova_compute[183083]: 2026-01-26 09:24:49.213 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8bd1b013-4c5d-417a-ac90-fee845c3d159/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:24:49 compute-1 nova_compute[183083]: 2026-01-26 09:24:49.215 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8bd1b013-4c5d-417a-ac90-fee845c3d159/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 09:24:49 compute-1 nova_compute[183083]: 2026-01-26 09:24:49.302 183087 DEBUG oslo_concurrency.processutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8bd1b013-4c5d-417a-ac90-fee845c3d159/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 09:24:49 compute-1 nova_compute[183083]: 2026-01-26 09:24:49.442 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:24:49 compute-1 nova_compute[183083]: 2026-01-26 09:24:49.444 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13456MB free_disk=113.05499267578125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:24:49 compute-1 nova_compute[183083]: 2026-01-26 09:24:49.444 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:24:49 compute-1 nova_compute[183083]: 2026-01-26 09:24:49.444 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:24:49 compute-1 nova_compute[183083]: 2026-01-26 09:24:49.473 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:49 compute-1 nova_compute[183083]: 2026-01-26 09:24:49.625 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Instance 8bd1b013-4c5d-417a-ac90-fee845c3d159 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 09:24:49 compute-1 nova_compute[183083]: 2026-01-26 09:24:49.625 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:24:49 compute-1 nova_compute[183083]: 2026-01-26 09:24:49.625 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=640MB phys_disk=119GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:24:49 compute-1 nova_compute[183083]: 2026-01-26 09:24:49.679 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:24:49 compute-1 nova_compute[183083]: 2026-01-26 09:24:49.697 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:24:49 compute-1 nova_compute[183083]: 2026-01-26 09:24:49.735 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:24:49 compute-1 nova_compute[183083]: 2026-01-26 09:24:49.736 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.291s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:24:52 compute-1 nova_compute[183083]: 2026-01-26 09:24:52.964 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:53 compute-1 nova_compute[183083]: 2026-01-26 09:24:53.891 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:54 compute-1 nova_compute[183083]: 2026-01-26 09:24:54.476 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:55 compute-1 nova_compute[183083]: 2026-01-26 09:24:55.790 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:55 compute-1 ovn_controller[95352]: 2026-01-26T09:24:55Z|00375|binding|INFO|Releasing lport 996988ab-7aaf-41cc-9cc5-8cd9afda3702 from this chassis (sb_readonly=0)
Jan 26 09:24:55 compute-1 NetworkManager[55451]: <info>  [1769419495.7921] manager: (patch-br-int-to-provnet-149e76db-406a-40c9-b6a7-879b1da420de): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/121)
Jan 26 09:24:55 compute-1 NetworkManager[55451]: <info>  [1769419495.7934] manager: (patch-provnet-149e76db-406a-40c9-b6a7-879b1da420de-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/122)
Jan 26 09:24:55 compute-1 ovn_controller[95352]: 2026-01-26T09:24:55Z|00376|binding|INFO|Releasing lport 996988ab-7aaf-41cc-9cc5-8cd9afda3702 from this chassis (sb_readonly=0)
Jan 26 09:24:55 compute-1 nova_compute[183083]: 2026-01-26 09:24:55.803 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:55 compute-1 nova_compute[183083]: 2026-01-26 09:24:55.808 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:57 compute-1 nova_compute[183083]: 2026-01-26 09:24:57.524 183087 DEBUG nova.compute.manager [req-72df401d-dce3-4af3-af9b-760118efc661 req-f86c27c4-0efe-4278-9f1f-4697d1b6818f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Received event network-changed-5aa77a60-fcf0-426b-b9e5-2ff2957856e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:24:57 compute-1 nova_compute[183083]: 2026-01-26 09:24:57.525 183087 DEBUG nova.compute.manager [req-72df401d-dce3-4af3-af9b-760118efc661 req-f86c27c4-0efe-4278-9f1f-4697d1b6818f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Refreshing instance network info cache due to event network-changed-5aa77a60-fcf0-426b-b9e5-2ff2957856e8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 09:24:57 compute-1 nova_compute[183083]: 2026-01-26 09:24:57.525 183087 DEBUG oslo_concurrency.lockutils [req-72df401d-dce3-4af3-af9b-760118efc661 req-f86c27c4-0efe-4278-9f1f-4697d1b6818f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-8bd1b013-4c5d-417a-ac90-fee845c3d159" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:24:57 compute-1 nova_compute[183083]: 2026-01-26 09:24:57.525 183087 DEBUG oslo_concurrency.lockutils [req-72df401d-dce3-4af3-af9b-760118efc661 req-f86c27c4-0efe-4278-9f1f-4697d1b6818f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-8bd1b013-4c5d-417a-ac90-fee845c3d159" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:24:57 compute-1 nova_compute[183083]: 2026-01-26 09:24:57.526 183087 DEBUG nova.network.neutron [req-72df401d-dce3-4af3-af9b-760118efc661 req-f86c27c4-0efe-4278-9f1f-4697d1b6818f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Refreshing network info cache for port 5aa77a60-fcf0-426b-b9e5-2ff2957856e8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 09:24:57 compute-1 podman[230834]: 2026-01-26 09:24:57.816259165 +0000 UTC m=+0.072463567 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 26 09:24:57 compute-1 podman[230835]: 2026-01-26 09:24:57.824729404 +0000 UTC m=+0.075800471 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, vcs-type=git, release=1755695350, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, version=9.6, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, architecture=x86_64)
Jan 26 09:24:57 compute-1 podman[230837]: 2026-01-26 09:24:57.827222164 +0000 UTC m=+0.070890412 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 09:24:57 compute-1 podman[230836]: 2026-01-26 09:24:57.838501172 +0000 UTC m=+0.090063064 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 26 09:24:57 compute-1 podman[230833]: 2026-01-26 09:24:57.853087754 +0000 UTC m=+0.110515271 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 26 09:24:57 compute-1 nova_compute[183083]: 2026-01-26 09:24:57.966 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:58 compute-1 nova_compute[183083]: 2026-01-26 09:24:58.908 183087 DEBUG nova.network.neutron [req-72df401d-dce3-4af3-af9b-760118efc661 req-f86c27c4-0efe-4278-9f1f-4697d1b6818f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Updated VIF entry in instance network info cache for port 5aa77a60-fcf0-426b-b9e5-2ff2957856e8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 09:24:58 compute-1 nova_compute[183083]: 2026-01-26 09:24:58.909 183087 DEBUG nova.network.neutron [req-72df401d-dce3-4af3-af9b-760118efc661 req-f86c27c4-0efe-4278-9f1f-4697d1b6818f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Updating instance_info_cache with network_info: [{"id": "5aa77a60-fcf0-426b-b9e5-2ff2957856e8", "address": "fa:16:3e:92:c3:07", "network": {"id": "df9227db-8d5b-40c3-bc17-cf71e8c5c2d4", "bridge": "br-int", "label": "tempest-test-network--112460421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2968fec2ed804be1b674f6246298b67a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aa77a60-fc", "ovs_interfaceid": "5aa77a60-fcf0-426b-b9e5-2ff2957856e8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:24:58 compute-1 nova_compute[183083]: 2026-01-26 09:24:58.948 183087 DEBUG oslo_concurrency.lockutils [req-72df401d-dce3-4af3-af9b-760118efc661 req-f86c27c4-0efe-4278-9f1f-4697d1b6818f 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-8bd1b013-4c5d-417a-ac90-fee845c3d159" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:24:59 compute-1 nova_compute[183083]: 2026-01-26 09:24:59.244 183087 DEBUG oslo_concurrency.lockutils [None req-fcc160dd-6f7b-4b69-b718-47a4385a5df7 fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Acquiring lock "8bd1b013-4c5d-417a-ac90-fee845c3d159" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:24:59 compute-1 nova_compute[183083]: 2026-01-26 09:24:59.244 183087 DEBUG oslo_concurrency.lockutils [None req-fcc160dd-6f7b-4b69-b718-47a4385a5df7 fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Lock "8bd1b013-4c5d-417a-ac90-fee845c3d159" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:24:59 compute-1 nova_compute[183083]: 2026-01-26 09:24:59.245 183087 DEBUG oslo_concurrency.lockutils [None req-fcc160dd-6f7b-4b69-b718-47a4385a5df7 fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Acquiring lock "8bd1b013-4c5d-417a-ac90-fee845c3d159-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:24:59 compute-1 nova_compute[183083]: 2026-01-26 09:24:59.245 183087 DEBUG oslo_concurrency.lockutils [None req-fcc160dd-6f7b-4b69-b718-47a4385a5df7 fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Lock "8bd1b013-4c5d-417a-ac90-fee845c3d159-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:24:59 compute-1 nova_compute[183083]: 2026-01-26 09:24:59.245 183087 DEBUG oslo_concurrency.lockutils [None req-fcc160dd-6f7b-4b69-b718-47a4385a5df7 fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Lock "8bd1b013-4c5d-417a-ac90-fee845c3d159-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:24:59 compute-1 nova_compute[183083]: 2026-01-26 09:24:59.246 183087 INFO nova.compute.manager [None req-fcc160dd-6f7b-4b69-b718-47a4385a5df7 fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Terminating instance
Jan 26 09:24:59 compute-1 nova_compute[183083]: 2026-01-26 09:24:59.247 183087 DEBUG nova.compute.manager [None req-fcc160dd-6f7b-4b69-b718-47a4385a5df7 fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 09:24:59 compute-1 kernel: tap5aa77a60-fc (unregistering): left promiscuous mode
Jan 26 09:24:59 compute-1 NetworkManager[55451]: <info>  [1769419499.2791] device (tap5aa77a60-fc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 09:24:59 compute-1 ovn_controller[95352]: 2026-01-26T09:24:59Z|00377|binding|INFO|Releasing lport 5aa77a60-fcf0-426b-b9e5-2ff2957856e8 from this chassis (sb_readonly=0)
Jan 26 09:24:59 compute-1 ovn_controller[95352]: 2026-01-26T09:24:59Z|00378|binding|INFO|Setting lport 5aa77a60-fcf0-426b-b9e5-2ff2957856e8 down in Southbound
Jan 26 09:24:59 compute-1 ovn_controller[95352]: 2026-01-26T09:24:59Z|00379|binding|INFO|Removing iface tap5aa77a60-fc ovn-installed in OVS
Jan 26 09:24:59 compute-1 nova_compute[183083]: 2026-01-26 09:24:59.335 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:59 compute-1 nova_compute[183083]: 2026-01-26 09:24:59.346 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:59 compute-1 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000003e.scope: Deactivated successfully.
Jan 26 09:24:59 compute-1 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000003e.scope: Consumed 13.334s CPU time.
Jan 26 09:24:59 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:59.368 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:92:c3:07 10.100.0.3', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '8bd1b013-4c5d-417a-ac90-fee845c3d159', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-df9227db-8d5b-40c3-bc17-cf71e8c5c2d4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2968fec2ed804be1b674f6246298b67a', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.232'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=94f8421d-595b-4d62-815b-fdcc7cbf6286, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>], logical_port=5aa77a60-fcf0-426b-b9e5-2ff2957856e8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f35a6f97f10>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:24:59 compute-1 systemd-machined[154360]: Machine qemu-22-instance-0000003e terminated.
Jan 26 09:24:59 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:59.370 104632 INFO neutron.agent.ovn.metadata.agent [-] Port 5aa77a60-fcf0-426b-b9e5-2ff2957856e8 in datapath df9227db-8d5b-40c3-bc17-cf71e8c5c2d4 unbound from our chassis
Jan 26 09:24:59 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:59.372 104632 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network df9227db-8d5b-40c3-bc17-cf71e8c5c2d4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 26 09:24:59 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:59.374 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[c025263d-2169-4bc3-962e-68f8a7addd27]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:24:59 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:59.375 104632 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-df9227db-8d5b-40c3-bc17-cf71e8c5c2d4 namespace which is not needed anymore
Jan 26 09:24:59 compute-1 nova_compute[183083]: 2026-01-26 09:24:59.480 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:59 compute-1 nova_compute[183083]: 2026-01-26 09:24:59.531 183087 INFO nova.virt.libvirt.driver [-] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Instance destroyed successfully.
Jan 26 09:24:59 compute-1 nova_compute[183083]: 2026-01-26 09:24:59.532 183087 DEBUG nova.objects.instance [None req-fcc160dd-6f7b-4b69-b718-47a4385a5df7 fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Lazy-loading 'resources' on Instance uuid 8bd1b013-4c5d-417a-ac90-fee845c3d159 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 09:24:59 compute-1 neutron-haproxy-ovnmeta-df9227db-8d5b-40c3-bc17-cf71e8c5c2d4[230762]: [NOTICE]   (230766) : haproxy version is 2.8.14-c23fe91
Jan 26 09:24:59 compute-1 neutron-haproxy-ovnmeta-df9227db-8d5b-40c3-bc17-cf71e8c5c2d4[230762]: [NOTICE]   (230766) : path to executable is /usr/sbin/haproxy
Jan 26 09:24:59 compute-1 neutron-haproxy-ovnmeta-df9227db-8d5b-40c3-bc17-cf71e8c5c2d4[230762]: [WARNING]  (230766) : Exiting Master process...
Jan 26 09:24:59 compute-1 neutron-haproxy-ovnmeta-df9227db-8d5b-40c3-bc17-cf71e8c5c2d4[230762]: [ALERT]    (230766) : Current worker (230768) exited with code 143 (Terminated)
Jan 26 09:24:59 compute-1 neutron-haproxy-ovnmeta-df9227db-8d5b-40c3-bc17-cf71e8c5c2d4[230762]: [WARNING]  (230766) : All workers exited. Exiting... (0)
Jan 26 09:24:59 compute-1 nova_compute[183083]: 2026-01-26 09:24:59.551 183087 DEBUG nova.virt.libvirt.vif [None req-fcc160dd-6f7b-4b69-b718-47a4385a5df7 fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T09:24:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-591621980',display_name='tempest-server-test-591621980',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-591621980',id=62,image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ0In/4xMyfKShlxQSlxjG12nbv2C0HX1XSV6OVVsf7TFR49jvQjBw/toy9vczCc/4qWnGwOBcP7ZM2MCPR8MzRrcEwdBYas24+LrNAh4L4utPPYRBQXISG/+xLpH9Pyrw==',key_name='tempest-keypair-test-314781837',keypairs=<?>,launch_index=0,launched_at=2026-01-26T09:24:29Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2968fec2ed804be1b674f6246298b67a',ramdisk_id='',reservation_id='r-xqlu70x3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='13d1a20a-8003-4f19-aba7-ccbd9eff9b82',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_h
w_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-OvnFdbAgingTest-1512062367',owner_user_name='tempest-OvnFdbAgingTest-1512062367-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T09:24:29Z,user_data=None,user_id='fea5fdfe49874e138b7b4a69a915e73e',uuid=8bd1b013-4c5d-417a-ac90-fee845c3d159,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5aa77a60-fcf0-426b-b9e5-2ff2957856e8", "address": "fa:16:3e:92:c3:07", "network": {"id": "df9227db-8d5b-40c3-bc17-cf71e8c5c2d4", "bridge": "br-int", "label": "tempest-test-network--112460421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2968fec2ed804be1b674f6246298b67a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aa77a60-fc", "ovs_interfaceid": "5aa77a60-fcf0-426b-b9e5-2ff2957856e8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 09:24:59 compute-1 nova_compute[183083]: 2026-01-26 09:24:59.552 183087 DEBUG nova.network.os_vif_util [None req-fcc160dd-6f7b-4b69-b718-47a4385a5df7 fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Converting VIF {"id": "5aa77a60-fcf0-426b-b9e5-2ff2957856e8", "address": "fa:16:3e:92:c3:07", "network": {"id": "df9227db-8d5b-40c3-bc17-cf71e8c5c2d4", "bridge": "br-int", "label": "tempest-test-network--112460421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2968fec2ed804be1b674f6246298b67a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aa77a60-fc", "ovs_interfaceid": "5aa77a60-fcf0-426b-b9e5-2ff2957856e8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 09:24:59 compute-1 nova_compute[183083]: 2026-01-26 09:24:59.552 183087 DEBUG nova.network.os_vif_util [None req-fcc160dd-6f7b-4b69-b718-47a4385a5df7 fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:92:c3:07,bridge_name='br-int',has_traffic_filtering=True,id=5aa77a60-fcf0-426b-b9e5-2ff2957856e8,network=Network(df9227db-8d5b-40c3-bc17-cf71e8c5c2d4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5aa77a60-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 09:24:59 compute-1 nova_compute[183083]: 2026-01-26 09:24:59.553 183087 DEBUG os_vif [None req-fcc160dd-6f7b-4b69-b718-47a4385a5df7 fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:92:c3:07,bridge_name='br-int',has_traffic_filtering=True,id=5aa77a60-fcf0-426b-b9e5-2ff2957856e8,network=Network(df9227db-8d5b-40c3-bc17-cf71e8c5c2d4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5aa77a60-fc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 09:24:59 compute-1 systemd[1]: libpod-3009534bdeabf5a3e608ca03551dcd217dee5a1933ff0c1107b500d2351b5cb2.scope: Deactivated successfully.
Jan 26 09:24:59 compute-1 nova_compute[183083]: 2026-01-26 09:24:59.555 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:59 compute-1 nova_compute[183083]: 2026-01-26 09:24:59.555 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5aa77a60-fc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:24:59 compute-1 nova_compute[183083]: 2026-01-26 09:24:59.556 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:59 compute-1 nova_compute[183083]: 2026-01-26 09:24:59.558 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:59 compute-1 podman[230959]: 2026-01-26 09:24:59.561056059 +0000 UTC m=+0.072738975 container died 3009534bdeabf5a3e608ca03551dcd217dee5a1933ff0c1107b500d2351b5cb2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df9227db-8d5b-40c3-bc17-cf71e8c5c2d4, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 26 09:24:59 compute-1 nova_compute[183083]: 2026-01-26 09:24:59.562 183087 INFO os_vif [None req-fcc160dd-6f7b-4b69-b718-47a4385a5df7 fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:92:c3:07,bridge_name='br-int',has_traffic_filtering=True,id=5aa77a60-fcf0-426b-b9e5-2ff2957856e8,network=Network(df9227db-8d5b-40c3-bc17-cf71e8c5c2d4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5aa77a60-fc')
Jan 26 09:24:59 compute-1 nova_compute[183083]: 2026-01-26 09:24:59.563 183087 INFO nova.virt.libvirt.driver [None req-fcc160dd-6f7b-4b69-b718-47a4385a5df7 fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Deleting instance files /var/lib/nova/instances/8bd1b013-4c5d-417a-ac90-fee845c3d159_del
Jan 26 09:24:59 compute-1 nova_compute[183083]: 2026-01-26 09:24:59.564 183087 INFO nova.virt.libvirt.driver [None req-fcc160dd-6f7b-4b69-b718-47a4385a5df7 fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Deletion of /var/lib/nova/instances/8bd1b013-4c5d-417a-ac90-fee845c3d159_del complete
Jan 26 09:24:59 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3009534bdeabf5a3e608ca03551dcd217dee5a1933ff0c1107b500d2351b5cb2-userdata-shm.mount: Deactivated successfully.
Jan 26 09:24:59 compute-1 systemd[1]: var-lib-containers-storage-overlay-7de1c7d8da66dd848730ac41207649d81cef879ab2099857f9ce3df7cbaf19ac-merged.mount: Deactivated successfully.
Jan 26 09:24:59 compute-1 podman[230959]: 2026-01-26 09:24:59.602594682 +0000 UTC m=+0.114277578 container cleanup 3009534bdeabf5a3e608ca03551dcd217dee5a1933ff0c1107b500d2351b5cb2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df9227db-8d5b-40c3-bc17-cf71e8c5c2d4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 26 09:24:59 compute-1 systemd[1]: libpod-conmon-3009534bdeabf5a3e608ca03551dcd217dee5a1933ff0c1107b500d2351b5cb2.scope: Deactivated successfully.
Jan 26 09:24:59 compute-1 nova_compute[183083]: 2026-01-26 09:24:59.625 183087 INFO nova.compute.manager [None req-fcc160dd-6f7b-4b69-b718-47a4385a5df7 fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Took 0.38 seconds to destroy the instance on the hypervisor.
Jan 26 09:24:59 compute-1 nova_compute[183083]: 2026-01-26 09:24:59.626 183087 DEBUG oslo.service.loopingcall [None req-fcc160dd-6f7b-4b69-b718-47a4385a5df7 fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 26 09:24:59 compute-1 nova_compute[183083]: 2026-01-26 09:24:59.626 183087 DEBUG nova.compute.manager [-] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 09:24:59 compute-1 nova_compute[183083]: 2026-01-26 09:24:59.626 183087 DEBUG nova.network.neutron [-] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 09:24:59 compute-1 podman[231001]: 2026-01-26 09:24:59.688292062 +0000 UTC m=+0.057678459 container remove 3009534bdeabf5a3e608ca03551dcd217dee5a1933ff0c1107b500d2351b5cb2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df9227db-8d5b-40c3-bc17-cf71e8c5c2d4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 26 09:24:59 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:59.695 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[bf99d1d0-d644-4d91-a1be-6dd89a27614f]: (4, ('Mon Jan 26 09:24:59 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-df9227db-8d5b-40c3-bc17-cf71e8c5c2d4 (3009534bdeabf5a3e608ca03551dcd217dee5a1933ff0c1107b500d2351b5cb2)\n3009534bdeabf5a3e608ca03551dcd217dee5a1933ff0c1107b500d2351b5cb2\nMon Jan 26 09:24:59 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-df9227db-8d5b-40c3-bc17-cf71e8c5c2d4 (3009534bdeabf5a3e608ca03551dcd217dee5a1933ff0c1107b500d2351b5cb2)\n3009534bdeabf5a3e608ca03551dcd217dee5a1933ff0c1107b500d2351b5cb2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:24:59 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:59.696 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[61833d95-1498-428e-8614-51e620826690]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:24:59 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:59.697 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf9227db-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:24:59 compute-1 nova_compute[183083]: 2026-01-26 09:24:59.699 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:59 compute-1 kernel: tapdf9227db-80: left promiscuous mode
Jan 26 09:24:59 compute-1 nova_compute[183083]: 2026-01-26 09:24:59.711 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:59 compute-1 nova_compute[183083]: 2026-01-26 09:24:59.713 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:24:59 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:59.716 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[420415fa-e41d-46df-b194-000866c31599]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:24:59 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:59.737 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[b2f7c031-519c-490f-943f-4c4515b53a6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:24:59 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:59.738 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[55498b1e-7c8f-4df0-860c-846a8aa382d7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:24:59 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:59.757 212483 DEBUG oslo.privsep.daemon [-] privsep: reply[76ec43d5-736b-4e12-8f49-099215d40ba0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580970, 'reachable_time': 26841, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231016, 'error': None, 'target': 'ovnmeta-df9227db-8d5b-40c3-bc17-cf71e8c5c2d4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:24:59 compute-1 systemd[1]: run-netns-ovnmeta\x2ddf9227db\x2d8d5b\x2d40c3\x2dbc17\x2dcf71e8c5c2d4.mount: Deactivated successfully.
Jan 26 09:24:59 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:59.762 105024 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-df9227db-8d5b-40c3-bc17-cf71e8c5c2d4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 26 09:24:59 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:24:59.762 105024 DEBUG oslo.privsep.daemon [-] privsep: reply[e7d7db95-d73c-41eb-823e-1d84961928b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 09:25:00 compute-1 nova_compute[183083]: 2026-01-26 09:25:00.660 183087 DEBUG nova.compute.manager [req-cb86165d-a611-4f2f-ab20-e0972ff5ec02 req-edac6f47-012f-4923-b4a5-612f843bf6f8 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Received event network-vif-unplugged-5aa77a60-fcf0-426b-b9e5-2ff2957856e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:25:00 compute-1 nova_compute[183083]: 2026-01-26 09:25:00.661 183087 DEBUG oslo_concurrency.lockutils [req-cb86165d-a611-4f2f-ab20-e0972ff5ec02 req-edac6f47-012f-4923-b4a5-612f843bf6f8 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "8bd1b013-4c5d-417a-ac90-fee845c3d159-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:25:00 compute-1 nova_compute[183083]: 2026-01-26 09:25:00.661 183087 DEBUG oslo_concurrency.lockutils [req-cb86165d-a611-4f2f-ab20-e0972ff5ec02 req-edac6f47-012f-4923-b4a5-612f843bf6f8 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "8bd1b013-4c5d-417a-ac90-fee845c3d159-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:25:00 compute-1 nova_compute[183083]: 2026-01-26 09:25:00.662 183087 DEBUG oslo_concurrency.lockutils [req-cb86165d-a611-4f2f-ab20-e0972ff5ec02 req-edac6f47-012f-4923-b4a5-612f843bf6f8 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "8bd1b013-4c5d-417a-ac90-fee845c3d159-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:25:00 compute-1 nova_compute[183083]: 2026-01-26 09:25:00.662 183087 DEBUG nova.compute.manager [req-cb86165d-a611-4f2f-ab20-e0972ff5ec02 req-edac6f47-012f-4923-b4a5-612f843bf6f8 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] No waiting events found dispatching network-vif-unplugged-5aa77a60-fcf0-426b-b9e5-2ff2957856e8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 09:25:00 compute-1 nova_compute[183083]: 2026-01-26 09:25:00.662 183087 DEBUG nova.compute.manager [req-cb86165d-a611-4f2f-ab20-e0972ff5ec02 req-edac6f47-012f-4923-b4a5-612f843bf6f8 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Received event network-vif-unplugged-5aa77a60-fcf0-426b-b9e5-2ff2957856e8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 26 09:25:01 compute-1 nova_compute[183083]: 2026-01-26 09:25:01.692 183087 DEBUG nova.network.neutron [-] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:25:01 compute-1 nova_compute[183083]: 2026-01-26 09:25:01.708 183087 INFO nova.compute.manager [-] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Took 2.08 seconds to deallocate network for instance.
Jan 26 09:25:01 compute-1 nova_compute[183083]: 2026-01-26 09:25:01.816 183087 DEBUG oslo_concurrency.lockutils [None req-fcc160dd-6f7b-4b69-b718-47a4385a5df7 fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:25:01 compute-1 nova_compute[183083]: 2026-01-26 09:25:01.817 183087 DEBUG oslo_concurrency.lockutils [None req-fcc160dd-6f7b-4b69-b718-47a4385a5df7 fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:25:01 compute-1 nova_compute[183083]: 2026-01-26 09:25:01.873 183087 DEBUG nova.compute.provider_tree [None req-fcc160dd-6f7b-4b69-b718-47a4385a5df7 fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:25:01 compute-1 nova_compute[183083]: 2026-01-26 09:25:01.886 183087 DEBUG nova.scheduler.client.report [None req-fcc160dd-6f7b-4b69-b718-47a4385a5df7 fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:25:01 compute-1 nova_compute[183083]: 2026-01-26 09:25:01.912 183087 DEBUG oslo_concurrency.lockutils [None req-fcc160dd-6f7b-4b69-b718-47a4385a5df7 fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:25:01 compute-1 nova_compute[183083]: 2026-01-26 09:25:01.950 183087 INFO nova.scheduler.client.report [None req-fcc160dd-6f7b-4b69-b718-47a4385a5df7 fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Deleted allocations for instance 8bd1b013-4c5d-417a-ac90-fee845c3d159
Jan 26 09:25:02 compute-1 nova_compute[183083]: 2026-01-26 09:25:02.019 183087 DEBUG oslo_concurrency.lockutils [None req-fcc160dd-6f7b-4b69-b718-47a4385a5df7 fea5fdfe49874e138b7b4a69a915e73e 2968fec2ed804be1b674f6246298b67a - - default default] Lock "8bd1b013-4c5d-417a-ac90-fee845c3d159" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.774s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:25:02 compute-1 nova_compute[183083]: 2026-01-26 09:25:02.732 183087 DEBUG nova.compute.manager [req-aa45e1dc-5e90-4e26-a8b7-11023ca8f5e3 req-e16f6071-dd15-455a-89cb-37b594bed26a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Received event network-vif-plugged-5aa77a60-fcf0-426b-b9e5-2ff2957856e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:25:02 compute-1 nova_compute[183083]: 2026-01-26 09:25:02.732 183087 DEBUG oslo_concurrency.lockutils [req-aa45e1dc-5e90-4e26-a8b7-11023ca8f5e3 req-e16f6071-dd15-455a-89cb-37b594bed26a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "8bd1b013-4c5d-417a-ac90-fee845c3d159-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:25:02 compute-1 nova_compute[183083]: 2026-01-26 09:25:02.733 183087 DEBUG oslo_concurrency.lockutils [req-aa45e1dc-5e90-4e26-a8b7-11023ca8f5e3 req-e16f6071-dd15-455a-89cb-37b594bed26a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "8bd1b013-4c5d-417a-ac90-fee845c3d159-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:25:02 compute-1 nova_compute[183083]: 2026-01-26 09:25:02.733 183087 DEBUG oslo_concurrency.lockutils [req-aa45e1dc-5e90-4e26-a8b7-11023ca8f5e3 req-e16f6071-dd15-455a-89cb-37b594bed26a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Lock "8bd1b013-4c5d-417a-ac90-fee845c3d159-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:25:02 compute-1 nova_compute[183083]: 2026-01-26 09:25:02.733 183087 DEBUG nova.compute.manager [req-aa45e1dc-5e90-4e26-a8b7-11023ca8f5e3 req-e16f6071-dd15-455a-89cb-37b594bed26a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] No waiting events found dispatching network-vif-plugged-5aa77a60-fcf0-426b-b9e5-2ff2957856e8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 09:25:02 compute-1 nova_compute[183083]: 2026-01-26 09:25:02.733 183087 WARNING nova.compute.manager [req-aa45e1dc-5e90-4e26-a8b7-11023ca8f5e3 req-e16f6071-dd15-455a-89cb-37b594bed26a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Received unexpected event network-vif-plugged-5aa77a60-fcf0-426b-b9e5-2ff2957856e8 for instance with vm_state deleted and task_state None.
Jan 26 09:25:02 compute-1 nova_compute[183083]: 2026-01-26 09:25:02.733 183087 DEBUG nova.compute.manager [req-aa45e1dc-5e90-4e26-a8b7-11023ca8f5e3 req-e16f6071-dd15-455a-89cb-37b594bed26a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Received event network-vif-deleted-5aa77a60-fcf0-426b-b9e5-2ff2957856e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:25:04 compute-1 nova_compute[183083]: 2026-01-26 09:25:04.515 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:25:04 compute-1 nova_compute[183083]: 2026-01-26 09:25:04.557 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:25:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:25:05.343 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:25:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:25:05.344 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:25:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:25:05.344 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:25:08 compute-1 ovn_controller[95352]: 2026-01-26T09:25:08Z|00380|pinctrl|WARN|Dropped 565 log messages in last 64 seconds (most recently, 7 seconds ago) due to excessive rate
Jan 26 09:25:08 compute-1 ovn_controller[95352]: 2026-01-26T09:25:08Z|00381|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:25:09 compute-1 nova_compute[183083]: 2026-01-26 09:25:09.519 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:25:09 compute-1 nova_compute[183083]: 2026-01-26 09:25:09.559 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:25:11 compute-1 podman[231017]: 2026-01-26 09:25:11.813405547 +0000 UTC m=+0.074011561 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 26 09:25:13 compute-1 sshd-session[231041]: Connection closed by authenticating user root 178.62.249.31 port 39510 [preauth]
Jan 26 09:25:13 compute-1 nova_compute[183083]: 2026-01-26 09:25:13.585 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:25:13 compute-1 nova_compute[183083]: 2026-01-26 09:25:13.657 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:25:14 compute-1 nova_compute[183083]: 2026-01-26 09:25:14.530 183087 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769419499.5285132, 8bd1b013-4c5d-417a-ac90-fee845c3d159 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 09:25:14 compute-1 nova_compute[183083]: 2026-01-26 09:25:14.530 183087 INFO nova.compute.manager [-] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] VM Stopped (Lifecycle Event)
Jan 26 09:25:14 compute-1 nova_compute[183083]: 2026-01-26 09:25:14.571 183087 DEBUG nova.compute.manager [None req-ed7537aa-7207-4118-bccb-dac62ec77224 - - - - - -] [instance: 8bd1b013-4c5d-417a-ac90-fee845c3d159] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 09:25:14 compute-1 nova_compute[183083]: 2026-01-26 09:25:14.572 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:25:17 compute-1 nova_compute[183083]: 2026-01-26 09:25:17.526 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:25:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:25:17.526 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=43, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=42) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:25:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:25:17.528 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:25:19 compute-1 nova_compute[183083]: 2026-01-26 09:25:19.631 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:25:24 compute-1 nova_compute[183083]: 2026-01-26 09:25:24.633 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:25:25 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:25:25.531 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '43'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:25:28 compute-1 podman[231047]: 2026-01-26 09:25:28.810997774 +0000 UTC m=+0.063153425 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 26 09:25:28 compute-1 podman[231046]: 2026-01-26 09:25:28.8222036 +0000 UTC m=+0.077279904 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2025-08-20T13:12:41, name=ubi9-minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., 
managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, config_id=openstack_network_exporter, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, vcs-type=git)
Jan 26 09:25:28 compute-1 podman[231053]: 2026-01-26 09:25:28.82538405 +0000 UTC m=+0.071770168 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 09:25:28 compute-1 podman[231045]: 2026-01-26 09:25:28.851127527 +0000 UTC m=+0.102961339 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 26 09:25:28 compute-1 podman[231044]: 2026-01-26 09:25:28.885968101 +0000 UTC m=+0.148936448 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 09:25:29 compute-1 nova_compute[183083]: 2026-01-26 09:25:29.635 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:25:29 compute-1 nova_compute[183083]: 2026-01-26 09:25:29.637 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:25:29 compute-1 nova_compute[183083]: 2026-01-26 09:25:29.637 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:25:29 compute-1 nova_compute[183083]: 2026-01-26 09:25:29.637 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:25:29 compute-1 nova_compute[183083]: 2026-01-26 09:25:29.669 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:25:29 compute-1 nova_compute[183083]: 2026-01-26 09:25:29.670 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:25:31 compute-1 nova_compute[183083]: 2026-01-26 09:25:31.737 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:25:31 compute-1 nova_compute[183083]: 2026-01-26 09:25:31.737 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:25:31 compute-1 nova_compute[183083]: 2026-01-26 09:25:31.738 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:25:31 compute-1 nova_compute[183083]: 2026-01-26 09:25:31.752 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 09:25:33 compute-1 nova_compute[183083]: 2026-01-26 09:25:33.961 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:25:34 compute-1 nova_compute[183083]: 2026-01-26 09:25:34.672 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:25:34 compute-1 nova_compute[183083]: 2026-01-26 09:25:34.674 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:25:34 compute-1 nova_compute[183083]: 2026-01-26 09:25:34.674 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:25:34 compute-1 nova_compute[183083]: 2026-01-26 09:25:34.674 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:25:34 compute-1 nova_compute[183083]: 2026-01-26 09:25:34.719 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:25:34 compute-1 nova_compute[183083]: 2026-01-26 09:25:34.720 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:25:34 compute-1 nova_compute[183083]: 2026-01-26 09:25:34.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:25:37 compute-1 nova_compute[183083]: 2026-01-26 09:25:37.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:25:37 compute-1 nova_compute[183083]: 2026-01-26 09:25:37.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:25:38 compute-1 nova_compute[183083]: 2026-01-26 09:25:38.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:25:39 compute-1 nova_compute[183083]: 2026-01-26 09:25:39.720 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:25:39 compute-1 sshd-session[231151]: Accepted publickey for zuul from 38.102.83.66 port 47606 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:25:39 compute-1 systemd-logind[788]: New session 143 of user zuul.
Jan 26 09:25:39 compute-1 systemd[1]: Started Session 143 of User zuul.
Jan 26 09:25:39 compute-1 sshd-session[231151]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:25:39 compute-1 nova_compute[183083]: 2026-01-26 09:25:39.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:25:40 compute-1 sshd-session[231154]: Connection closed by 38.102.83.66 port 47606
Jan 26 09:25:40 compute-1 sshd-session[231151]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:25:40 compute-1 systemd[1]: session-143.scope: Deactivated successfully.
Jan 26 09:25:40 compute-1 systemd-logind[788]: Session 143 logged out. Waiting for processes to exit.
Jan 26 09:25:40 compute-1 systemd-logind[788]: Removed session 143.
Jan 26 09:25:42 compute-1 podman[231178]: 2026-01-26 09:25:42.812956156 +0000 UTC m=+0.070049300 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 26 09:25:42 compute-1 nova_compute[183083]: 2026-01-26 09:25:42.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:25:42 compute-1 nova_compute[183083]: 2026-01-26 09:25:42.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:25:43 compute-1 nova_compute[183083]: 2026-01-26 09:25:43.947 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:25:44 compute-1 ovn_controller[95352]: 2026-01-26T09:25:44Z|00382|memory_trim|INFO|Detected inactivity (last active 30016 ms ago): trimming memory
Jan 26 09:25:44 compute-1 nova_compute[183083]: 2026-01-26 09:25:44.722 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:25:44 compute-1 nova_compute[183083]: 2026-01-26 09:25:44.724 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:25:44 compute-1 nova_compute[183083]: 2026-01-26 09:25:44.724 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:25:44 compute-1 nova_compute[183083]: 2026-01-26 09:25:44.724 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:25:44 compute-1 nova_compute[183083]: 2026-01-26 09:25:44.763 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:25:44 compute-1 nova_compute[183083]: 2026-01-26 09:25:44.764 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:25:49 compute-1 nova_compute[183083]: 2026-01-26 09:25:49.765 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:25:49 compute-1 nova_compute[183083]: 2026-01-26 09:25:49.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:25:49 compute-1 nova_compute[183083]: 2026-01-26 09:25:49.979 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:25:49 compute-1 nova_compute[183083]: 2026-01-26 09:25:49.980 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:25:49 compute-1 nova_compute[183083]: 2026-01-26 09:25:49.980 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:25:49 compute-1 nova_compute[183083]: 2026-01-26 09:25:49.980 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:25:50 compute-1 nova_compute[183083]: 2026-01-26 09:25:50.182 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:25:50 compute-1 nova_compute[183083]: 2026-01-26 09:25:50.183 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13620MB free_disk=113.0837287902832GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:25:50 compute-1 nova_compute[183083]: 2026-01-26 09:25:50.183 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:25:50 compute-1 nova_compute[183083]: 2026-01-26 09:25:50.183 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:25:50 compute-1 nova_compute[183083]: 2026-01-26 09:25:50.247 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:25:50 compute-1 nova_compute[183083]: 2026-01-26 09:25:50.248 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:25:50 compute-1 nova_compute[183083]: 2026-01-26 09:25:50.340 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Refreshing inventories for resource provider 5203935e-446c-4e03-93fa-4c60d651e045 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 26 09:25:50 compute-1 nova_compute[183083]: 2026-01-26 09:25:50.356 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Updating ProviderTree inventory for provider 5203935e-446c-4e03-93fa-4c60d651e045 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 26 09:25:50 compute-1 nova_compute[183083]: 2026-01-26 09:25:50.356 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Updating inventory in ProviderTree for provider 5203935e-446c-4e03-93fa-4c60d651e045 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 26 09:25:50 compute-1 nova_compute[183083]: 2026-01-26 09:25:50.371 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Refreshing aggregate associations for resource provider 5203935e-446c-4e03-93fa-4c60d651e045, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 26 09:25:50 compute-1 nova_compute[183083]: 2026-01-26 09:25:50.395 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Refreshing trait associations for resource provider 5203935e-446c-4e03-93fa-4c60d651e045, traits: COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE4A,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SVM,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AVX,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX2,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_FMA3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,HW_CPU_X86_SSE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE41,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_BMI,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AESNI,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 26 09:25:50 compute-1 nova_compute[183083]: 2026-01-26 09:25:50.418 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:25:50 compute-1 nova_compute[183083]: 2026-01-26 09:25:50.435 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:25:50 compute-1 nova_compute[183083]: 2026-01-26 09:25:50.455 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:25:50 compute-1 nova_compute[183083]: 2026-01-26 09:25:50.455 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.272s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:25:54 compute-1 nova_compute[183083]: 2026-01-26 09:25:54.767 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:25:59 compute-1 sshd-session[231205]: Invalid user sol from 2.57.122.238 port 47162
Jan 26 09:25:59 compute-1 podman[231214]: 2026-01-26 09:25:59.282635069 +0000 UTC m=+0.063285749 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 09:25:59 compute-1 podman[231211]: 2026-01-26 09:25:59.283665798 +0000 UTC m=+0.074080994 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, version=9.6, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 26 09:25:59 compute-1 podman[231209]: 2026-01-26 09:25:59.29966321 +0000 UTC m=+0.096082295 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 09:25:59 compute-1 podman[231210]: 2026-01-26 09:25:59.306619607 +0000 UTC m=+0.101144789 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 26 09:25:59 compute-1 podman[231212]: 2026-01-26 09:25:59.307188793 +0000 UTC m=+0.094176822 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 26 09:25:59 compute-1 sshd-session[231205]: Connection closed by invalid user sol 2.57.122.238 port 47162 [preauth]
Jan 26 09:25:59 compute-1 sshd-session[231207]: Connection closed by authenticating user root 178.62.249.31 port 51252 [preauth]
Jan 26 09:25:59 compute-1 nova_compute[183083]: 2026-01-26 09:25:59.768 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:25:59 compute-1 nova_compute[183083]: 2026-01-26 09:25:59.769 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:25:59 compute-1 nova_compute[183083]: 2026-01-26 09:25:59.769 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:25:59 compute-1 nova_compute[183083]: 2026-01-26 09:25:59.769 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:25:59 compute-1 nova_compute[183083]: 2026-01-26 09:25:59.769 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:25:59 compute-1 nova_compute[183083]: 2026-01-26 09:25:59.770 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:26:03 compute-1 ovn_controller[95352]: 2026-01-26T09:26:03Z|00383|pinctrl|WARN|Dropped 243 log messages in last 55 seconds (most recently, 3 seconds ago) due to excessive rate
Jan 26 09:26:03 compute-1 ovn_controller[95352]: 2026-01-26T09:26:03Z|00384|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:26:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:26:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:26:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:26:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:26:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:26:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:26:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:26:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:26:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:26:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:26:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:26:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:26:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:26:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:26:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:26:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:26:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:26:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:26:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:26:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:26:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:26:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:26:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:26:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:26:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:26:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:26:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:26:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:26:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:26:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:26:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:26:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:26:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:26:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:26:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:26:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:26:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:26:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:26:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:26:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:26:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:26:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:26:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:26:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:26:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:26:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:26:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:26:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:26:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:26:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:26:04 compute-1 nova_compute[183083]: 2026-01-26 09:26:04.772 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:26:04 compute-1 nova_compute[183083]: 2026-01-26 09:26:04.774 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:26:04 compute-1 nova_compute[183083]: 2026-01-26 09:26:04.774 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:26:04 compute-1 nova_compute[183083]: 2026-01-26 09:26:04.774 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:26:04 compute-1 nova_compute[183083]: 2026-01-26 09:26:04.830 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:26:04 compute-1 nova_compute[183083]: 2026-01-26 09:26:04.831 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:26:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:26:05.345 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:26:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:26:05.345 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:26:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:26:05.345 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:26:09 compute-1 nova_compute[183083]: 2026-01-26 09:26:09.832 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:26:09 compute-1 nova_compute[183083]: 2026-01-26 09:26:09.833 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:26:13 compute-1 podman[231310]: 2026-01-26 09:26:13.838784359 +0000 UTC m=+0.101870229 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 26 09:26:14 compute-1 nova_compute[183083]: 2026-01-26 09:26:14.583 183087 DEBUG oslo_concurrency.lockutils [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Acquiring lock "dfe86286-741f-4a72-9abb-35bbc3454093" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:26:14 compute-1 nova_compute[183083]: 2026-01-26 09:26:14.583 183087 DEBUG oslo_concurrency.lockutils [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Lock "dfe86286-741f-4a72-9abb-35bbc3454093" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:26:14 compute-1 nova_compute[183083]: 2026-01-26 09:26:14.599 183087 DEBUG nova.compute.manager [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 09:26:14 compute-1 nova_compute[183083]: 2026-01-26 09:26:14.678 183087 DEBUG oslo_concurrency.lockutils [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:26:14 compute-1 nova_compute[183083]: 2026-01-26 09:26:14.678 183087 DEBUG oslo_concurrency.lockutils [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:26:14 compute-1 nova_compute[183083]: 2026-01-26 09:26:14.696 183087 DEBUG nova.virt.hardware [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 09:26:14 compute-1 nova_compute[183083]: 2026-01-26 09:26:14.696 183087 INFO nova.compute.claims [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Claim successful on node compute-1.ctlplane.example.com
Jan 26 09:26:14 compute-1 nova_compute[183083]: 2026-01-26 09:26:14.803 183087 DEBUG nova.compute.provider_tree [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:26:14 compute-1 nova_compute[183083]: 2026-01-26 09:26:14.816 183087 DEBUG nova.scheduler.client.report [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:26:14 compute-1 nova_compute[183083]: 2026-01-26 09:26:14.834 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:26:14 compute-1 nova_compute[183083]: 2026-01-26 09:26:14.836 183087 DEBUG oslo_concurrency.lockutils [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:26:14 compute-1 nova_compute[183083]: 2026-01-26 09:26:14.837 183087 DEBUG nova.compute.manager [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 09:26:14 compute-1 nova_compute[183083]: 2026-01-26 09:26:14.840 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:26:14 compute-1 nova_compute[183083]: 2026-01-26 09:26:14.840 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:26:14 compute-1 nova_compute[183083]: 2026-01-26 09:26:14.840 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:26:14 compute-1 nova_compute[183083]: 2026-01-26 09:26:14.878 183087 DEBUG nova.compute.manager [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 09:26:14 compute-1 nova_compute[183083]: 2026-01-26 09:26:14.878 183087 DEBUG nova.network.neutron [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 09:26:14 compute-1 nova_compute[183083]: 2026-01-26 09:26:14.889 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:26:14 compute-1 nova_compute[183083]: 2026-01-26 09:26:14.889 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:26:14 compute-1 nova_compute[183083]: 2026-01-26 09:26:14.896 183087 INFO nova.virt.libvirt.driver [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 09:26:14 compute-1 nova_compute[183083]: 2026-01-26 09:26:14.910 183087 DEBUG nova.compute.manager [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 09:26:14 compute-1 nova_compute[183083]: 2026-01-26 09:26:14.993 183087 DEBUG nova.compute.manager [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 09:26:14 compute-1 nova_compute[183083]: 2026-01-26 09:26:14.994 183087 DEBUG nova.virt.libvirt.driver [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 09:26:14 compute-1 nova_compute[183083]: 2026-01-26 09:26:14.995 183087 INFO nova.virt.libvirt.driver [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Creating image(s)
Jan 26 09:26:14 compute-1 nova_compute[183083]: 2026-01-26 09:26:14.995 183087 DEBUG oslo_concurrency.lockutils [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Acquiring lock "/var/lib/nova/instances/dfe86286-741f-4a72-9abb-35bbc3454093/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:26:14 compute-1 nova_compute[183083]: 2026-01-26 09:26:14.996 183087 DEBUG oslo_concurrency.lockutils [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Lock "/var/lib/nova/instances/dfe86286-741f-4a72-9abb-35bbc3454093/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:26:14 compute-1 nova_compute[183083]: 2026-01-26 09:26:14.996 183087 DEBUG oslo_concurrency.lockutils [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Lock "/var/lib/nova/instances/dfe86286-741f-4a72-9abb-35bbc3454093/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:26:14 compute-1 nova_compute[183083]: 2026-01-26 09:26:14.997 183087 DEBUG oslo_concurrency.lockutils [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:26:14 compute-1 nova_compute[183083]: 2026-01-26 09:26:14.997 183087 DEBUG oslo_concurrency.lockutils [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.536 183087 DEBUG nova.policy [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1a420e5f1acf4fc69a9a572e5dfb0a21', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '28a4e49cff08498b810bd1f2903b2946', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 DEBUG oslo_concurrency.lockutils [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.897s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Instance failed to spawn: nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Traceback (most recent call last):
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 170, in do_image_deep_inspection
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093]     raise exception.ImageUnacceptable(
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image content does not match disk_format
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093] 
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093] During handling of the above exception, another exception occurred:
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093] 
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Traceback (most recent call last):
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093]     yield resources
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093]     self.driver.spawn(context, instance, image_meta,
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4384, in spawn
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093]     created_instance_dir, created_disks = self._create_image(
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4786, in _create_image
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093]     created_disks = self._create_and_inject_local_root(
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4917, in _create_and_inject_local_root
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093]     self._try_fetch_image_cache(backend, fetch_func, context,
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 10963, in _try_fetch_image_cache
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093]     image.cache(fetch_func=fetch_func,
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 288, in cache
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093]     self.create_image(fetch_func_sync, base, size,
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 663, in create_image
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093]     prepare_template(target=base, *args, **kwargs)
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093]   File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093]     return f(*args, **kwargs)
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 285, in fetch_func_sync
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093]     fetch_func(target=target, *args, **kwargs)
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/utils.py", line 482, in fetch_image
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093]     images.fetch_to_raw(context, image_id, target, trusted_certs)
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 199, in fetch_to_raw
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093]     force_format = do_image_deep_inspection(img, image_href, path_tmp)
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 180, in do_image_deep_inspection
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093]     raise exception.ImageUnacceptable(
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 09:26:15 compute-1 nova_compute[183083]: 2026-01-26 09:26:15.895 183087 ERROR nova.compute.manager [instance: dfe86286-741f-4a72-9abb-35bbc3454093] 
Jan 26 09:26:16 compute-1 nova_compute[183083]: 2026-01-26 09:26:16.754 183087 DEBUG nova.network.neutron [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Successfully created port: dce1ac8b-2ad6-456f-87a5-5a9c73c7420c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 09:26:17 compute-1 nova_compute[183083]: 2026-01-26 09:26:17.501 183087 DEBUG nova.network.neutron [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Successfully updated port: dce1ac8b-2ad6-456f-87a5-5a9c73c7420c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 09:26:17 compute-1 nova_compute[183083]: 2026-01-26 09:26:17.514 183087 DEBUG oslo_concurrency.lockutils [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Acquiring lock "refresh_cache-dfe86286-741f-4a72-9abb-35bbc3454093" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:26:17 compute-1 nova_compute[183083]: 2026-01-26 09:26:17.514 183087 DEBUG oslo_concurrency.lockutils [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Acquired lock "refresh_cache-dfe86286-741f-4a72-9abb-35bbc3454093" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:26:17 compute-1 nova_compute[183083]: 2026-01-26 09:26:17.514 183087 DEBUG nova.network.neutron [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 09:26:17 compute-1 nova_compute[183083]: 2026-01-26 09:26:17.618 183087 DEBUG nova.compute.manager [req-4ed60542-df17-4f38-a761-3761dce0a37a req-aea95f75-1283-405c-b27b-b1f84577235a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Received event network-changed-dce1ac8b-2ad6-456f-87a5-5a9c73c7420c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:26:17 compute-1 nova_compute[183083]: 2026-01-26 09:26:17.618 183087 DEBUG nova.compute.manager [req-4ed60542-df17-4f38-a761-3761dce0a37a req-aea95f75-1283-405c-b27b-b1f84577235a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Refreshing instance network info cache due to event network-changed-dce1ac8b-2ad6-456f-87a5-5a9c73c7420c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 09:26:17 compute-1 nova_compute[183083]: 2026-01-26 09:26:17.619 183087 DEBUG oslo_concurrency.lockutils [req-4ed60542-df17-4f38-a761-3761dce0a37a req-aea95f75-1283-405c-b27b-b1f84577235a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-dfe86286-741f-4a72-9abb-35bbc3454093" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:26:17 compute-1 nova_compute[183083]: 2026-01-26 09:26:17.674 183087 DEBUG nova.network.neutron [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 09:26:18 compute-1 nova_compute[183083]: 2026-01-26 09:26:18.682 183087 DEBUG nova.network.neutron [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Updating instance_info_cache with network_info: [{"id": "dce1ac8b-2ad6-456f-87a5-5a9c73c7420c", "address": "fa:16:3e:c8:b1:8b", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.177", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdce1ac8b-2a", "ovs_interfaceid": "dce1ac8b-2ad6-456f-87a5-5a9c73c7420c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:26:18 compute-1 nova_compute[183083]: 2026-01-26 09:26:18.712 183087 DEBUG oslo_concurrency.lockutils [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Releasing lock "refresh_cache-dfe86286-741f-4a72-9abb-35bbc3454093" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:26:18 compute-1 nova_compute[183083]: 2026-01-26 09:26:18.712 183087 DEBUG nova.compute.manager [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Instance network_info: |[{"id": "dce1ac8b-2ad6-456f-87a5-5a9c73c7420c", "address": "fa:16:3e:c8:b1:8b", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.177", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdce1ac8b-2a", "ovs_interfaceid": "dce1ac8b-2ad6-456f-87a5-5a9c73c7420c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 09:26:18 compute-1 nova_compute[183083]: 2026-01-26 09:26:18.713 183087 DEBUG oslo_concurrency.lockutils [req-4ed60542-df17-4f38-a761-3761dce0a37a req-aea95f75-1283-405c-b27b-b1f84577235a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-dfe86286-741f-4a72-9abb-35bbc3454093" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:26:18 compute-1 nova_compute[183083]: 2026-01-26 09:26:18.713 183087 DEBUG nova.network.neutron [req-4ed60542-df17-4f38-a761-3761dce0a37a req-aea95f75-1283-405c-b27b-b1f84577235a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Refreshing network info cache for port dce1ac8b-2ad6-456f-87a5-5a9c73c7420c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 09:26:18 compute-1 nova_compute[183083]: 2026-01-26 09:26:18.714 183087 INFO nova.compute.manager [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Terminating instance
Jan 26 09:26:18 compute-1 nova_compute[183083]: 2026-01-26 09:26:18.716 183087 DEBUG nova.compute.manager [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 09:26:18 compute-1 nova_compute[183083]: 2026-01-26 09:26:18.721 183087 DEBUG nova.virt.libvirt.driver [-] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527
Jan 26 09:26:18 compute-1 nova_compute[183083]: 2026-01-26 09:26:18.722 183087 INFO nova.virt.libvirt.driver [-] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Instance destroyed successfully.
Jan 26 09:26:18 compute-1 nova_compute[183083]: 2026-01-26 09:26:18.723 183087 DEBUG nova.virt.libvirt.vif [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T09:26:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_dscp_marking_external_network-954045707',display_name='tempest-test_dscp_marking_external_network-954045707',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-dscp-marking-external-network-954045707',id=63,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHTnrywdQRERVrs/dmHTEpw5EEhKGncubXffvqoRqz1AvwjeIxkDVy5XUi6xE+e04GsW/1jy7/I4SIb+K6npW5RUNJSHtJQNKzd4jiDcVouw0QfYx0NnKq6j4e2NuVuzoQ==',key_name='tempest-keypair-test-898140873',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='28a4e49cff08498b810bd1f2903b2946',ramdisk_id='',reservation_id='r-666frx6i',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestCommon-201278879',owner_user_name='tempest-QosTestCommon-201278879-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T09:26:14Z,user_data=None,user_id='1a420e5f1acf4fc69a9a572e5dfb0a21',uuid=dfe86286-741f-4a72-9abb-35bbc3454093,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dce1ac8b-2ad6-456f-87a5-5a9c73c7420c", "address": "fa:16:3e:c8:b1:8b", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.177", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdce1ac8b-2a", "ovs_interfaceid": "dce1ac8b-2ad6-456f-87a5-5a9c73c7420c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 09:26:18 compute-1 nova_compute[183083]: 2026-01-26 09:26:18.723 183087 DEBUG nova.network.os_vif_util [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Converting VIF {"id": "dce1ac8b-2ad6-456f-87a5-5a9c73c7420c", "address": "fa:16:3e:c8:b1:8b", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.177", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdce1ac8b-2a", "ovs_interfaceid": "dce1ac8b-2ad6-456f-87a5-5a9c73c7420c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 09:26:18 compute-1 nova_compute[183083]: 2026-01-26 09:26:18.724 183087 DEBUG nova.network.os_vif_util [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c8:b1:8b,bridge_name='br-int',has_traffic_filtering=True,id=dce1ac8b-2ad6-456f-87a5-5a9c73c7420c,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdce1ac8b-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 09:26:18 compute-1 nova_compute[183083]: 2026-01-26 09:26:18.725 183087 DEBUG os_vif [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:b1:8b,bridge_name='br-int',has_traffic_filtering=True,id=dce1ac8b-2ad6-456f-87a5-5a9c73c7420c,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdce1ac8b-2a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 09:26:18 compute-1 nova_compute[183083]: 2026-01-26 09:26:18.727 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:26:18 compute-1 nova_compute[183083]: 2026-01-26 09:26:18.727 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdce1ac8b-2a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:26:18 compute-1 nova_compute[183083]: 2026-01-26 09:26:18.728 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 09:26:18 compute-1 nova_compute[183083]: 2026-01-26 09:26:18.730 183087 INFO os_vif [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:b1:8b,bridge_name='br-int',has_traffic_filtering=True,id=dce1ac8b-2ad6-456f-87a5-5a9c73c7420c,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdce1ac8b-2a')
Jan 26 09:26:18 compute-1 nova_compute[183083]: 2026-01-26 09:26:18.731 183087 INFO nova.virt.libvirt.driver [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Deleting instance files /var/lib/nova/instances/dfe86286-741f-4a72-9abb-35bbc3454093_del
Jan 26 09:26:18 compute-1 nova_compute[183083]: 2026-01-26 09:26:18.731 183087 INFO nova.virt.libvirt.driver [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Deletion of /var/lib/nova/instances/dfe86286-741f-4a72-9abb-35bbc3454093_del complete
Jan 26 09:26:18 compute-1 nova_compute[183083]: 2026-01-26 09:26:18.887 183087 INFO nova.compute.manager [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Took 0.17 seconds to destroy the instance on the hypervisor.
Jan 26 09:26:18 compute-1 nova_compute[183083]: 2026-01-26 09:26:18.889 183087 DEBUG nova.compute.claims [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Aborting claim: <nova.compute.claims.Claim object at 0x7f6cb87186a0> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85
Jan 26 09:26:18 compute-1 nova_compute[183083]: 2026-01-26 09:26:18.890 183087 DEBUG oslo_concurrency.lockutils [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:26:18 compute-1 nova_compute[183083]: 2026-01-26 09:26:18.891 183087 DEBUG oslo_concurrency.lockutils [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:26:19 compute-1 nova_compute[183083]: 2026-01-26 09:26:19.014 183087 DEBUG nova.compute.provider_tree [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:26:19 compute-1 nova_compute[183083]: 2026-01-26 09:26:19.037 183087 DEBUG nova.scheduler.client.report [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:26:19 compute-1 nova_compute[183083]: 2026-01-26 09:26:19.070 183087 DEBUG oslo_concurrency.lockutils [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:26:19 compute-1 nova_compute[183083]: 2026-01-26 09:26:19.071 183087 DEBUG nova.compute.utils [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430
Jan 26 09:26:19 compute-1 nova_compute[183083]: 2026-01-26 09:26:19.072 183087 ERROR nova.compute.manager [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Build of instance dfe86286-741f-4a72-9abb-35bbc3454093 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format: nova.exception.BuildAbortException: Build of instance dfe86286-741f-4a72-9abb-35bbc3454093 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 09:26:19 compute-1 nova_compute[183083]: 2026-01-26 09:26:19.073 183087 DEBUG nova.compute.manager [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976
Jan 26 09:26:19 compute-1 nova_compute[183083]: 2026-01-26 09:26:19.074 183087 DEBUG nova.virt.libvirt.vif [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-26T09:26:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_dscp_marking_external_network-954045707',display_name='tempest-test_dscp_marking_external_network-954045707',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host=None,hostname='tempest-test-dscp-marking-external-network-954045707',id=63,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHTnrywdQRERVrs/dmHTEpw5EEhKGncubXffvqoRqz1AvwjeIxkDVy5XUi6xE+e04GsW/1jy7/I4SIb+K6npW5RUNJSHtJQNKzd4jiDcVouw0QfYx0NnKq6j4e2NuVuzoQ==',key_name='tempest-keypair-test-898140873',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node=None,numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='28a4e49cff08498b810bd1f2903b2946',ramdisk_id='',reservation_id='r-666frx6i',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestCommon-201278879',owner_user_name='tempest-QosTestCommon-201278879-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T09:26:18Z,user_data=None,user_id='1a420e5f1acf4fc69a9a572e5dfb0a21',uuid=dfe86286-741f-4a72-9abb-35bbc3454093,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dce1ac8b-2ad6-456f-87a5-5a9c73c7420c", "address": "fa:16:3e:c8:b1:8b", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.177", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdce1ac8b-2a", "ovs_interfaceid": "dce1ac8b-2ad6-456f-87a5-5a9c73c7420c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 09:26:19 compute-1 nova_compute[183083]: 2026-01-26 09:26:19.074 183087 DEBUG nova.network.os_vif_util [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Converting VIF {"id": "dce1ac8b-2ad6-456f-87a5-5a9c73c7420c", "address": "fa:16:3e:c8:b1:8b", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.177", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdce1ac8b-2a", "ovs_interfaceid": "dce1ac8b-2ad6-456f-87a5-5a9c73c7420c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 09:26:19 compute-1 nova_compute[183083]: 2026-01-26 09:26:19.075 183087 DEBUG nova.network.os_vif_util [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c8:b1:8b,bridge_name='br-int',has_traffic_filtering=True,id=dce1ac8b-2ad6-456f-87a5-5a9c73c7420c,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdce1ac8b-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 09:26:19 compute-1 nova_compute[183083]: 2026-01-26 09:26:19.076 183087 DEBUG os_vif [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:b1:8b,bridge_name='br-int',has_traffic_filtering=True,id=dce1ac8b-2ad6-456f-87a5-5a9c73c7420c,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdce1ac8b-2a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 09:26:19 compute-1 nova_compute[183083]: 2026-01-26 09:26:19.077 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:26:19 compute-1 nova_compute[183083]: 2026-01-26 09:26:19.078 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdce1ac8b-2a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:26:19 compute-1 nova_compute[183083]: 2026-01-26 09:26:19.078 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 09:26:19 compute-1 nova_compute[183083]: 2026-01-26 09:26:19.081 183087 INFO os_vif [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:b1:8b,bridge_name='br-int',has_traffic_filtering=True,id=dce1ac8b-2ad6-456f-87a5-5a9c73c7420c,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdce1ac8b-2a')
Jan 26 09:26:19 compute-1 nova_compute[183083]: 2026-01-26 09:26:19.082 183087 DEBUG nova.compute.manager [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012
Jan 26 09:26:19 compute-1 nova_compute[183083]: 2026-01-26 09:26:19.082 183087 DEBUG nova.compute.manager [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 09:26:19 compute-1 nova_compute[183083]: 2026-01-26 09:26:19.082 183087 DEBUG nova.network.neutron [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 09:26:19 compute-1 nova_compute[183083]: 2026-01-26 09:26:19.890 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 810-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:26:19 compute-1 nova_compute[183083]: 2026-01-26 09:26:19.891 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:26:19 compute-1 nova_compute[183083]: 2026-01-26 09:26:19.892 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:26:19 compute-1 nova_compute[183083]: 2026-01-26 09:26:19.892 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:26:19 compute-1 nova_compute[183083]: 2026-01-26 09:26:19.938 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:26:19 compute-1 nova_compute[183083]: 2026-01-26 09:26:19.939 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:26:20 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:26:20.219 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=44, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=43) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:26:20 compute-1 nova_compute[183083]: 2026-01-26 09:26:20.219 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:26:20 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:26:20.222 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:26:20 compute-1 nova_compute[183083]: 2026-01-26 09:26:20.250 183087 DEBUG nova.network.neutron [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:26:20 compute-1 nova_compute[183083]: 2026-01-26 09:26:20.268 183087 DEBUG nova.network.neutron [req-4ed60542-df17-4f38-a761-3761dce0a37a req-aea95f75-1283-405c-b27b-b1f84577235a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Updated VIF entry in instance network info cache for port dce1ac8b-2ad6-456f-87a5-5a9c73c7420c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 09:26:20 compute-1 nova_compute[183083]: 2026-01-26 09:26:20.269 183087 DEBUG nova.network.neutron [req-4ed60542-df17-4f38-a761-3761dce0a37a req-aea95f75-1283-405c-b27b-b1f84577235a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Updating instance_info_cache with network_info: [{"id": "dce1ac8b-2ad6-456f-87a5-5a9c73c7420c", "address": "fa:16:3e:c8:b1:8b", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.177", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdce1ac8b-2a", "ovs_interfaceid": "dce1ac8b-2ad6-456f-87a5-5a9c73c7420c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:26:20 compute-1 nova_compute[183083]: 2026-01-26 09:26:20.270 183087 INFO nova.compute.manager [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: dfe86286-741f-4a72-9abb-35bbc3454093] Took 1.19 seconds to deallocate network for instance.
Jan 26 09:26:20 compute-1 nova_compute[183083]: 2026-01-26 09:26:20.305 183087 DEBUG oslo_concurrency.lockutils [req-4ed60542-df17-4f38-a761-3761dce0a37a req-aea95f75-1283-405c-b27b-b1f84577235a 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-dfe86286-741f-4a72-9abb-35bbc3454093" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:26:20 compute-1 nova_compute[183083]: 2026-01-26 09:26:20.679 183087 INFO nova.scheduler.client.report [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Deleted allocations for instance dfe86286-741f-4a72-9abb-35bbc3454093
Jan 26 09:26:20 compute-1 nova_compute[183083]: 2026-01-26 09:26:20.680 183087 DEBUG oslo_concurrency.lockutils [None req-84262e67-54b5-403e-baa5-db70b6cf7de2 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Lock "dfe86286-741f-4a72-9abb-35bbc3454093" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:26:23 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:26:23.226 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '44'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:26:24 compute-1 nova_compute[183083]: 2026-01-26 09:26:24.941 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:26:27 compute-1 nova_compute[183083]: 2026-01-26 09:26:27.540 183087 DEBUG oslo_concurrency.lockutils [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Acquiring lock "73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:26:27 compute-1 nova_compute[183083]: 2026-01-26 09:26:27.541 183087 DEBUG oslo_concurrency.lockutils [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Lock "73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:26:27 compute-1 nova_compute[183083]: 2026-01-26 09:26:27.615 183087 DEBUG nova.compute.manager [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 09:26:27 compute-1 nova_compute[183083]: 2026-01-26 09:26:27.906 183087 DEBUG oslo_concurrency.lockutils [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:26:27 compute-1 nova_compute[183083]: 2026-01-26 09:26:27.906 183087 DEBUG oslo_concurrency.lockutils [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:26:27 compute-1 nova_compute[183083]: 2026-01-26 09:26:27.915 183087 DEBUG nova.virt.hardware [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 09:26:27 compute-1 nova_compute[183083]: 2026-01-26 09:26:27.915 183087 INFO nova.compute.claims [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Claim successful on node compute-1.ctlplane.example.com
Jan 26 09:26:28 compute-1 nova_compute[183083]: 2026-01-26 09:26:28.177 183087 DEBUG nova.compute.provider_tree [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:26:28 compute-1 nova_compute[183083]: 2026-01-26 09:26:28.191 183087 DEBUG nova.scheduler.client.report [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:26:28 compute-1 nova_compute[183083]: 2026-01-26 09:26:28.209 183087 DEBUG oslo_concurrency.lockutils [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.303s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:26:28 compute-1 nova_compute[183083]: 2026-01-26 09:26:28.211 183087 DEBUG nova.compute.manager [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 09:26:28 compute-1 nova_compute[183083]: 2026-01-26 09:26:28.252 183087 DEBUG nova.compute.manager [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 09:26:28 compute-1 nova_compute[183083]: 2026-01-26 09:26:28.252 183087 DEBUG nova.network.neutron [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 09:26:28 compute-1 nova_compute[183083]: 2026-01-26 09:26:28.280 183087 INFO nova.virt.libvirt.driver [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 09:26:28 compute-1 nova_compute[183083]: 2026-01-26 09:26:28.300 183087 DEBUG nova.compute.manager [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 09:26:28 compute-1 nova_compute[183083]: 2026-01-26 09:26:28.398 183087 DEBUG nova.compute.manager [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 09:26:28 compute-1 nova_compute[183083]: 2026-01-26 09:26:28.401 183087 DEBUG nova.virt.libvirt.driver [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 09:26:28 compute-1 nova_compute[183083]: 2026-01-26 09:26:28.402 183087 INFO nova.virt.libvirt.driver [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Creating image(s)
Jan 26 09:26:28 compute-1 nova_compute[183083]: 2026-01-26 09:26:28.402 183087 DEBUG oslo_concurrency.lockutils [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Acquiring lock "/var/lib/nova/instances/73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:26:28 compute-1 nova_compute[183083]: 2026-01-26 09:26:28.402 183087 DEBUG oslo_concurrency.lockutils [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Lock "/var/lib/nova/instances/73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:26:28 compute-1 nova_compute[183083]: 2026-01-26 09:26:28.403 183087 DEBUG oslo_concurrency.lockutils [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Lock "/var/lib/nova/instances/73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:26:28 compute-1 nova_compute[183083]: 2026-01-26 09:26:28.404 183087 DEBUG oslo_concurrency.lockutils [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:26:28 compute-1 nova_compute[183083]: 2026-01-26 09:26:28.404 183087 DEBUG oslo_concurrency.lockutils [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:26:28 compute-1 nova_compute[183083]: 2026-01-26 09:26:28.564 183087 DEBUG nova.policy [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1a420e5f1acf4fc69a9a572e5dfb0a21', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '28a4e49cff08498b810bd1f2903b2946', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.317 183087 DEBUG oslo_concurrency.lockutils [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.913s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Instance failed to spawn: nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Traceback (most recent call last):
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 170, in do_image_deep_inspection
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b]     raise exception.ImageUnacceptable(
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image content does not match disk_format
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] 
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] During handling of the above exception, another exception occurred:
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] 
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Traceback (most recent call last):
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b]     yield resources
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b]     self.driver.spawn(context, instance, image_meta,
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4384, in spawn
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b]     created_instance_dir, created_disks = self._create_image(
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4786, in _create_image
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b]     created_disks = self._create_and_inject_local_root(
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4917, in _create_and_inject_local_root
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b]     self._try_fetch_image_cache(backend, fetch_func, context,
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 10963, in _try_fetch_image_cache
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b]     image.cache(fetch_func=fetch_func,
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 288, in cache
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b]     self.create_image(fetch_func_sync, base, size,
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 663, in create_image
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b]     prepare_template(target=base, *args, **kwargs)
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b]   File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b]     return f(*args, **kwargs)
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 285, in fetch_func_sync
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b]     fetch_func(target=target, *args, **kwargs)
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/utils.py", line 482, in fetch_image
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b]     images.fetch_to_raw(context, image_id, target, trusted_certs)
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 199, in fetch_to_raw
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b]     force_format = do_image_deep_inspection(img, image_href, path_tmp)
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 180, in do_image_deep_inspection
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b]     raise exception.ImageUnacceptable(
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.318 183087 ERROR nova.compute.manager [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] 
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.350 183087 DEBUG nova.network.neutron [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Successfully created port: 86778591-92fb-4ddc-8f58-8dbecc2c0138 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 09:26:29 compute-1 podman[231345]: 2026-01-26 09:26:29.825401427 +0000 UTC m=+0.069080913 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 09:26:29 compute-1 podman[231337]: 2026-01-26 09:26:29.839428093 +0000 UTC m=+0.085272260 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 09:26:29 compute-1 podman[231338]: 2026-01-26 09:26:29.848037317 +0000 UTC m=+0.095355865 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 26 09:26:29 compute-1 podman[231341]: 2026-01-26 09:26:29.850805045 +0000 UTC m=+0.095251812 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 09:26:29 compute-1 podman[231336]: 2026-01-26 09:26:29.86410361 +0000 UTC m=+0.121548585 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.979 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.981 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.981 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5038 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.981 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.981 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:26:29 compute-1 nova_compute[183083]: 2026-01-26 09:26:29.982 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:26:32 compute-1 nova_compute[183083]: 2026-01-26 09:26:32.558 183087 DEBUG nova.network.neutron [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Successfully updated port: 86778591-92fb-4ddc-8f58-8dbecc2c0138 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 09:26:32 compute-1 nova_compute[183083]: 2026-01-26 09:26:32.870 183087 DEBUG oslo_concurrency.lockutils [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Acquiring lock "refresh_cache-73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:26:32 compute-1 nova_compute[183083]: 2026-01-26 09:26:32.871 183087 DEBUG oslo_concurrency.lockutils [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Acquired lock "refresh_cache-73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:26:32 compute-1 nova_compute[183083]: 2026-01-26 09:26:32.871 183087 DEBUG nova.network.neutron [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 09:26:32 compute-1 nova_compute[183083]: 2026-01-26 09:26:32.895 183087 DEBUG nova.compute.manager [req-0fcf5e74-83c8-484e-a604-69f92c2d93be req-43fc70ee-bc24-4ba7-925b-a329ad1485b8 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Received event network-changed-86778591-92fb-4ddc-8f58-8dbecc2c0138 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:26:32 compute-1 nova_compute[183083]: 2026-01-26 09:26:32.896 183087 DEBUG nova.compute.manager [req-0fcf5e74-83c8-484e-a604-69f92c2d93be req-43fc70ee-bc24-4ba7-925b-a329ad1485b8 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Refreshing instance network info cache due to event network-changed-86778591-92fb-4ddc-8f58-8dbecc2c0138. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 09:26:32 compute-1 nova_compute[183083]: 2026-01-26 09:26:32.896 183087 DEBUG oslo_concurrency.lockutils [req-0fcf5e74-83c8-484e-a604-69f92c2d93be req-43fc70ee-bc24-4ba7-925b-a329ad1485b8 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:26:33 compute-1 nova_compute[183083]: 2026-01-26 09:26:33.456 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:26:33 compute-1 nova_compute[183083]: 2026-01-26 09:26:33.457 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:26:33 compute-1 nova_compute[183083]: 2026-01-26 09:26:33.457 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:26:33 compute-1 nova_compute[183083]: 2026-01-26 09:26:33.531 183087 DEBUG nova.network.neutron [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 09:26:33 compute-1 nova_compute[183083]: 2026-01-26 09:26:33.679 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 26 09:26:33 compute-1 nova_compute[183083]: 2026-01-26 09:26:33.680 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 09:26:34 compute-1 nova_compute[183083]: 2026-01-26 09:26:34.658 183087 DEBUG nova.network.neutron [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Updating instance_info_cache with network_info: [{"id": "86778591-92fb-4ddc-8f58-8dbecc2c0138", "address": "fa:16:3e:86:9e:4a", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.177", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86778591-92", "ovs_interfaceid": "86778591-92fb-4ddc-8f58-8dbecc2c0138", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:26:34 compute-1 nova_compute[183083]: 2026-01-26 09:26:34.922 183087 DEBUG oslo_concurrency.lockutils [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Releasing lock "refresh_cache-73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:26:34 compute-1 nova_compute[183083]: 2026-01-26 09:26:34.922 183087 DEBUG nova.compute.manager [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Instance network_info: |[{"id": "86778591-92fb-4ddc-8f58-8dbecc2c0138", "address": "fa:16:3e:86:9e:4a", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.177", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86778591-92", "ovs_interfaceid": "86778591-92fb-4ddc-8f58-8dbecc2c0138", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 09:26:34 compute-1 nova_compute[183083]: 2026-01-26 09:26:34.923 183087 DEBUG oslo_concurrency.lockutils [req-0fcf5e74-83c8-484e-a604-69f92c2d93be req-43fc70ee-bc24-4ba7-925b-a329ad1485b8 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:26:34 compute-1 nova_compute[183083]: 2026-01-26 09:26:34.923 183087 DEBUG nova.network.neutron [req-0fcf5e74-83c8-484e-a604-69f92c2d93be req-43fc70ee-bc24-4ba7-925b-a329ad1485b8 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Refreshing network info cache for port 86778591-92fb-4ddc-8f58-8dbecc2c0138 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 09:26:34 compute-1 nova_compute[183083]: 2026-01-26 09:26:34.924 183087 INFO nova.compute.manager [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Terminating instance
Jan 26 09:26:34 compute-1 nova_compute[183083]: 2026-01-26 09:26:34.926 183087 DEBUG nova.compute.manager [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 09:26:34 compute-1 nova_compute[183083]: 2026-01-26 09:26:34.929 183087 DEBUG nova.virt.libvirt.driver [-] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527
Jan 26 09:26:34 compute-1 nova_compute[183083]: 2026-01-26 09:26:34.929 183087 INFO nova.virt.libvirt.driver [-] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Instance destroyed successfully.
Jan 26 09:26:34 compute-1 nova_compute[183083]: 2026-01-26 09:26:34.930 183087 DEBUG nova.virt.libvirt.vif [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T09:26:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_dscp_marking_south_north-1275375553',display_name='tempest-test_dscp_marking_south_north-1275375553',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-dscp-marking-south-north-1275375553',id=64,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHTnrywdQRERVrs/dmHTEpw5EEhKGncubXffvqoRqz1AvwjeIxkDVy5XUi6xE+e04GsW/1jy7/I4SIb+K6npW5RUNJSHtJQNKzd4jiDcVouw0QfYx0NnKq6j4e2NuVuzoQ==',key_name='tempest-keypair-test-898140873',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='28a4e49cff08498b810bd1f2903b2946',ramdisk_id='',reservation_id='r-9pzvpipr',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestCommon-201278879',owner_user_name='tempest-QosTestCommon-201278879-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T09:26:28Z,user_data=None,user_id='1a420e5f1acf4fc69a9a572e5dfb0a21',uuid=73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "86778591-92fb-4ddc-8f58-8dbecc2c0138", "address": "fa:16:3e:86:9e:4a", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.177", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86778591-92", "ovs_interfaceid": "86778591-92fb-4ddc-8f58-8dbecc2c0138", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 09:26:34 compute-1 nova_compute[183083]: 2026-01-26 09:26:34.930 183087 DEBUG nova.network.os_vif_util [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Converting VIF {"id": "86778591-92fb-4ddc-8f58-8dbecc2c0138", "address": "fa:16:3e:86:9e:4a", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.177", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86778591-92", "ovs_interfaceid": "86778591-92fb-4ddc-8f58-8dbecc2c0138", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 09:26:34 compute-1 nova_compute[183083]: 2026-01-26 09:26:34.931 183087 DEBUG nova.network.os_vif_util [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:9e:4a,bridge_name='br-int',has_traffic_filtering=True,id=86778591-92fb-4ddc-8f58-8dbecc2c0138,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86778591-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 09:26:34 compute-1 nova_compute[183083]: 2026-01-26 09:26:34.931 183087 DEBUG os_vif [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:9e:4a,bridge_name='br-int',has_traffic_filtering=True,id=86778591-92fb-4ddc-8f58-8dbecc2c0138,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86778591-92') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 09:26:34 compute-1 nova_compute[183083]: 2026-01-26 09:26:34.933 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:26:34 compute-1 nova_compute[183083]: 2026-01-26 09:26:34.933 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap86778591-92, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:26:34 compute-1 nova_compute[183083]: 2026-01-26 09:26:34.933 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 09:26:34 compute-1 nova_compute[183083]: 2026-01-26 09:26:34.935 183087 INFO os_vif [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:9e:4a,bridge_name='br-int',has_traffic_filtering=True,id=86778591-92fb-4ddc-8f58-8dbecc2c0138,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86778591-92')
Jan 26 09:26:34 compute-1 nova_compute[183083]: 2026-01-26 09:26:34.936 183087 INFO nova.virt.libvirt.driver [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Deleting instance files /var/lib/nova/instances/73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b_del
Jan 26 09:26:34 compute-1 nova_compute[183083]: 2026-01-26 09:26:34.936 183087 INFO nova.virt.libvirt.driver [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Deletion of /var/lib/nova/instances/73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b_del complete
Jan 26 09:26:35 compute-1 nova_compute[183083]: 2026-01-26 09:26:35.043 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 48-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:26:35 compute-1 nova_compute[183083]: 2026-01-26 09:26:35.170 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:26:35 compute-1 nova_compute[183083]: 2026-01-26 09:26:35.190 183087 INFO nova.compute.manager [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Took 0.26 seconds to destroy the instance on the hypervisor.
Jan 26 09:26:35 compute-1 nova_compute[183083]: 2026-01-26 09:26:35.191 183087 DEBUG nova.compute.claims [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Aborting claim: <nova.compute.claims.Claim object at 0x7f6c9855a580> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85
Jan 26 09:26:35 compute-1 nova_compute[183083]: 2026-01-26 09:26:35.191 183087 DEBUG oslo_concurrency.lockutils [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:26:35 compute-1 nova_compute[183083]: 2026-01-26 09:26:35.192 183087 DEBUG oslo_concurrency.lockutils [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:26:35 compute-1 nova_compute[183083]: 2026-01-26 09:26:35.457 183087 DEBUG nova.compute.provider_tree [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:26:35 compute-1 nova_compute[183083]: 2026-01-26 09:26:35.475 183087 DEBUG nova.scheduler.client.report [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:26:35 compute-1 nova_compute[183083]: 2026-01-26 09:26:35.500 183087 DEBUG oslo_concurrency.lockutils [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.309s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:26:35 compute-1 nova_compute[183083]: 2026-01-26 09:26:35.501 183087 DEBUG nova.compute.utils [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430
Jan 26 09:26:35 compute-1 nova_compute[183083]: 2026-01-26 09:26:35.502 183087 ERROR nova.compute.manager [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Build of instance 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format: nova.exception.BuildAbortException: Build of instance 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 09:26:35 compute-1 nova_compute[183083]: 2026-01-26 09:26:35.502 183087 DEBUG nova.compute.manager [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976
Jan 26 09:26:35 compute-1 nova_compute[183083]: 2026-01-26 09:26:35.503 183087 DEBUG nova.virt.libvirt.vif [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-26T09:26:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_dscp_marking_south_north-1275375553',display_name='tempest-test_dscp_marking_south_north-1275375553',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host=None,hostname='tempest-test-dscp-marking-south-north-1275375553',id=64,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHTnrywdQRERVrs/dmHTEpw5EEhKGncubXffvqoRqz1AvwjeIxkDVy5XUi6xE+e04GsW/1jy7/I4SIb+K6npW5RUNJSHtJQNKzd4jiDcVouw0QfYx0NnKq6j4e2NuVuzoQ==',key_name='tempest-keypair-test-898140873',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node=None,numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='28a4e49cff08498b810bd1f2903b2946',ramdisk_id='',reservation_id='r-9pzvpipr',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestCommon-201278879',owner_user_name='tempest-QosTestCommon-201278879-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T09:26:35Z,user_data=None,user_id='1a420e5f1acf4fc69a9a572e5dfb0a21',uuid=73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "86778591-92fb-4ddc-8f58-8dbecc2c0138", "address": "fa:16:3e:86:9e:4a", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.177", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86778591-92", "ovs_interfaceid": "86778591-92fb-4ddc-8f58-8dbecc2c0138", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 09:26:35 compute-1 nova_compute[183083]: 2026-01-26 09:26:35.503 183087 DEBUG nova.network.os_vif_util [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Converting VIF {"id": "86778591-92fb-4ddc-8f58-8dbecc2c0138", "address": "fa:16:3e:86:9e:4a", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.177", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86778591-92", "ovs_interfaceid": "86778591-92fb-4ddc-8f58-8dbecc2c0138", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 09:26:35 compute-1 nova_compute[183083]: 2026-01-26 09:26:35.503 183087 DEBUG nova.network.os_vif_util [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:9e:4a,bridge_name='br-int',has_traffic_filtering=True,id=86778591-92fb-4ddc-8f58-8dbecc2c0138,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86778591-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 09:26:35 compute-1 nova_compute[183083]: 2026-01-26 09:26:35.504 183087 DEBUG os_vif [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:9e:4a,bridge_name='br-int',has_traffic_filtering=True,id=86778591-92fb-4ddc-8f58-8dbecc2c0138,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86778591-92') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 09:26:35 compute-1 nova_compute[183083]: 2026-01-26 09:26:35.505 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:26:35 compute-1 nova_compute[183083]: 2026-01-26 09:26:35.505 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap86778591-92, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:26:35 compute-1 nova_compute[183083]: 2026-01-26 09:26:35.505 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 09:26:35 compute-1 nova_compute[183083]: 2026-01-26 09:26:35.507 183087 INFO os_vif [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:9e:4a,bridge_name='br-int',has_traffic_filtering=True,id=86778591-92fb-4ddc-8f58-8dbecc2c0138,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86778591-92')
Jan 26 09:26:35 compute-1 nova_compute[183083]: 2026-01-26 09:26:35.507 183087 DEBUG nova.compute.manager [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012
Jan 26 09:26:35 compute-1 nova_compute[183083]: 2026-01-26 09:26:35.507 183087 DEBUG nova.compute.manager [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 09:26:35 compute-1 nova_compute[183083]: 2026-01-26 09:26:35.508 183087 DEBUG nova.network.neutron [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 09:26:36 compute-1 nova_compute[183083]: 2026-01-26 09:26:36.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:26:37 compute-1 nova_compute[183083]: 2026-01-26 09:26:37.252 183087 DEBUG nova.network.neutron [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:26:37 compute-1 nova_compute[183083]: 2026-01-26 09:26:37.300 183087 INFO nova.compute.manager [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Took 1.79 seconds to deallocate network for instance.
Jan 26 09:26:37 compute-1 nova_compute[183083]: 2026-01-26 09:26:37.878 183087 INFO nova.scheduler.client.report [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Deleted allocations for instance 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b
Jan 26 09:26:37 compute-1 nova_compute[183083]: 2026-01-26 09:26:37.879 183087 DEBUG oslo_concurrency.lockutils [None req-c4f12cfd-7ef1-4d83-8650-1438ce097d56 1a420e5f1acf4fc69a9a572e5dfb0a21 28a4e49cff08498b810bd1f2903b2946 - - default default] Lock "73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.338s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:26:38 compute-1 nova_compute[183083]: 2026-01-26 09:26:38.060 183087 DEBUG nova.network.neutron [req-0fcf5e74-83c8-484e-a604-69f92c2d93be req-43fc70ee-bc24-4ba7-925b-a329ad1485b8 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Updated VIF entry in instance network info cache for port 86778591-92fb-4ddc-8f58-8dbecc2c0138. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 09:26:38 compute-1 nova_compute[183083]: 2026-01-26 09:26:38.061 183087 DEBUG nova.network.neutron [req-0fcf5e74-83c8-484e-a604-69f92c2d93be req-43fc70ee-bc24-4ba7-925b-a329ad1485b8 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b] Updating instance_info_cache with network_info: [{"id": "86778591-92fb-4ddc-8f58-8dbecc2c0138", "address": "fa:16:3e:86:9e:4a", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.177", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86778591-92", "ovs_interfaceid": "86778591-92fb-4ddc-8f58-8dbecc2c0138", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:26:38 compute-1 nova_compute[183083]: 2026-01-26 09:26:38.086 183087 DEBUG oslo_concurrency.lockutils [req-0fcf5e74-83c8-484e-a604-69f92c2d93be req-43fc70ee-bc24-4ba7-925b-a329ad1485b8 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-73aff6c6-126b-41d0-a3d4-b5cba0ebdb5b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:26:38 compute-1 nova_compute[183083]: 2026-01-26 09:26:38.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:26:38 compute-1 nova_compute[183083]: 2026-01-26 09:26:38.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:26:39 compute-1 nova_compute[183083]: 2026-01-26 09:26:39.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:26:40 compute-1 nova_compute[183083]: 2026-01-26 09:26:40.044 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4537-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:26:41 compute-1 nova_compute[183083]: 2026-01-26 09:26:41.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:26:42 compute-1 nova_compute[183083]: 2026-01-26 09:26:42.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:26:42 compute-1 nova_compute[183083]: 2026-01-26 09:26:42.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:26:44 compute-1 sshd-session[231441]: Connection closed by authenticating user root 178.62.249.31 port 39282 [preauth]
Jan 26 09:26:44 compute-1 podman[231443]: 2026-01-26 09:26:44.54031238 +0000 UTC m=+0.051455535 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 26 09:26:45 compute-1 nova_compute[183083]: 2026-01-26 09:26:45.047 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:26:46 compute-1 ovn_controller[95352]: 2026-01-26T09:26:46Z|00385|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Jan 26 09:26:50 compute-1 nova_compute[183083]: 2026-01-26 09:26:50.049 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:26:51 compute-1 nova_compute[183083]: 2026-01-26 09:26:51.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:26:51 compute-1 nova_compute[183083]: 2026-01-26 09:26:51.996 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:26:51 compute-1 nova_compute[183083]: 2026-01-26 09:26:51.997 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:26:51 compute-1 nova_compute[183083]: 2026-01-26 09:26:51.997 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:26:51 compute-1 nova_compute[183083]: 2026-01-26 09:26:51.997 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:26:52 compute-1 nova_compute[183083]: 2026-01-26 09:26:52.192 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:26:52 compute-1 nova_compute[183083]: 2026-01-26 09:26:52.194 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13651MB free_disk=113.0837287902832GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:26:52 compute-1 nova_compute[183083]: 2026-01-26 09:26:52.194 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:26:52 compute-1 nova_compute[183083]: 2026-01-26 09:26:52.194 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:26:52 compute-1 nova_compute[183083]: 2026-01-26 09:26:52.346 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:26:52 compute-1 nova_compute[183083]: 2026-01-26 09:26:52.347 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:26:52 compute-1 nova_compute[183083]: 2026-01-26 09:26:52.369 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:26:52 compute-1 nova_compute[183083]: 2026-01-26 09:26:52.384 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:26:52 compute-1 nova_compute[183083]: 2026-01-26 09:26:52.404 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:26:52 compute-1 nova_compute[183083]: 2026-01-26 09:26:52.404 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.210s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:26:55 compute-1 nova_compute[183083]: 2026-01-26 09:26:55.051 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:26:59 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:26:59.931 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=45, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=44) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:26:59 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:26:59.932 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:26:59 compute-1 nova_compute[183083]: 2026-01-26 09:26:59.984 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:27:00 compute-1 nova_compute[183083]: 2026-01-26 09:27:00.055 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:27:00 compute-1 podman[231474]: 2026-01-26 09:27:00.807627066 +0000 UTC m=+0.060039527 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 
9., container_name=openstack_network_exporter, distribution-scope=public, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.tags=minimal rhel9, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350)
Jan 26 09:27:00 compute-1 podman[231468]: 2026-01-26 09:27:00.813301846 +0000 UTC m=+0.068355792 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 26 09:27:00 compute-1 podman[231475]: 2026-01-26 09:27:00.824836062 +0000 UTC m=+0.070898444 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 09:27:00 compute-1 podman[231467]: 2026-01-26 09:27:00.825313846 +0000 UTC m=+0.092099114 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, 
container_name=ovn_controller, managed_by=edpm_ansible)
Jan 26 09:27:00 compute-1 podman[231488]: 2026-01-26 09:27:00.825575793 +0000 UTC m=+0.062097775 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 09:27:02 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:27:02.940 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '45'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:27:05 compute-1 nova_compute[183083]: 2026-01-26 09:27:05.058 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:27:05 compute-1 nova_compute[183083]: 2026-01-26 09:27:05.059 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:27:05 compute-1 nova_compute[183083]: 2026-01-26 09:27:05.060 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:27:05 compute-1 nova_compute[183083]: 2026-01-26 09:27:05.060 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:27:05 compute-1 nova_compute[183083]: 2026-01-26 09:27:05.060 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:27:05 compute-1 nova_compute[183083]: 2026-01-26 09:27:05.061 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:27:05 compute-1 ovn_controller[95352]: 2026-01-26T09:27:05Z|00386|pinctrl|WARN|Dropped 327 log messages in last 61 seconds (most recently, 2 seconds ago) due to excessive rate
Jan 26 09:27:05 compute-1 ovn_controller[95352]: 2026-01-26T09:27:05Z|00387|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:27:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:27:05.346 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:27:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:27:05.347 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:27:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:27:05.347 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:27:06 compute-1 sshd-session[231565]: Accepted publickey for zuul from 38.102.83.66 port 34144 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:27:06 compute-1 systemd-logind[788]: New session 144 of user zuul.
Jan 26 09:27:06 compute-1 systemd[1]: Started Session 144 of User zuul.
Jan 26 09:27:06 compute-1 sshd-session[231565]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:27:06 compute-1 sshd-session[231568]: Connection closed by 38.102.83.66 port 34144
Jan 26 09:27:06 compute-1 sshd-session[231565]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:27:06 compute-1 systemd[1]: session-144.scope: Deactivated successfully.
Jan 26 09:27:06 compute-1 systemd-logind[788]: Session 144 logged out. Waiting for processes to exit.
Jan 26 09:27:06 compute-1 systemd-logind[788]: Removed session 144.
Jan 26 09:27:10 compute-1 nova_compute[183083]: 2026-01-26 09:27:10.062 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:27:14 compute-1 podman[231592]: 2026-01-26 09:27:14.84259592 +0000 UTC m=+0.091270499 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 26 09:27:15 compute-1 nova_compute[183083]: 2026-01-26 09:27:15.066 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:27:15 compute-1 nova_compute[183083]: 2026-01-26 09:27:15.068 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:27:15 compute-1 nova_compute[183083]: 2026-01-26 09:27:15.068 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:27:15 compute-1 nova_compute[183083]: 2026-01-26 09:27:15.068 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:27:15 compute-1 nova_compute[183083]: 2026-01-26 09:27:15.068 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:27:15 compute-1 nova_compute[183083]: 2026-01-26 09:27:15.069 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:27:20 compute-1 nova_compute[183083]: 2026-01-26 09:27:20.070 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:27:20 compute-1 nova_compute[183083]: 2026-01-26 09:27:20.071 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:27:20 compute-1 nova_compute[183083]: 2026-01-26 09:27:20.072 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:27:20 compute-1 nova_compute[183083]: 2026-01-26 09:27:20.072 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:27:20 compute-1 nova_compute[183083]: 2026-01-26 09:27:20.121 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:27:20 compute-1 nova_compute[183083]: 2026-01-26 09:27:20.122 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:27:25 compute-1 nova_compute[183083]: 2026-01-26 09:27:25.123 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:27:25 compute-1 nova_compute[183083]: 2026-01-26 09:27:25.125 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:27:25 compute-1 nova_compute[183083]: 2026-01-26 09:27:25.125 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:27:25 compute-1 nova_compute[183083]: 2026-01-26 09:27:25.125 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:27:25 compute-1 nova_compute[183083]: 2026-01-26 09:27:25.159 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:27:25 compute-1 nova_compute[183083]: 2026-01-26 09:27:25.160 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:27:28 compute-1 sshd-session[231616]: Connection closed by authenticating user root 178.62.249.31 port 51570 [preauth]
Jan 26 09:27:30 compute-1 nova_compute[183083]: 2026-01-26 09:27:30.160 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:27:30 compute-1 nova_compute[183083]: 2026-01-26 09:27:30.162 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:27:30 compute-1 nova_compute[183083]: 2026-01-26 09:27:30.163 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:27:30 compute-1 nova_compute[183083]: 2026-01-26 09:27:30.163 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:27:30 compute-1 nova_compute[183083]: 2026-01-26 09:27:30.195 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:27:30 compute-1 nova_compute[183083]: 2026-01-26 09:27:30.196 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:27:31 compute-1 podman[231621]: 2026-01-26 09:27:31.830956248 +0000 UTC m=+0.070997787 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 09:27:31 compute-1 podman[231620]: 2026-01-26 09:27:31.848563165 +0000 UTC m=+0.095662534 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, vcs-type=git, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, vendor=Red Hat, Inc., distribution-scope=public)
Jan 26 09:27:31 compute-1 podman[231619]: 2026-01-26 09:27:31.858199888 +0000 UTC m=+0.117261014 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, 
container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 26 09:27:31 compute-1 podman[231622]: 2026-01-26 09:27:31.860003619 +0000 UTC m=+0.106017027 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 09:27:31 compute-1 podman[231618]: 2026-01-26 09:27:31.874106157 +0000 UTC m=+0.127386160 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 09:27:33 compute-1 nova_compute[183083]: 2026-01-26 09:27:33.405 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:27:33 compute-1 nova_compute[183083]: 2026-01-26 09:27:33.405 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:27:33 compute-1 nova_compute[183083]: 2026-01-26 09:27:33.406 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:27:33 compute-1 nova_compute[183083]: 2026-01-26 09:27:33.426 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 09:27:35 compute-1 nova_compute[183083]: 2026-01-26 09:27:35.196 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:27:35 compute-1 nova_compute[183083]: 2026-01-26 09:27:35.968 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:27:36 compute-1 nova_compute[183083]: 2026-01-26 09:27:36.280 183087 DEBUG oslo_concurrency.lockutils [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] Acquiring lock "83cdc3d1-53a7-463c-b1f0-52a035b0f3b0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:27:36 compute-1 nova_compute[183083]: 2026-01-26 09:27:36.280 183087 DEBUG oslo_concurrency.lockutils [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] Lock "83cdc3d1-53a7-463c-b1f0-52a035b0f3b0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:27:36 compute-1 nova_compute[183083]: 2026-01-26 09:27:36.306 183087 DEBUG nova.compute.manager [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 09:27:36 compute-1 nova_compute[183083]: 2026-01-26 09:27:36.378 183087 DEBUG oslo_concurrency.lockutils [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:27:36 compute-1 nova_compute[183083]: 2026-01-26 09:27:36.379 183087 DEBUG oslo_concurrency.lockutils [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:27:36 compute-1 nova_compute[183083]: 2026-01-26 09:27:36.385 183087 DEBUG nova.virt.hardware [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 09:27:36 compute-1 nova_compute[183083]: 2026-01-26 09:27:36.386 183087 INFO nova.compute.claims [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Claim successful on node compute-1.ctlplane.example.com
Jan 26 09:27:36 compute-1 nova_compute[183083]: 2026-01-26 09:27:36.495 183087 DEBUG nova.compute.provider_tree [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:27:36 compute-1 nova_compute[183083]: 2026-01-26 09:27:36.509 183087 DEBUG nova.scheduler.client.report [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:27:36 compute-1 nova_compute[183083]: 2026-01-26 09:27:36.530 183087 DEBUG oslo_concurrency.lockutils [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:27:36 compute-1 nova_compute[183083]: 2026-01-26 09:27:36.531 183087 DEBUG nova.compute.manager [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 09:27:36 compute-1 nova_compute[183083]: 2026-01-26 09:27:36.613 183087 DEBUG nova.compute.manager [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 09:27:36 compute-1 nova_compute[183083]: 2026-01-26 09:27:36.613 183087 DEBUG nova.network.neutron [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 09:27:36 compute-1 nova_compute[183083]: 2026-01-26 09:27:36.635 183087 INFO nova.virt.libvirt.driver [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 09:27:36 compute-1 nova_compute[183083]: 2026-01-26 09:27:36.653 183087 DEBUG nova.compute.manager [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 09:27:36 compute-1 nova_compute[183083]: 2026-01-26 09:27:36.734 183087 DEBUG nova.compute.manager [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 09:27:36 compute-1 nova_compute[183083]: 2026-01-26 09:27:36.735 183087 DEBUG nova.virt.libvirt.driver [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 09:27:36 compute-1 nova_compute[183083]: 2026-01-26 09:27:36.735 183087 INFO nova.virt.libvirt.driver [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Creating image(s)
Jan 26 09:27:36 compute-1 nova_compute[183083]: 2026-01-26 09:27:36.736 183087 DEBUG oslo_concurrency.lockutils [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] Acquiring lock "/var/lib/nova/instances/83cdc3d1-53a7-463c-b1f0-52a035b0f3b0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:27:36 compute-1 nova_compute[183083]: 2026-01-26 09:27:36.736 183087 DEBUG oslo_concurrency.lockutils [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] Lock "/var/lib/nova/instances/83cdc3d1-53a7-463c-b1f0-52a035b0f3b0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:27:36 compute-1 nova_compute[183083]: 2026-01-26 09:27:36.736 183087 DEBUG oslo_concurrency.lockutils [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] Lock "/var/lib/nova/instances/83cdc3d1-53a7-463c-b1f0-52a035b0f3b0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:27:36 compute-1 nova_compute[183083]: 2026-01-26 09:27:36.737 183087 DEBUG oslo_concurrency.lockutils [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:27:36 compute-1 nova_compute[183083]: 2026-01-26 09:27:36.737 183087 DEBUG oslo_concurrency.lockutils [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:27:37 compute-1 nova_compute[183083]: 2026-01-26 09:27:37.403 183087 DEBUG nova.policy [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ffcd5344b83e4d8594fb929f9a99b872', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0a7c8e73072047ca925697b0fc32a9b2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 09:27:37 compute-1 nova_compute[183083]: 2026-01-26 09:27:37.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 DEBUG oslo_concurrency.lockutils [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.297s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Instance failed to spawn: nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Traceback (most recent call last):
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 170, in do_image_deep_inspection
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0]     raise exception.ImageUnacceptable(
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image content does not match disk_format
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] 
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] During handling of the above exception, another exception occurred:
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] 
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Traceback (most recent call last):
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0]     yield resources
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0]     self.driver.spawn(context, instance, image_meta,
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4384, in spawn
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0]     created_instance_dir, created_disks = self._create_image(
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4786, in _create_image
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0]     created_disks = self._create_and_inject_local_root(
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4917, in _create_and_inject_local_root
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0]     self._try_fetch_image_cache(backend, fetch_func, context,
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 10963, in _try_fetch_image_cache
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0]     image.cache(fetch_func=fetch_func,
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 288, in cache
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0]     self.create_image(fetch_func_sync, base, size,
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 663, in create_image
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0]     prepare_template(target=base, *args, **kwargs)
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0]   File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0]     return f(*args, **kwargs)
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py", line 285, in fetch_func_sync
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0]     fetch_func(target=target, *args, **kwargs)
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/utils.py", line 482, in fetch_image
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0]     images.fetch_to_raw(context, image_id, target, trusted_certs)
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 199, in fetch_to_raw
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0]     force_format = do_image_deep_inspection(img, image_href, path_tmp)
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0]   File "/usr/lib/python3.9/site-packages/nova/virt/images.py", line 180, in do_image_deep_inspection
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0]     raise exception.ImageUnacceptable(
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] nova.exception.ImageUnacceptable: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.034 183087 ERROR nova.compute.manager [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] 
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.442 183087 DEBUG nova.network.neutron [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Successfully created port: 0b709736-8d9f-4bcd-a0a9-bb3e838249f8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:27:38 compute-1 nova_compute[183083]: 2026-01-26 09:27:38.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 26 09:27:39 compute-1 nova_compute[183083]: 2026-01-26 09:27:39.031 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 26 09:27:39 compute-1 nova_compute[183083]: 2026-01-26 09:27:39.847 183087 DEBUG nova.network.neutron [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Successfully updated port: 0b709736-8d9f-4bcd-a0a9-bb3e838249f8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 09:27:39 compute-1 nova_compute[183083]: 2026-01-26 09:27:39.861 183087 DEBUG oslo_concurrency.lockutils [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] Acquiring lock "refresh_cache-83cdc3d1-53a7-463c-b1f0-52a035b0f3b0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:27:39 compute-1 nova_compute[183083]: 2026-01-26 09:27:39.862 183087 DEBUG oslo_concurrency.lockutils [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] Acquired lock "refresh_cache-83cdc3d1-53a7-463c-b1f0-52a035b0f3b0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:27:39 compute-1 nova_compute[183083]: 2026-01-26 09:27:39.862 183087 DEBUG nova.network.neutron [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 09:27:40 compute-1 nova_compute[183083]: 2026-01-26 09:27:40.030 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:27:40 compute-1 nova_compute[183083]: 2026-01-26 09:27:40.044 183087 DEBUG nova.network.neutron [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 09:27:40 compute-1 nova_compute[183083]: 2026-01-26 09:27:40.198 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:27:40 compute-1 nova_compute[183083]: 2026-01-26 09:27:40.200 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:27:40 compute-1 nova_compute[183083]: 2026-01-26 09:27:40.201 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:27:40 compute-1 nova_compute[183083]: 2026-01-26 09:27:40.201 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:27:40 compute-1 nova_compute[183083]: 2026-01-26 09:27:40.259 183087 DEBUG nova.compute.manager [req-c7d1e74d-3e20-4f73-8bf6-c96c14a51d39 req-ad26cc76-1c25-4531-98a8-831e540579df 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Received event network-changed-0b709736-8d9f-4bcd-a0a9-bb3e838249f8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 09:27:40 compute-1 nova_compute[183083]: 2026-01-26 09:27:40.260 183087 DEBUG nova.compute.manager [req-c7d1e74d-3e20-4f73-8bf6-c96c14a51d39 req-ad26cc76-1c25-4531-98a8-831e540579df 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Refreshing instance network info cache due to event network-changed-0b709736-8d9f-4bcd-a0a9-bb3e838249f8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 09:27:40 compute-1 nova_compute[183083]: 2026-01-26 09:27:40.260 183087 DEBUG oslo_concurrency.lockutils [req-c7d1e74d-3e20-4f73-8bf6-c96c14a51d39 req-ad26cc76-1c25-4531-98a8-831e540579df 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquiring lock "refresh_cache-83cdc3d1-53a7-463c-b1f0-52a035b0f3b0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 09:27:40 compute-1 nova_compute[183083]: 2026-01-26 09:27:40.313 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:27:40 compute-1 nova_compute[183083]: 2026-01-26 09:27:40.314 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:27:40 compute-1 nova_compute[183083]: 2026-01-26 09:27:40.780 183087 DEBUG nova.network.neutron [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Updating instance_info_cache with network_info: [{"id": "0b709736-8d9f-4bcd-a0a9-bb3e838249f8", "address": "fa:16:3e:47:96:71", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.216", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b709736-8d", "ovs_interfaceid": "0b709736-8d9f-4bcd-a0a9-bb3e838249f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:27:40 compute-1 nova_compute[183083]: 2026-01-26 09:27:40.802 183087 DEBUG oslo_concurrency.lockutils [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] Releasing lock "refresh_cache-83cdc3d1-53a7-463c-b1f0-52a035b0f3b0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:27:40 compute-1 nova_compute[183083]: 2026-01-26 09:27:40.803 183087 DEBUG nova.compute.manager [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Instance network_info: |[{"id": "0b709736-8d9f-4bcd-a0a9-bb3e838249f8", "address": "fa:16:3e:47:96:71", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.216", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b709736-8d", "ovs_interfaceid": "0b709736-8d9f-4bcd-a0a9-bb3e838249f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 09:27:40 compute-1 nova_compute[183083]: 2026-01-26 09:27:40.803 183087 DEBUG oslo_concurrency.lockutils [req-c7d1e74d-3e20-4f73-8bf6-c96c14a51d39 req-ad26cc76-1c25-4531-98a8-831e540579df 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Acquired lock "refresh_cache-83cdc3d1-53a7-463c-b1f0-52a035b0f3b0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 09:27:40 compute-1 nova_compute[183083]: 2026-01-26 09:27:40.803 183087 DEBUG nova.network.neutron [req-c7d1e74d-3e20-4f73-8bf6-c96c14a51d39 req-ad26cc76-1c25-4531-98a8-831e540579df 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Refreshing network info cache for port 0b709736-8d9f-4bcd-a0a9-bb3e838249f8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 09:27:40 compute-1 nova_compute[183083]: 2026-01-26 09:27:40.805 183087 INFO nova.compute.manager [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Terminating instance
Jan 26 09:27:40 compute-1 nova_compute[183083]: 2026-01-26 09:27:40.806 183087 DEBUG nova.compute.manager [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 09:27:40 compute-1 nova_compute[183083]: 2026-01-26 09:27:40.810 183087 DEBUG nova.virt.libvirt.driver [-] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527
Jan 26 09:27:40 compute-1 nova_compute[183083]: 2026-01-26 09:27:40.810 183087 INFO nova.virt.libvirt.driver [-] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Instance destroyed successfully.
Jan 26 09:27:40 compute-1 nova_compute[183083]: 2026-01-26 09:27:40.811 183087 DEBUG nova.virt.libvirt.vif [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T09:27:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_dscp_bwlimit_external_network-43655487',display_name='tempest-test_dscp_bwlimit_external_network-43655487',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-dscp-bwlimit-external-network-43655487',id=65,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP6oeYPeA5RrSUXhYqJBWAn1S2YqvNxL4Hs4NlOzD3flmpoGtNR69SvuJDEtrRVTJmj7rq42VWq/55huIrOeSTpL7d3XijxHsujdlM10YOha83Sk3jVyKec7f+8IAJEt+A==',key_name='tempest-keypair-test-1597830972',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0a7c8e73072047ca925697b0fc32a9b2',ramdisk_id='',reservation_id='r-k6v555dw',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestExternalNetwork-2042757181',owner_user_name='tempest-QosTestExternalNetwork-2042757181-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T09:27:36Z,user_data=None,user_id='ffcd5344b83e4d8594fb929f9a99b872',uuid=83cdc3d1-53a7-463c-b1f0-52a035b0f3b0,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0b709736-8d9f-4bcd-a0a9-bb3e838249f8", "address": "fa:16:3e:47:96:71", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.216", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b709736-8d", "ovs_interfaceid": "0b709736-8d9f-4bcd-a0a9-bb3e838249f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 09:27:40 compute-1 nova_compute[183083]: 2026-01-26 09:27:40.811 183087 DEBUG nova.network.os_vif_util [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] Converting VIF {"id": "0b709736-8d9f-4bcd-a0a9-bb3e838249f8", "address": "fa:16:3e:47:96:71", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.216", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b709736-8d", "ovs_interfaceid": "0b709736-8d9f-4bcd-a0a9-bb3e838249f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 09:27:40 compute-1 nova_compute[183083]: 2026-01-26 09:27:40.812 183087 DEBUG nova.network.os_vif_util [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:47:96:71,bridge_name='br-int',has_traffic_filtering=True,id=0b709736-8d9f-4bcd-a0a9-bb3e838249f8,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b709736-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 09:27:40 compute-1 nova_compute[183083]: 2026-01-26 09:27:40.812 183087 DEBUG os_vif [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:96:71,bridge_name='br-int',has_traffic_filtering=True,id=0b709736-8d9f-4bcd-a0a9-bb3e838249f8,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b709736-8d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 09:27:40 compute-1 nova_compute[183083]: 2026-01-26 09:27:40.813 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:27:40 compute-1 nova_compute[183083]: 2026-01-26 09:27:40.814 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0b709736-8d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:27:40 compute-1 nova_compute[183083]: 2026-01-26 09:27:40.814 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 09:27:40 compute-1 nova_compute[183083]: 2026-01-26 09:27:40.816 183087 INFO os_vif [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:96:71,bridge_name='br-int',has_traffic_filtering=True,id=0b709736-8d9f-4bcd-a0a9-bb3e838249f8,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b709736-8d')
Jan 26 09:27:40 compute-1 nova_compute[183083]: 2026-01-26 09:27:40.816 183087 INFO nova.virt.libvirt.driver [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Deleting instance files /var/lib/nova/instances/83cdc3d1-53a7-463c-b1f0-52a035b0f3b0_del
Jan 26 09:27:40 compute-1 nova_compute[183083]: 2026-01-26 09:27:40.817 183087 INFO nova.virt.libvirt.driver [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Deletion of /var/lib/nova/instances/83cdc3d1-53a7-463c-b1f0-52a035b0f3b0_del complete
Jan 26 09:27:40 compute-1 nova_compute[183083]: 2026-01-26 09:27:40.872 183087 INFO nova.compute.manager [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Took 0.07 seconds to destroy the instance on the hypervisor.
Jan 26 09:27:40 compute-1 nova_compute[183083]: 2026-01-26 09:27:40.874 183087 DEBUG nova.compute.claims [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Aborting claim: <nova.compute.claims.Claim object at 0x7f6c9858bb50> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85
Jan 26 09:27:40 compute-1 nova_compute[183083]: 2026-01-26 09:27:40.874 183087 DEBUG oslo_concurrency.lockutils [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:27:40 compute-1 nova_compute[183083]: 2026-01-26 09:27:40.874 183087 DEBUG oslo_concurrency.lockutils [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:27:40 compute-1 nova_compute[183083]: 2026-01-26 09:27:40.983 183087 DEBUG nova.compute.provider_tree [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:27:41 compute-1 nova_compute[183083]: 2026-01-26 09:27:41.000 183087 DEBUG nova.scheduler.client.report [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:27:41 compute-1 nova_compute[183083]: 2026-01-26 09:27:41.027 183087 DEBUG oslo_concurrency.lockutils [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:27:41 compute-1 nova_compute[183083]: 2026-01-26 09:27:41.028 183087 DEBUG nova.compute.utils [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430
Jan 26 09:27:41 compute-1 nova_compute[183083]: 2026-01-26 09:27:41.028 183087 ERROR nova.compute.manager [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Build of instance 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format: nova.exception.BuildAbortException: Build of instance 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0 aborted: Image 11111111-1111-1111-1111-111111111111 is unacceptable: Image not in a supported format
Jan 26 09:27:41 compute-1 nova_compute[183083]: 2026-01-26 09:27:41.029 183087 DEBUG nova.compute.manager [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976
Jan 26 09:27:41 compute-1 nova_compute[183083]: 2026-01-26 09:27:41.030 183087 DEBUG nova.virt.libvirt.vif [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2026-01-26T09:27:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_dscp_bwlimit_external_network-43655487',display_name='tempest-test_dscp_bwlimit_external_network-43655487',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host=None,hostname='tempest-test-dscp-bwlimit-external-network-43655487',id=65,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP6oeYPeA5RrSUXhYqJBWAn1S2YqvNxL4Hs4NlOzD3flmpoGtNR69SvuJDEtrRVTJmj7rq42VWq/55huIrOeSTpL7d3XijxHsujdlM10YOha83Sk3jVyKec7f+8IAJEt+A==',key_name='tempest-keypair-test-1597830972',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node=None,numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0a7c8e73072047ca925697b0fc32a9b2',ramdisk_id='',reservation_id='r-k6v555dw',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestExternalNetwork-2042757181',owner_user_name='tempest-QosTestExternalNetwork-2042757181-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T09:27:40Z,user_data=None,user_id='ffcd5344b83e4d8594fb929f9a99b872',uuid=83cdc3d1-53a7-463c-b1f0-52a035b0f3b0,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0b709736-8d9f-4bcd-a0a9-bb3e838249f8", "address": "fa:16:3e:47:96:71", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.216", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b709736-8d", "ovs_interfaceid": "0b709736-8d9f-4bcd-a0a9-bb3e838249f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 26 09:27:41 compute-1 nova_compute[183083]: 2026-01-26 09:27:41.030 183087 DEBUG nova.network.os_vif_util [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] Converting VIF {"id": "0b709736-8d9f-4bcd-a0a9-bb3e838249f8", "address": "fa:16:3e:47:96:71", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.216", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b709736-8d", "ovs_interfaceid": "0b709736-8d9f-4bcd-a0a9-bb3e838249f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 09:27:41 compute-1 nova_compute[183083]: 2026-01-26 09:27:41.031 183087 DEBUG nova.network.os_vif_util [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:47:96:71,bridge_name='br-int',has_traffic_filtering=True,id=0b709736-8d9f-4bcd-a0a9-bb3e838249f8,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b709736-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 09:27:41 compute-1 nova_compute[183083]: 2026-01-26 09:27:41.031 183087 DEBUG os_vif [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:96:71,bridge_name='br-int',has_traffic_filtering=True,id=0b709736-8d9f-4bcd-a0a9-bb3e838249f8,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b709736-8d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 09:27:41 compute-1 nova_compute[183083]: 2026-01-26 09:27:41.033 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:27:41 compute-1 nova_compute[183083]: 2026-01-26 09:27:41.033 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0b709736-8d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:27:41 compute-1 nova_compute[183083]: 2026-01-26 09:27:41.033 183087 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 09:27:41 compute-1 nova_compute[183083]: 2026-01-26 09:27:41.035 183087 INFO os_vif [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:96:71,bridge_name='br-int',has_traffic_filtering=True,id=0b709736-8d9f-4bcd-a0a9-bb3e838249f8,network=Network(c56f24d7-f7eb-4695-9b9c-0626b55aea16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b709736-8d')
Jan 26 09:27:41 compute-1 nova_compute[183083]: 2026-01-26 09:27:41.035 183087 DEBUG nova.compute.manager [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012
Jan 26 09:27:41 compute-1 nova_compute[183083]: 2026-01-26 09:27:41.036 183087 DEBUG nova.compute.manager [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 09:27:41 compute-1 nova_compute[183083]: 2026-01-26 09:27:41.036 183087 DEBUG nova.network.neutron [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 09:27:41 compute-1 nova_compute[183083]: 2026-01-26 09:27:41.613 183087 DEBUG nova.network.neutron [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:27:41 compute-1 nova_compute[183083]: 2026-01-26 09:27:41.638 183087 INFO nova.compute.manager [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Took 0.60 seconds to deallocate network for instance.
Jan 26 09:27:41 compute-1 nova_compute[183083]: 2026-01-26 09:27:41.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:27:41 compute-1 nova_compute[183083]: 2026-01-26 09:27:41.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:27:41 compute-1 nova_compute[183083]: 2026-01-26 09:27:41.976 183087 DEBUG nova.network.neutron [req-c7d1e74d-3e20-4f73-8bf6-c96c14a51d39 req-ad26cc76-1c25-4531-98a8-831e540579df 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Updated VIF entry in instance network info cache for port 0b709736-8d9f-4bcd-a0a9-bb3e838249f8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 09:27:41 compute-1 nova_compute[183083]: 2026-01-26 09:27:41.977 183087 DEBUG nova.network.neutron [req-c7d1e74d-3e20-4f73-8bf6-c96c14a51d39 req-ad26cc76-1c25-4531-98a8-831e540579df 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] [instance: 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0] Updating instance_info_cache with network_info: [{"id": "0b709736-8d9f-4bcd-a0a9-bb3e838249f8", "address": "fa:16:3e:47:96:71", "network": {"id": "c56f24d7-f7eb-4695-9b9c-0626b55aea16", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.216", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa88264c487a43b2855a58eb0dd042c9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b709736-8d", "ovs_interfaceid": "0b709736-8d9f-4bcd-a0a9-bb3e838249f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 09:27:42 compute-1 nova_compute[183083]: 2026-01-26 09:27:42.324 183087 DEBUG oslo_concurrency.lockutils [req-c7d1e74d-3e20-4f73-8bf6-c96c14a51d39 req-ad26cc76-1c25-4531-98a8-831e540579df 627fdf69448e401eb1d519e4b30e4d31 592b9aecaff9498e84899edc8724797a - - default default] Releasing lock "refresh_cache-83cdc3d1-53a7-463c-b1f0-52a035b0f3b0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 09:27:42 compute-1 nova_compute[183083]: 2026-01-26 09:27:42.536 183087 INFO nova.scheduler.client.report [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] Deleted allocations for instance 83cdc3d1-53a7-463c-b1f0-52a035b0f3b0
Jan 26 09:27:42 compute-1 nova_compute[183083]: 2026-01-26 09:27:42.537 183087 DEBUG oslo_concurrency.lockutils [None req-2923cf6c-cef5-4ab4-9014-c845b30e53ba ffcd5344b83e4d8594fb929f9a99b872 0a7c8e73072047ca925697b0fc32a9b2 - - default default] Lock "83cdc3d1-53a7-463c-b1f0-52a035b0f3b0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.256s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:27:43 compute-1 nova_compute[183083]: 2026-01-26 09:27:43.947 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:27:44 compute-1 nova_compute[183083]: 2026-01-26 09:27:44.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:27:44 compute-1 nova_compute[183083]: 2026-01-26 09:27:44.951 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:27:45 compute-1 nova_compute[183083]: 2026-01-26 09:27:45.314 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4280-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:27:45 compute-1 nova_compute[183083]: 2026-01-26 09:27:45.317 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:27:45 compute-1 nova_compute[183083]: 2026-01-26 09:27:45.317 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:27:45 compute-1 nova_compute[183083]: 2026-01-26 09:27:45.317 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:27:45 compute-1 nova_compute[183083]: 2026-01-26 09:27:45.348 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:27:45 compute-1 nova_compute[183083]: 2026-01-26 09:27:45.349 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:27:45 compute-1 nova_compute[183083]: 2026-01-26 09:27:45.528 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:27:45 compute-1 podman[231723]: 2026-01-26 09:27:45.811970978 +0000 UTC m=+0.064811192 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 09:27:50 compute-1 nova_compute[183083]: 2026-01-26 09:27:50.348 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:27:50 compute-1 nova_compute[183083]: 2026-01-26 09:27:50.350 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:27:50 compute-1 nova_compute[183083]: 2026-01-26 09:27:50.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:27:50 compute-1 nova_compute[183083]: 2026-01-26 09:27:50.951 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 26 09:27:53 compute-1 nova_compute[183083]: 2026-01-26 09:27:53.963 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:27:53 compute-1 nova_compute[183083]: 2026-01-26 09:27:53.988 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:27:53 compute-1 nova_compute[183083]: 2026-01-26 09:27:53.988 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:27:53 compute-1 nova_compute[183083]: 2026-01-26 09:27:53.989 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:27:53 compute-1 nova_compute[183083]: 2026-01-26 09:27:53.989 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:27:54 compute-1 nova_compute[183083]: 2026-01-26 09:27:54.169 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:27:54 compute-1 nova_compute[183083]: 2026-01-26 09:27:54.170 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13639MB free_disk=113.08374786376953GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:27:54 compute-1 nova_compute[183083]: 2026-01-26 09:27:54.170 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:27:54 compute-1 nova_compute[183083]: 2026-01-26 09:27:54.171 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:27:54 compute-1 nova_compute[183083]: 2026-01-26 09:27:54.338 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:27:54 compute-1 nova_compute[183083]: 2026-01-26 09:27:54.338 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:27:54 compute-1 nova_compute[183083]: 2026-01-26 09:27:54.363 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:27:54 compute-1 nova_compute[183083]: 2026-01-26 09:27:54.378 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:27:54 compute-1 nova_compute[183083]: 2026-01-26 09:27:54.396 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:27:54 compute-1 nova_compute[183083]: 2026-01-26 09:27:54.396 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.226s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:27:55 compute-1 nova_compute[183083]: 2026-01-26 09:27:55.349 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:27:55 compute-1 nova_compute[183083]: 2026-01-26 09:27:55.350 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:27:55 compute-1 nova_compute[183083]: 2026-01-26 09:27:55.351 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:27:55 compute-1 nova_compute[183083]: 2026-01-26 09:27:55.351 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:27:55 compute-1 nova_compute[183083]: 2026-01-26 09:27:55.351 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:27:55 compute-1 nova_compute[183083]: 2026-01-26 09:27:55.352 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:28:00 compute-1 nova_compute[183083]: 2026-01-26 09:28:00.353 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:28:00 compute-1 nova_compute[183083]: 2026-01-26 09:28:00.355 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:28:00 compute-1 nova_compute[183083]: 2026-01-26 09:28:00.355 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:28:00 compute-1 nova_compute[183083]: 2026-01-26 09:28:00.355 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:28:00 compute-1 nova_compute[183083]: 2026-01-26 09:28:00.396 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:28:00 compute-1 nova_compute[183083]: 2026-01-26 09:28:00.397 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:28:01 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:28:01.644 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=46, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=45) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:28:01 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:28:01.645 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:28:01 compute-1 nova_compute[183083]: 2026-01-26 09:28:01.646 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:28:02 compute-1 sshd-session[231747]: Invalid user sol from 2.57.122.238 port 52224
Jan 26 09:28:02 compute-1 podman[231752]: 2026-01-26 09:28:02.826779223 +0000 UTC m=+0.089006306 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:28:02 compute-1 sshd-session[231747]: Connection closed by invalid user sol 2.57.122.238 port 52224 [preauth]
Jan 26 09:28:02 compute-1 podman[231749]: 2026-01-26 09:28:02.832190276 +0000 UTC m=+0.101703024 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:28:02 compute-1 podman[231753]: 2026-01-26 09:28:02.843190687 +0000 UTC m=+0.105354488 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 09:28:02 compute-1 podman[231750]: 2026-01-26 09:28:02.849302169 +0000 UTC m=+0.119031934 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Jan 26 09:28:02 compute-1 podman[231751]: 2026-01-26 09:28:02.862214064 +0000 UTC m=+0.118049416 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, vcs-type=git, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped 
down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible)
Jan 26 09:28:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:28:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:28:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:28:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:28:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:28:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:28:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:28:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:28:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:28:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:28:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:28:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:28:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:28:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:28:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:28:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:28:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:28:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:28:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:28:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:28:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:28:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:28:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:28:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:28:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:28:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:28:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:28:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:28:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:28:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:28:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:28:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:28:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:28:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:28:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:28:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:28:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:28:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:28:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:28:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:28:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:28:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:28:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:28:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:28:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:28:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:28:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:28:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:28:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:28:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:28:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:28:05.348 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:28:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:28:05.349 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:28:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:28:05.349 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:28:05 compute-1 nova_compute[183083]: 2026-01-26 09:28:05.397 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:28:05 compute-1 nova_compute[183083]: 2026-01-26 09:28:05.399 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:28:07 compute-1 nova_compute[183083]: 2026-01-26 09:28:07.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:28:08 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:28:08.648 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '46'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:28:08 compute-1 ovn_controller[95352]: 2026-01-26T09:28:08Z|00388|pinctrl|WARN|Dropped 311 log messages in last 64 seconds (most recently, 6 seconds ago) due to excessive rate
Jan 26 09:28:08 compute-1 ovn_controller[95352]: 2026-01-26T09:28:08Z|00389|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:28:10 compute-1 nova_compute[183083]: 2026-01-26 09:28:10.400 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:28:10 compute-1 nova_compute[183083]: 2026-01-26 09:28:10.401 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:28:10 compute-1 nova_compute[183083]: 2026-01-26 09:28:10.401 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:28:10 compute-1 nova_compute[183083]: 2026-01-26 09:28:10.401 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:28:10 compute-1 nova_compute[183083]: 2026-01-26 09:28:10.402 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:28:10 compute-1 nova_compute[183083]: 2026-01-26 09:28:10.403 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:28:11 compute-1 sshd-session[231856]: Accepted publickey for zuul from 38.102.83.66 port 36156 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:28:11 compute-1 systemd-logind[788]: New session 145 of user zuul.
Jan 26 09:28:11 compute-1 systemd[1]: Started Session 145 of User zuul.
Jan 26 09:28:11 compute-1 sshd-session[231856]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:28:12 compute-1 sshd-session[231859]: Connection closed by 38.102.83.66 port 36156
Jan 26 09:28:12 compute-1 sshd-session[231856]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:28:12 compute-1 systemd[1]: session-145.scope: Deactivated successfully.
Jan 26 09:28:12 compute-1 systemd-logind[788]: Session 145 logged out. Waiting for processes to exit.
Jan 26 09:28:12 compute-1 systemd-logind[788]: Removed session 145.
Jan 26 09:28:13 compute-1 sshd-session[231883]: Connection closed by authenticating user root 178.62.249.31 port 55162 [preauth]
Jan 26 09:28:15 compute-1 nova_compute[183083]: 2026-01-26 09:28:15.403 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:28:15 compute-1 nova_compute[183083]: 2026-01-26 09:28:15.406 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:28:15 compute-1 nova_compute[183083]: 2026-01-26 09:28:15.406 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:28:15 compute-1 nova_compute[183083]: 2026-01-26 09:28:15.406 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:28:15 compute-1 nova_compute[183083]: 2026-01-26 09:28:15.468 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:28:15 compute-1 nova_compute[183083]: 2026-01-26 09:28:15.469 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:28:16 compute-1 podman[231885]: 2026-01-26 09:28:16.795419145 +0000 UTC m=+0.056851178 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 26 09:28:20 compute-1 nova_compute[183083]: 2026-01-26 09:28:20.470 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:28:20 compute-1 nova_compute[183083]: 2026-01-26 09:28:20.472 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:28:20 compute-1 nova_compute[183083]: 2026-01-26 09:28:20.473 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:28:20 compute-1 nova_compute[183083]: 2026-01-26 09:28:20.473 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:28:20 compute-1 nova_compute[183083]: 2026-01-26 09:28:20.522 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:28:20 compute-1 nova_compute[183083]: 2026-01-26 09:28:20.523 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:28:25 compute-1 nova_compute[183083]: 2026-01-26 09:28:25.524 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:28:30 compute-1 nova_compute[183083]: 2026-01-26 09:28:30.570 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:28:31 compute-1 sshd-session[231910]: Accepted publickey for zuul from 38.102.83.66 port 35934 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:28:31 compute-1 systemd-logind[788]: New session 146 of user zuul.
Jan 26 09:28:31 compute-1 systemd[1]: Started Session 146 of User zuul.
Jan 26 09:28:31 compute-1 sshd-session[231910]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:28:31 compute-1 sshd-session[231913]: Connection closed by 38.102.83.66 port 35934
Jan 26 09:28:31 compute-1 sshd-session[231910]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:28:31 compute-1 systemd[1]: session-146.scope: Deactivated successfully.
Jan 26 09:28:31 compute-1 systemd-logind[788]: Session 146 logged out. Waiting for processes to exit.
Jan 26 09:28:31 compute-1 systemd-logind[788]: Removed session 146.
Jan 26 09:28:33 compute-1 podman[231939]: 2026-01-26 09:28:33.817525985 +0000 UTC m=+0.072606082 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, distribution-scope=public, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vcs-type=git)
Jan 26 09:28:33 compute-1 podman[231938]: 2026-01-26 09:28:33.831282294 +0000 UTC m=+0.078220431 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute)
Jan 26 09:28:33 compute-1 podman[231940]: 2026-01-26 09:28:33.835428731 +0000 UTC m=+0.087988037 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 26 09:28:33 compute-1 podman[231941]: 2026-01-26 09:28:33.835495773 +0000 UTC m=+0.082117631 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 09:28:33 compute-1 podman[231937]: 2026-01-26 09:28:33.844191649 +0000 UTC m=+0.105906444 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 26 09:28:34 compute-1 nova_compute[183083]: 2026-01-26 09:28:34.020 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:28:34 compute-1 nova_compute[183083]: 2026-01-26 09:28:34.020 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:28:34 compute-1 nova_compute[183083]: 2026-01-26 09:28:34.021 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:28:34 compute-1 nova_compute[183083]: 2026-01-26 09:28:34.043 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 09:28:35 compute-1 nova_compute[183083]: 2026-01-26 09:28:35.573 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:28:35 compute-1 nova_compute[183083]: 2026-01-26 09:28:35.575 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:28:35 compute-1 nova_compute[183083]: 2026-01-26 09:28:35.575 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:28:35 compute-1 nova_compute[183083]: 2026-01-26 09:28:35.575 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:28:35 compute-1 nova_compute[183083]: 2026-01-26 09:28:35.593 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:28:35 compute-1 nova_compute[183083]: 2026-01-26 09:28:35.594 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:28:36 compute-1 nova_compute[183083]: 2026-01-26 09:28:36.970 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:28:38 compute-1 nova_compute[183083]: 2026-01-26 09:28:38.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:28:39 compute-1 nova_compute[183083]: 2026-01-26 09:28:39.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:28:40 compute-1 nova_compute[183083]: 2026-01-26 09:28:40.595 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:28:40 compute-1 nova_compute[183083]: 2026-01-26 09:28:40.597 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:28:40 compute-1 nova_compute[183083]: 2026-01-26 09:28:40.597 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:28:40 compute-1 nova_compute[183083]: 2026-01-26 09:28:40.597 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:28:40 compute-1 nova_compute[183083]: 2026-01-26 09:28:40.632 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:28:40 compute-1 nova_compute[183083]: 2026-01-26 09:28:40.634 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:28:40 compute-1 nova_compute[183083]: 2026-01-26 09:28:40.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:28:42 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:28:42.165 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=47, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=46) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:28:42 compute-1 nova_compute[183083]: 2026-01-26 09:28:42.165 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:28:42 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:28:42.167 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:28:43 compute-1 nova_compute[183083]: 2026-01-26 09:28:43.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:28:43 compute-1 nova_compute[183083]: 2026-01-26 09:28:43.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:28:44 compute-1 nova_compute[183083]: 2026-01-26 09:28:44.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:28:44 compute-1 nova_compute[183083]: 2026-01-26 09:28:44.951 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:28:45 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:28:45.169 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '47'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:28:45 compute-1 nova_compute[183083]: 2026-01-26 09:28:45.673 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:28:47 compute-1 podman[232042]: 2026-01-26 09:28:47.846702816 +0000 UTC m=+0.104189574 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 26 09:28:50 compute-1 nova_compute[183083]: 2026-01-26 09:28:50.676 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:28:50 compute-1 nova_compute[183083]: 2026-01-26 09:28:50.678 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:28:50 compute-1 nova_compute[183083]: 2026-01-26 09:28:50.678 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:28:50 compute-1 nova_compute[183083]: 2026-01-26 09:28:50.678 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:28:50 compute-1 nova_compute[183083]: 2026-01-26 09:28:50.716 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:28:50 compute-1 nova_compute[183083]: 2026-01-26 09:28:50.717 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:28:51 compute-1 sshd-session[232066]: Accepted publickey for zuul from 38.102.83.66 port 58074 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:28:51 compute-1 systemd-logind[788]: New session 147 of user zuul.
Jan 26 09:28:51 compute-1 systemd[1]: Started Session 147 of User zuul.
Jan 26 09:28:51 compute-1 sshd-session[232066]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:28:51 compute-1 sshd-session[232069]: Connection closed by 38.102.83.66 port 58074
Jan 26 09:28:51 compute-1 sshd-session[232066]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:28:51 compute-1 systemd[1]: session-147.scope: Deactivated successfully.
Jan 26 09:28:51 compute-1 systemd-logind[788]: Session 147 logged out. Waiting for processes to exit.
Jan 26 09:28:51 compute-1 systemd-logind[788]: Removed session 147.
Jan 26 09:28:53 compute-1 nova_compute[183083]: 2026-01-26 09:28:53.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:28:53 compute-1 nova_compute[183083]: 2026-01-26 09:28:53.985 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:28:53 compute-1 nova_compute[183083]: 2026-01-26 09:28:53.985 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:28:53 compute-1 nova_compute[183083]: 2026-01-26 09:28:53.985 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:28:53 compute-1 nova_compute[183083]: 2026-01-26 09:28:53.986 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:28:54 compute-1 nova_compute[183083]: 2026-01-26 09:28:54.126 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:28:54 compute-1 nova_compute[183083]: 2026-01-26 09:28:54.127 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13645MB free_disk=113.08374786376953GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:28:54 compute-1 nova_compute[183083]: 2026-01-26 09:28:54.127 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:28:54 compute-1 nova_compute[183083]: 2026-01-26 09:28:54.128 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:28:54 compute-1 nova_compute[183083]: 2026-01-26 09:28:54.369 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:28:54 compute-1 nova_compute[183083]: 2026-01-26 09:28:54.370 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:28:54 compute-1 nova_compute[183083]: 2026-01-26 09:28:54.393 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:28:54 compute-1 nova_compute[183083]: 2026-01-26 09:28:54.428 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:28:54 compute-1 nova_compute[183083]: 2026-01-26 09:28:54.430 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:28:54 compute-1 nova_compute[183083]: 2026-01-26 09:28:54.431 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.303s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:28:55 compute-1 nova_compute[183083]: 2026-01-26 09:28:55.719 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:28:55 compute-1 nova_compute[183083]: 2026-01-26 09:28:55.721 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:28:55 compute-1 nova_compute[183083]: 2026-01-26 09:28:55.721 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:28:55 compute-1 nova_compute[183083]: 2026-01-26 09:28:55.721 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:28:55 compute-1 nova_compute[183083]: 2026-01-26 09:28:55.767 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:28:55 compute-1 nova_compute[183083]: 2026-01-26 09:28:55.768 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:28:58 compute-1 sshd-session[232093]: Connection closed by authenticating user root 178.62.249.31 port 38240 [preauth]
Jan 26 09:29:00 compute-1 nova_compute[183083]: 2026-01-26 09:29:00.770 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:29:00 compute-1 nova_compute[183083]: 2026-01-26 09:29:00.772 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:29:00 compute-1 nova_compute[183083]: 2026-01-26 09:29:00.772 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:29:00 compute-1 nova_compute[183083]: 2026-01-26 09:29:00.773 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:29:00 compute-1 nova_compute[183083]: 2026-01-26 09:29:00.822 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:29:00 compute-1 nova_compute[183083]: 2026-01-26 09:29:00.823 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:29:04 compute-1 podman[232098]: 2026-01-26 09:29:04.806047794 +0000 UTC m=+0.063209876 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 26 09:29:04 compute-1 ovn_controller[95352]: 2026-01-26T09:29:04Z|00390|pinctrl|WARN|Dropped 195 log messages in last 56 seconds (most recently, 3 seconds ago) due to excessive rate
Jan 26 09:29:04 compute-1 ovn_controller[95352]: 2026-01-26T09:29:04Z|00391|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:29:04 compute-1 podman[232096]: 2026-01-26 09:29:04.81155443 +0000 UTC m=+0.075549675 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 26 09:29:04 compute-1 podman[232104]: 2026-01-26 09:29:04.812148057 +0000 UTC m=+0.063956097 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 09:29:04 compute-1 podman[232095]: 2026-01-26 09:29:04.837898814 +0000 UTC m=+0.106221951 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 09:29:04 compute-1 podman[232097]: 2026-01-26 09:29:04.837947306 +0000 UTC m=+0.099525652 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.buildah.version=1.33.7, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=)
Jan 26 09:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:29:05.350 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:29:05.350 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:29:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:29:05.351 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:29:05 compute-1 nova_compute[183083]: 2026-01-26 09:29:05.823 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:29:05 compute-1 nova_compute[183083]: 2026-01-26 09:29:05.824 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:29:05 compute-1 nova_compute[183083]: 2026-01-26 09:29:05.824 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:29:05 compute-1 nova_compute[183083]: 2026-01-26 09:29:05.824 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:29:05 compute-1 nova_compute[183083]: 2026-01-26 09:29:05.825 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:29:05 compute-1 nova_compute[183083]: 2026-01-26 09:29:05.826 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:29:10 compute-1 nova_compute[183083]: 2026-01-26 09:29:10.828 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:29:10 compute-1 nova_compute[183083]: 2026-01-26 09:29:10.831 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:29:10 compute-1 nova_compute[183083]: 2026-01-26 09:29:10.831 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:29:10 compute-1 nova_compute[183083]: 2026-01-26 09:29:10.831 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:29:10 compute-1 nova_compute[183083]: 2026-01-26 09:29:10.868 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:29:10 compute-1 nova_compute[183083]: 2026-01-26 09:29:10.870 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:29:15 compute-1 nova_compute[183083]: 2026-01-26 09:29:15.871 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:29:15 compute-1 nova_compute[183083]: 2026-01-26 09:29:15.873 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:29:18 compute-1 podman[232197]: 2026-01-26 09:29:18.811462046 +0000 UTC m=+0.065576414 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 26 09:29:20 compute-1 nova_compute[183083]: 2026-01-26 09:29:20.872 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:29:20 compute-1 nova_compute[183083]: 2026-01-26 09:29:20.874 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:29:20 compute-1 nova_compute[183083]: 2026-01-26 09:29:20.874 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:29:20 compute-1 nova_compute[183083]: 2026-01-26 09:29:20.874 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:29:20 compute-1 nova_compute[183083]: 2026-01-26 09:29:20.912 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:29:20 compute-1 nova_compute[183083]: 2026-01-26 09:29:20.913 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:29:25 compute-1 nova_compute[183083]: 2026-01-26 09:29:25.913 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:29:25 compute-1 nova_compute[183083]: 2026-01-26 09:29:25.915 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:29:25 compute-1 nova_compute[183083]: 2026-01-26 09:29:25.916 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:29:25 compute-1 nova_compute[183083]: 2026-01-26 09:29:25.916 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:29:25 compute-1 nova_compute[183083]: 2026-01-26 09:29:25.953 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:29:25 compute-1 nova_compute[183083]: 2026-01-26 09:29:25.954 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:29:30 compute-1 nova_compute[183083]: 2026-01-26 09:29:30.955 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:29:30 compute-1 nova_compute[183083]: 2026-01-26 09:29:30.957 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:29:30 compute-1 nova_compute[183083]: 2026-01-26 09:29:30.958 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:29:30 compute-1 nova_compute[183083]: 2026-01-26 09:29:30.958 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:29:30 compute-1 nova_compute[183083]: 2026-01-26 09:29:30.993 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:29:30 compute-1 nova_compute[183083]: 2026-01-26 09:29:30.993 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:29:35 compute-1 nova_compute[183083]: 2026-01-26 09:29:35.432 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:29:35 compute-1 nova_compute[183083]: 2026-01-26 09:29:35.432 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:29:35 compute-1 nova_compute[183083]: 2026-01-26 09:29:35.433 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:29:35 compute-1 nova_compute[183083]: 2026-01-26 09:29:35.451 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 09:29:35 compute-1 podman[232236]: 2026-01-26 09:29:35.807186662 +0000 UTC m=+0.054901682 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 09:29:35 compute-1 podman[232224]: 2026-01-26 09:29:35.808730586 +0000 UTC m=+0.066523180 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vcs-type=git, config_id=openstack_network_exporter, version=9.6, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, architecture=x86_64, io.buildah.version=1.33.7)
Jan 26 09:29:35 compute-1 podman[232230]: 2026-01-26 09:29:35.810835915 +0000 UTC m=+0.060877231 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 26 09:29:35 compute-1 podman[232223]: 2026-01-26 09:29:35.839632999 +0000 UTC m=+0.100469689 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 26 09:29:35 compute-1 podman[232222]: 2026-01-26 09:29:35.861360183 +0000 UTC m=+0.127995797 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 26 09:29:35 compute-1 nova_compute[183083]: 2026-01-26 09:29:35.994 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:29:35 compute-1 nova_compute[183083]: 2026-01-26 09:29:35.995 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:29:36 compute-1 nova_compute[183083]: 2026-01-26 09:29:36.966 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:29:37 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:29:37.657 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=48, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=47) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:29:37 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:29:37.659 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:29:37 compute-1 nova_compute[183083]: 2026-01-26 09:29:37.658 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:29:39 compute-1 nova_compute[183083]: 2026-01-26 09:29:39.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:29:40 compute-1 nova_compute[183083]: 2026-01-26 09:29:40.953 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:29:40 compute-1 nova_compute[183083]: 2026-01-26 09:29:40.954 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:29:40 compute-1 nova_compute[183083]: 2026-01-26 09:29:40.998 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:29:43 compute-1 nova_compute[183083]: 2026-01-26 09:29:43.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:29:44 compute-1 nova_compute[183083]: 2026-01-26 09:29:44.947 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:29:44 compute-1 nova_compute[183083]: 2026-01-26 09:29:44.962 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:29:45 compute-1 sshd-session[232325]: Connection closed by authenticating user root 178.62.249.31 port 47656 [preauth]
Jan 26 09:29:45 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:29:45.663 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '48'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:29:45 compute-1 nova_compute[183083]: 2026-01-26 09:29:45.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:29:45 compute-1 nova_compute[183083]: 2026-01-26 09:29:45.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:29:46 compute-1 nova_compute[183083]: 2026-01-26 09:29:45.999 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:29:46 compute-1 sshd-session[232327]: Accepted publickey for zuul from 38.102.83.66 port 51826 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:29:46 compute-1 systemd-logind[788]: New session 148 of user zuul.
Jan 26 09:29:46 compute-1 systemd[1]: Started Session 148 of User zuul.
Jan 26 09:29:46 compute-1 sshd-session[232327]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:29:46 compute-1 sshd-session[232330]: Connection closed by 38.102.83.66 port 51826
Jan 26 09:29:46 compute-1 sshd-session[232327]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:29:46 compute-1 systemd[1]: session-148.scope: Deactivated successfully.
Jan 26 09:29:46 compute-1 systemd-logind[788]: Session 148 logged out. Waiting for processes to exit.
Jan 26 09:29:46 compute-1 systemd-logind[788]: Removed session 148.
Jan 26 09:29:49 compute-1 podman[232354]: 2026-01-26 09:29:49.824560839 +0000 UTC m=+0.082632006 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 26 09:29:51 compute-1 nova_compute[183083]: 2026-01-26 09:29:51.002 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:29:53 compute-1 nova_compute[183083]: 2026-01-26 09:29:53.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:29:53 compute-1 nova_compute[183083]: 2026-01-26 09:29:53.973 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:29:53 compute-1 nova_compute[183083]: 2026-01-26 09:29:53.974 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:29:53 compute-1 nova_compute[183083]: 2026-01-26 09:29:53.974 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:29:53 compute-1 nova_compute[183083]: 2026-01-26 09:29:53.974 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:29:54 compute-1 nova_compute[183083]: 2026-01-26 09:29:54.187 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:29:54 compute-1 nova_compute[183083]: 2026-01-26 09:29:54.189 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13646MB free_disk=113.08374786376953GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:29:54 compute-1 nova_compute[183083]: 2026-01-26 09:29:54.189 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:29:54 compute-1 nova_compute[183083]: 2026-01-26 09:29:54.189 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:29:54 compute-1 nova_compute[183083]: 2026-01-26 09:29:54.255 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:29:54 compute-1 nova_compute[183083]: 2026-01-26 09:29:54.256 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:29:54 compute-1 nova_compute[183083]: 2026-01-26 09:29:54.286 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:29:54 compute-1 nova_compute[183083]: 2026-01-26 09:29:54.299 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:29:54 compute-1 nova_compute[183083]: 2026-01-26 09:29:54.302 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:29:54 compute-1 nova_compute[183083]: 2026-01-26 09:29:54.302 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:29:56 compute-1 nova_compute[183083]: 2026-01-26 09:29:56.005 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:30:01 compute-1 nova_compute[183083]: 2026-01-26 09:30:01.008 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:30:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:30:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:30:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:30:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:30:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:30:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:30:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:30:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:30:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:30:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:30:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:30:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:30:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:30:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:30:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:30:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:30:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:30:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:30:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:30:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:30:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:30:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:30:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:30:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:30:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:30:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:30:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:30:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:30:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:30:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:30:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:30:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:30:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:30:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:30:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:30:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:30:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:30:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:30:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:30:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:30:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:30:03.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:30:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:30:03.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:30:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:30:03.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:30:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:30:03.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:30:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:30:03.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:30:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:30:05.351 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:30:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:30:05.352 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:30:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:30:05.352 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:30:06 compute-1 nova_compute[183083]: 2026-01-26 09:30:06.010 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:30:06 compute-1 ovn_controller[95352]: 2026-01-26T09:30:06Z|00392|pinctrl|WARN|Dropped 279 log messages in last 62 seconds (most recently, 8 seconds ago) due to excessive rate
Jan 26 09:30:06 compute-1 ovn_controller[95352]: 2026-01-26T09:30:06Z|00393|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:30:06 compute-1 podman[232385]: 2026-01-26 09:30:06.830278998 +0000 UTC m=+0.075214696 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 09:30:06 compute-1 podman[232393]: 2026-01-26 09:30:06.83037278 +0000 UTC m=+0.068356812 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 09:30:06 compute-1 podman[232381]: 2026-01-26 09:30:06.850570681 +0000 UTC m=+0.103323940 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, container_name=openstack_network_exporter, config_id=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, distribution-scope=public, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, release=1755695350)
Jan 26 09:30:06 compute-1 podman[232380]: 2026-01-26 09:30:06.858480454 +0000 UTC m=+0.105153531 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute)
Jan 26 09:30:06 compute-1 podman[232379]: 2026-01-26 09:30:06.884299004 +0000 UTC m=+0.142602070 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 09:30:09 compute-1 sshd-session[232485]: Invalid user stradal from 2.57.122.238 port 46536
Jan 26 09:30:09 compute-1 sshd-session[232485]: Connection closed by invalid user stradal 2.57.122.238 port 46536 [preauth]
Jan 26 09:30:11 compute-1 nova_compute[183083]: 2026-01-26 09:30:11.012 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:30:11 compute-1 nova_compute[183083]: 2026-01-26 09:30:11.015 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:30:11 compute-1 nova_compute[183083]: 2026-01-26 09:30:11.015 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:30:11 compute-1 nova_compute[183083]: 2026-01-26 09:30:11.015 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:30:11 compute-1 nova_compute[183083]: 2026-01-26 09:30:11.034 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:30:11 compute-1 nova_compute[183083]: 2026-01-26 09:30:11.034 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:30:16 compute-1 nova_compute[183083]: 2026-01-26 09:30:16.035 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:30:16 compute-1 nova_compute[183083]: 2026-01-26 09:30:16.037 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:30:20 compute-1 podman[232487]: 2026-01-26 09:30:20.850836076 +0000 UTC m=+0.100679225 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 26 09:30:21 compute-1 nova_compute[183083]: 2026-01-26 09:30:21.036 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:30:21 compute-1 nova_compute[183083]: 2026-01-26 09:30:21.038 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:30:21 compute-1 nova_compute[183083]: 2026-01-26 09:30:21.038 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:30:21 compute-1 nova_compute[183083]: 2026-01-26 09:30:21.038 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:30:21 compute-1 nova_compute[183083]: 2026-01-26 09:30:21.107 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:30:21 compute-1 nova_compute[183083]: 2026-01-26 09:30:21.108 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:30:26 compute-1 nova_compute[183083]: 2026-01-26 09:30:26.109 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:30:28 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:30:28.717 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=49, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=48) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:30:28 compute-1 nova_compute[183083]: 2026-01-26 09:30:28.718 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:30:28 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:30:28.720 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:30:31 compute-1 nova_compute[183083]: 2026-01-26 09:30:31.162 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:30:31 compute-1 sshd-session[232511]: Connection closed by authenticating user root 178.62.249.31 port 39444 [preauth]
Jan 26 09:30:32 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:30:32.723 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '49'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:30:35 compute-1 nova_compute[183083]: 2026-01-26 09:30:35.304 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:30:35 compute-1 nova_compute[183083]: 2026-01-26 09:30:35.304 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:30:35 compute-1 nova_compute[183083]: 2026-01-26 09:30:35.305 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:30:35 compute-1 nova_compute[183083]: 2026-01-26 09:30:35.348 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 09:30:36 compute-1 nova_compute[183083]: 2026-01-26 09:30:36.163 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:30:37 compute-1 podman[232514]: 2026-01-26 09:30:37.82736273 +0000 UTC m=+0.085906168 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 09:30:37 compute-1 podman[232526]: 2026-01-26 09:30:37.843920197 +0000 UTC m=+0.075465123 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 09:30:37 compute-1 podman[232520]: 2026-01-26 09:30:37.846776358 +0000 UTC m=+0.088612684 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 09:30:37 compute-1 podman[232513]: 2026-01-26 09:30:37.863749287 +0000 UTC m=+0.118585871 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 09:30:37 compute-1 podman[232515]: 2026-01-26 09:30:37.865830196 +0000 UTC m=+0.107483087 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 09:30:37 compute-1 nova_compute[183083]: 2026-01-26 09:30:37.991 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:30:39 compute-1 nova_compute[183083]: 2026-01-26 09:30:39.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:30:41 compute-1 nova_compute[183083]: 2026-01-26 09:30:41.166 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:30:41 compute-1 nova_compute[183083]: 2026-01-26 09:30:41.213 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:30:41 compute-1 nova_compute[183083]: 2026-01-26 09:30:41.214 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5050 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:30:41 compute-1 nova_compute[183083]: 2026-01-26 09:30:41.215 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:30:41 compute-1 nova_compute[183083]: 2026-01-26 09:30:41.216 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:30:41 compute-1 nova_compute[183083]: 2026-01-26 09:30:41.217 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:30:41 compute-1 nova_compute[183083]: 2026-01-26 09:30:41.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:30:42 compute-1 nova_compute[183083]: 2026-01-26 09:30:42.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:30:45 compute-1 nova_compute[183083]: 2026-01-26 09:30:45.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:30:46 compute-1 nova_compute[183083]: 2026-01-26 09:30:46.216 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:30:46 compute-1 nova_compute[183083]: 2026-01-26 09:30:46.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:30:47 compute-1 nova_compute[183083]: 2026-01-26 09:30:47.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:30:47 compute-1 nova_compute[183083]: 2026-01-26 09:30:47.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:30:51 compute-1 nova_compute[183083]: 2026-01-26 09:30:51.220 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:30:51 compute-1 nova_compute[183083]: 2026-01-26 09:30:51.222 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:30:51 compute-1 nova_compute[183083]: 2026-01-26 09:30:51.222 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:30:51 compute-1 nova_compute[183083]: 2026-01-26 09:30:51.222 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:30:51 compute-1 nova_compute[183083]: 2026-01-26 09:30:51.260 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:30:51 compute-1 nova_compute[183083]: 2026-01-26 09:30:51.261 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:30:51 compute-1 podman[232619]: 2026-01-26 09:30:51.833213031 +0000 UTC m=+0.089778787 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 26 09:30:55 compute-1 nova_compute[183083]: 2026-01-26 09:30:55.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:30:55 compute-1 nova_compute[183083]: 2026-01-26 09:30:55.988 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:30:55 compute-1 nova_compute[183083]: 2026-01-26 09:30:55.989 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:30:55 compute-1 nova_compute[183083]: 2026-01-26 09:30:55.989 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:30:55 compute-1 nova_compute[183083]: 2026-01-26 09:30:55.989 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:30:56 compute-1 nova_compute[183083]: 2026-01-26 09:30:56.186 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:30:56 compute-1 nova_compute[183083]: 2026-01-26 09:30:56.187 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13647MB free_disk=113.08374786376953GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:30:56 compute-1 nova_compute[183083]: 2026-01-26 09:30:56.187 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:30:56 compute-1 nova_compute[183083]: 2026-01-26 09:30:56.187 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:30:56 compute-1 nova_compute[183083]: 2026-01-26 09:30:56.245 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:30:56 compute-1 nova_compute[183083]: 2026-01-26 09:30:56.245 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:30:56 compute-1 nova_compute[183083]: 2026-01-26 09:30:56.263 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Refreshing inventories for resource provider 5203935e-446c-4e03-93fa-4c60d651e045 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 26 09:30:56 compute-1 nova_compute[183083]: 2026-01-26 09:30:56.265 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:30:56 compute-1 nova_compute[183083]: 2026-01-26 09:30:56.266 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:30:56 compute-1 nova_compute[183083]: 2026-01-26 09:30:56.266 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:30:56 compute-1 nova_compute[183083]: 2026-01-26 09:30:56.266 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:30:56 compute-1 nova_compute[183083]: 2026-01-26 09:30:56.318 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:30:56 compute-1 nova_compute[183083]: 2026-01-26 09:30:56.318 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:30:56 compute-1 nova_compute[183083]: 2026-01-26 09:30:56.467 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Updating ProviderTree inventory for provider 5203935e-446c-4e03-93fa-4c60d651e045 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 26 09:30:56 compute-1 nova_compute[183083]: 2026-01-26 09:30:56.467 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Updating inventory in ProviderTree for provider 5203935e-446c-4e03-93fa-4c60d651e045 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 26 09:30:56 compute-1 nova_compute[183083]: 2026-01-26 09:30:56.482 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Refreshing aggregate associations for resource provider 5203935e-446c-4e03-93fa-4c60d651e045, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 26 09:30:56 compute-1 nova_compute[183083]: 2026-01-26 09:30:56.512 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Refreshing trait associations for resource provider 5203935e-446c-4e03-93fa-4c60d651e045, traits: COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE4A,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SVM,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AVX,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX2,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_FMA3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,HW_CPU_X86_SSE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE41,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_BMI,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AESNI,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 26 09:30:56 compute-1 nova_compute[183083]: 2026-01-26 09:30:56.543 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:30:56 compute-1 nova_compute[183083]: 2026-01-26 09:30:56.568 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:30:56 compute-1 nova_compute[183083]: 2026-01-26 09:30:56.571 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:30:56 compute-1 nova_compute[183083]: 2026-01-26 09:30:56.571 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.384s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:31:01 compute-1 nova_compute[183083]: 2026-01-26 09:31:01.319 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:31:01 compute-1 nova_compute[183083]: 2026-01-26 09:31:01.322 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:31:01 compute-1 nova_compute[183083]: 2026-01-26 09:31:01.322 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:31:01 compute-1 nova_compute[183083]: 2026-01-26 09:31:01.322 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:31:01 compute-1 nova_compute[183083]: 2026-01-26 09:31:01.342 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:31:01 compute-1 nova_compute[183083]: 2026-01-26 09:31:01.343 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:31:03 compute-1 ovn_controller[95352]: 2026-01-26T09:31:03Z|00394|pinctrl|WARN|Dropped 193 log messages in last 57 seconds (most recently, 1 seconds ago) due to excessive rate
Jan 26 09:31:03 compute-1 ovn_controller[95352]: 2026-01-26T09:31:03Z|00395|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:31:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:31:05.352 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:31:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:31:05.353 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:31:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:31:05.353 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:31:06 compute-1 nova_compute[183083]: 2026-01-26 09:31:06.343 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:31:06 compute-1 nova_compute[183083]: 2026-01-26 09:31:06.345 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:31:06 compute-1 nova_compute[183083]: 2026-01-26 09:31:06.345 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:31:06 compute-1 nova_compute[183083]: 2026-01-26 09:31:06.346 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:31:06 compute-1 nova_compute[183083]: 2026-01-26 09:31:06.388 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:31:06 compute-1 nova_compute[183083]: 2026-01-26 09:31:06.389 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:31:08 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:31:08.706 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=50, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=49) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:31:08 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:31:08.707 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:31:08 compute-1 nova_compute[183083]: 2026-01-26 09:31:08.709 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:31:08 compute-1 podman[232644]: 2026-01-26 09:31:08.852355969 +0000 UTC m=+0.099454741 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Jan 26 09:31:08 compute-1 podman[232651]: 2026-01-26 09:31:08.852836843 +0000 UTC m=+0.093434221 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 26 09:31:08 compute-1 podman[232645]: 2026-01-26 09:31:08.864164233 +0000 UTC m=+0.102102766 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container)
Jan 26 09:31:08 compute-1 podman[232643]: 2026-01-26 09:31:08.874053882 +0000 UTC m=+0.132203986 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 09:31:08 compute-1 podman[232652]: 2026-01-26 09:31:08.886028801 +0000 UTC m=+0.114390413 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 09:31:11 compute-1 nova_compute[183083]: 2026-01-26 09:31:11.425 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:31:16 compute-1 nova_compute[183083]: 2026-01-26 09:31:16.427 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:31:16 compute-1 nova_compute[183083]: 2026-01-26 09:31:16.439 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:31:17 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:31:17.712 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '50'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:31:19 compute-1 sshd-session[232749]: Connection closed by authenticating user root 178.62.249.31 port 44832 [preauth]
Jan 26 09:31:21 compute-1 nova_compute[183083]: 2026-01-26 09:31:21.440 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:31:21 compute-1 nova_compute[183083]: 2026-01-26 09:31:21.442 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:31:21 compute-1 nova_compute[183083]: 2026-01-26 09:31:21.442 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:31:21 compute-1 nova_compute[183083]: 2026-01-26 09:31:21.442 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:31:21 compute-1 nova_compute[183083]: 2026-01-26 09:31:21.480 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:31:21 compute-1 nova_compute[183083]: 2026-01-26 09:31:21.481 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:31:22 compute-1 podman[232751]: 2026-01-26 09:31:22.811717578 +0000 UTC m=+0.077067499 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 26 09:31:26 compute-1 nova_compute[183083]: 2026-01-26 09:31:26.483 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:31:26 compute-1 nova_compute[183083]: 2026-01-26 09:31:26.484 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:31:26 compute-1 nova_compute[183083]: 2026-01-26 09:31:26.484 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:31:26 compute-1 nova_compute[183083]: 2026-01-26 09:31:26.485 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:31:26 compute-1 nova_compute[183083]: 2026-01-26 09:31:26.541 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:31:26 compute-1 nova_compute[183083]: 2026-01-26 09:31:26.542 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:31:31 compute-1 nova_compute[183083]: 2026-01-26 09:31:31.543 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:31:31 compute-1 nova_compute[183083]: 2026-01-26 09:31:31.544 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:31:31 compute-1 nova_compute[183083]: 2026-01-26 09:31:31.544 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:31:31 compute-1 nova_compute[183083]: 2026-01-26 09:31:31.545 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:31:31 compute-1 nova_compute[183083]: 2026-01-26 09:31:31.545 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:31:31 compute-1 nova_compute[183083]: 2026-01-26 09:31:31.547 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:31:36 compute-1 nova_compute[183083]: 2026-01-26 09:31:36.548 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:31:37 compute-1 nova_compute[183083]: 2026-01-26 09:31:37.572 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:31:37 compute-1 nova_compute[183083]: 2026-01-26 09:31:37.572 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:31:37 compute-1 nova_compute[183083]: 2026-01-26 09:31:37.573 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:31:37 compute-1 nova_compute[183083]: 2026-01-26 09:31:37.588 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 09:31:37 compute-1 nova_compute[183083]: 2026-01-26 09:31:37.962 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:31:39 compute-1 podman[232779]: 2026-01-26 09:31:39.831145934 +0000 UTC m=+0.073160768 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible)
Jan 26 09:31:39 compute-1 podman[232780]: 2026-01-26 09:31:39.831725811 +0000 UTC m=+0.073742365 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 09:31:39 compute-1 podman[232777]: 2026-01-26 09:31:39.840031395 +0000 UTC m=+0.084991812 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:31:39 compute-1 podman[232776]: 2026-01-26 09:31:39.875350643 +0000 UTC m=+0.125060354 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 26 09:31:39 compute-1 podman[232778]: 2026-01-26 09:31:39.875394554 +0000 UTC m=+0.119403524 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., version=9.6, distribution-scope=public, name=ubi9-minimal, config_id=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, vcs-type=git, io.openshift.tags=minimal rhel9, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 26 09:31:40 compute-1 nova_compute[183083]: 2026-01-26 09:31:40.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:31:41 compute-1 nova_compute[183083]: 2026-01-26 09:31:41.550 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:31:42 compute-1 nova_compute[183083]: 2026-01-26 09:31:42.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:31:42 compute-1 nova_compute[183083]: 2026-01-26 09:31:42.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:31:46 compute-1 nova_compute[183083]: 2026-01-26 09:31:46.552 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:31:46 compute-1 nova_compute[183083]: 2026-01-26 09:31:46.554 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:31:46 compute-1 nova_compute[183083]: 2026-01-26 09:31:46.554 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:31:46 compute-1 nova_compute[183083]: 2026-01-26 09:31:46.555 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:31:46 compute-1 nova_compute[183083]: 2026-01-26 09:31:46.582 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:31:46 compute-1 nova_compute[183083]: 2026-01-26 09:31:46.583 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:31:46 compute-1 nova_compute[183083]: 2026-01-26 09:31:46.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:31:47 compute-1 nova_compute[183083]: 2026-01-26 09:31:47.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:31:47 compute-1 nova_compute[183083]: 2026-01-26 09:31:47.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:31:47 compute-1 nova_compute[183083]: 2026-01-26 09:31:47.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:31:48 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:31:48.751 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=51, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=50) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:31:48 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:31:48.752 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:31:48 compute-1 nova_compute[183083]: 2026-01-26 09:31:48.752 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:31:48 compute-1 nova_compute[183083]: 2026-01-26 09:31:48.946 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:31:49 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:31:49.754 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '51'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:31:51 compute-1 nova_compute[183083]: 2026-01-26 09:31:51.620 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:31:53 compute-1 podman[232884]: 2026-01-26 09:31:53.824353639 +0000 UTC m=+0.086862465 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 26 09:31:56 compute-1 nova_compute[183083]: 2026-01-26 09:31:56.622 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:31:56 compute-1 nova_compute[183083]: 2026-01-26 09:31:56.624 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:31:56 compute-1 nova_compute[183083]: 2026-01-26 09:31:56.624 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:31:56 compute-1 nova_compute[183083]: 2026-01-26 09:31:56.624 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:31:56 compute-1 nova_compute[183083]: 2026-01-26 09:31:56.657 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:31:56 compute-1 nova_compute[183083]: 2026-01-26 09:31:56.658 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:31:57 compute-1 nova_compute[183083]: 2026-01-26 09:31:57.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:31:58 compute-1 nova_compute[183083]: 2026-01-26 09:31:58.001 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:31:58 compute-1 nova_compute[183083]: 2026-01-26 09:31:58.002 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:31:58 compute-1 nova_compute[183083]: 2026-01-26 09:31:58.002 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:31:58 compute-1 nova_compute[183083]: 2026-01-26 09:31:58.003 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:31:58 compute-1 nova_compute[183083]: 2026-01-26 09:31:58.237 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:31:58 compute-1 nova_compute[183083]: 2026-01-26 09:31:58.238 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13635MB free_disk=113.08374786376953GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:31:58 compute-1 nova_compute[183083]: 2026-01-26 09:31:58.239 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:31:58 compute-1 nova_compute[183083]: 2026-01-26 09:31:58.239 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:31:58 compute-1 nova_compute[183083]: 2026-01-26 09:31:58.346 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:31:58 compute-1 nova_compute[183083]: 2026-01-26 09:31:58.346 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:31:58 compute-1 nova_compute[183083]: 2026-01-26 09:31:58.385 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:31:58 compute-1 nova_compute[183083]: 2026-01-26 09:31:58.408 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:31:58 compute-1 nova_compute[183083]: 2026-01-26 09:31:58.409 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:31:58 compute-1 nova_compute[183083]: 2026-01-26 09:31:58.410 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:32:01 compute-1 nova_compute[183083]: 2026-01-26 09:32:01.679 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:32:01 compute-1 nova_compute[183083]: 2026-01-26 09:32:01.681 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:32:01 compute-1 nova_compute[183083]: 2026-01-26 09:32:01.681 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5023 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:32:01 compute-1 nova_compute[183083]: 2026-01-26 09:32:01.681 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:32:01 compute-1 nova_compute[183083]: 2026-01-26 09:32:01.682 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:32:01 compute-1 nova_compute[183083]: 2026-01-26 09:32:01.682 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:32:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:32:03.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:32:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:32:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:32:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:32:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:32:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:32:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:32:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:32:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:32:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:32:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:32:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:32:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:32:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:32:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:32:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:32:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:32:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:32:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:32:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:32:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:32:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:32:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:32:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:32:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:32:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:32:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:32:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:32:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:32:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:32:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:32:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:32:03.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:32:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:32:03.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:32:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:32:03.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:32:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:32:03.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:32:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:32:03.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:32:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:32:03.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:32:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:32:03.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:32:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:32:03.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:32:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:32:03.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:32:03 compute-1 ovn_controller[95352]: 2026-01-26T09:32:03Z|00396|pinctrl|WARN|Dropped 167 log messages in last 60 seconds (most recently, 5 seconds ago) due to excessive rate
Jan 26 09:32:03 compute-1 ovn_controller[95352]: 2026-01-26T09:32:03Z|00397|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:32:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:32:05.354 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:32:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:32:05.354 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:32:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:32:05.355 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:32:06 compute-1 nova_compute[183083]: 2026-01-26 09:32:06.683 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:32:07 compute-1 sshd-session[232908]: Connection closed by authenticating user root 178.62.249.31 port 58532 [preauth]
Jan 26 09:32:10 compute-1 podman[232911]: 2026-01-26 09:32:10.813404698 +0000 UTC m=+0.073733704 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 26 09:32:10 compute-1 podman[232918]: 2026-01-26 09:32:10.834426352 +0000 UTC m=+0.081657148 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 26 09:32:10 compute-1 podman[232924]: 2026-01-26 09:32:10.836036987 +0000 UTC m=+0.079082705 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 09:32:10 compute-1 podman[232910]: 2026-01-26 09:32:10.837858869 +0000 UTC m=+0.104037990 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_controller, 
io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 26 09:32:10 compute-1 podman[232912]: 2026-01-26 09:32:10.858636846 +0000 UTC m=+0.105300076 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, name=ubi9-minimal, release=1755695350, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=minimal rhel9, distribution-scope=public, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6)
Jan 26 09:32:11 compute-1 nova_compute[183083]: 2026-01-26 09:32:11.685 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:32:16 compute-1 sshd-session[233017]: Invalid user sol from 2.57.122.238 port 59498
Jan 26 09:32:16 compute-1 nova_compute[183083]: 2026-01-26 09:32:16.687 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:32:16 compute-1 sshd-session[233017]: Connection closed by invalid user sol 2.57.122.238 port 59498 [preauth]
Jan 26 09:32:21 compute-1 nova_compute[183083]: 2026-01-26 09:32:21.689 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:32:21 compute-1 nova_compute[183083]: 2026-01-26 09:32:21.690 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:32:21 compute-1 nova_compute[183083]: 2026-01-26 09:32:21.690 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:32:21 compute-1 nova_compute[183083]: 2026-01-26 09:32:21.690 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:32:21 compute-1 nova_compute[183083]: 2026-01-26 09:32:21.691 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:32:21 compute-1 nova_compute[183083]: 2026-01-26 09:32:21.691 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:32:24 compute-1 podman[233019]: 2026-01-26 09:32:24.826290958 +0000 UTC m=+0.078504219 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 26 09:32:26 compute-1 nova_compute[183083]: 2026-01-26 09:32:26.691 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:32:27 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:32:27.831 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=52, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=51) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:32:27 compute-1 nova_compute[183083]: 2026-01-26 09:32:27.832 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:32:27 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:32:27.833 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:32:27 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:32:27.833 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '52'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:32:31 compute-1 nova_compute[183083]: 2026-01-26 09:32:31.695 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:32:36 compute-1 nova_compute[183083]: 2026-01-26 09:32:36.696 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:32:38 compute-1 nova_compute[183083]: 2026-01-26 09:32:38.406 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:32:38 compute-1 nova_compute[183083]: 2026-01-26 09:32:38.407 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:32:38 compute-1 nova_compute[183083]: 2026-01-26 09:32:38.407 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:32:38 compute-1 nova_compute[183083]: 2026-01-26 09:32:38.407 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:32:38 compute-1 nova_compute[183083]: 2026-01-26 09:32:38.444 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 09:32:41 compute-1 nova_compute[183083]: 2026-01-26 09:32:41.698 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:32:41 compute-1 nova_compute[183083]: 2026-01-26 09:32:41.700 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:32:41 compute-1 nova_compute[183083]: 2026-01-26 09:32:41.700 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:32:41 compute-1 nova_compute[183083]: 2026-01-26 09:32:41.700 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:32:41 compute-1 nova_compute[183083]: 2026-01-26 09:32:41.757 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:32:41 compute-1 nova_compute[183083]: 2026-01-26 09:32:41.757 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:32:41 compute-1 podman[233045]: 2026-01-26 09:32:41.854963964 +0000 UTC m=+0.070858133 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-type=git, managed_by=edpm_ansible, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Jan 26 09:32:41 compute-1 podman[233044]: 2026-01-26 09:32:41.870629987 +0000 UTC m=+0.077236634 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251202, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 09:32:41 compute-1 podman[233046]: 2026-01-26 09:32:41.87676848 +0000 UTC m=+0.084156899 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 26 09:32:41 compute-1 podman[233047]: 2026-01-26 09:32:41.888027978 +0000 UTC m=+0.084295233 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 09:32:41 compute-1 podman[233043]: 2026-01-26 09:32:41.907258281 +0000 UTC m=+0.121484273 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 26 09:32:41 compute-1 nova_compute[183083]: 2026-01-26 09:32:41.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:32:43 compute-1 nova_compute[183083]: 2026-01-26 09:32:43.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:32:43 compute-1 nova_compute[183083]: 2026-01-26 09:32:43.953 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:32:46 compute-1 nova_compute[183083]: 2026-01-26 09:32:46.758 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:32:47 compute-1 nova_compute[183083]: 2026-01-26 09:32:47.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:32:47 compute-1 nova_compute[183083]: 2026-01-26 09:32:47.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:32:47 compute-1 nova_compute[183083]: 2026-01-26 09:32:47.953 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 26 09:32:47 compute-1 nova_compute[183083]: 2026-01-26 09:32:47.969 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 26 09:32:49 compute-1 nova_compute[183083]: 2026-01-26 09:32:49.968 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:32:49 compute-1 nova_compute[183083]: 2026-01-26 09:32:49.969 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:32:49 compute-1 nova_compute[183083]: 2026-01-26 09:32:49.969 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:32:51 compute-1 nova_compute[183083]: 2026-01-26 09:32:51.760 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:32:51 compute-1 nova_compute[183083]: 2026-01-26 09:32:51.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:32:51 compute-1 nova_compute[183083]: 2026-01-26 09:32:51.953 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 26 09:32:55 compute-1 podman[233153]: 2026-01-26 09:32:55.821195278 +0000 UTC m=+0.083830199 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 09:32:55 compute-1 sshd-session[233151]: Connection closed by authenticating user root 178.62.249.31 port 48832 [preauth]
Jan 26 09:32:56 compute-1 nova_compute[183083]: 2026-01-26 09:32:56.762 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:32:59 compute-1 nova_compute[183083]: 2026-01-26 09:32:59.966 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:33:00 compute-1 nova_compute[183083]: 2026-01-26 09:33:00.000 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:33:00 compute-1 nova_compute[183083]: 2026-01-26 09:33:00.001 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:33:00 compute-1 nova_compute[183083]: 2026-01-26 09:33:00.001 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:33:00 compute-1 nova_compute[183083]: 2026-01-26 09:33:00.001 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:33:00 compute-1 nova_compute[183083]: 2026-01-26 09:33:00.178 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:33:00 compute-1 nova_compute[183083]: 2026-01-26 09:33:00.179 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13647MB free_disk=113.08374786376953GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:33:00 compute-1 nova_compute[183083]: 2026-01-26 09:33:00.180 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:33:00 compute-1 nova_compute[183083]: 2026-01-26 09:33:00.180 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:33:00 compute-1 nova_compute[183083]: 2026-01-26 09:33:00.356 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:33:00 compute-1 nova_compute[183083]: 2026-01-26 09:33:00.357 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:33:00 compute-1 nova_compute[183083]: 2026-01-26 09:33:00.380 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:33:00 compute-1 nova_compute[183083]: 2026-01-26 09:33:00.391 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:33:00 compute-1 nova_compute[183083]: 2026-01-26 09:33:00.393 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:33:00 compute-1 nova_compute[183083]: 2026-01-26 09:33:00.393 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:33:01 compute-1 nova_compute[183083]: 2026-01-26 09:33:01.764 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:33:04 compute-1 ovn_controller[95352]: 2026-01-26T09:33:04Z|00398|pinctrl|WARN|Dropped 175 log messages in last 60 seconds (most recently, 3 seconds ago) due to excessive rate
Jan 26 09:33:04 compute-1 ovn_controller[95352]: 2026-01-26T09:33:04Z|00399|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:33:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:33:05.354 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:33:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:33:05.355 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:33:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:33:05.355 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:33:06 compute-1 nova_compute[183083]: 2026-01-26 09:33:06.766 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:33:11 compute-1 nova_compute[183083]: 2026-01-26 09:33:11.768 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:33:11 compute-1 nova_compute[183083]: 2026-01-26 09:33:11.770 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:33:11 compute-1 nova_compute[183083]: 2026-01-26 09:33:11.770 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:33:11 compute-1 nova_compute[183083]: 2026-01-26 09:33:11.770 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:33:11 compute-1 nova_compute[183083]: 2026-01-26 09:33:11.771 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:33:11 compute-1 nova_compute[183083]: 2026-01-26 09:33:11.772 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:33:11 compute-1 nova_compute[183083]: 2026-01-26 09:33:11.837 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:33:11 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:33:11.837 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=53, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=52) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:33:11 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:33:11.839 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:33:12 compute-1 podman[233179]: 2026-01-26 09:33:12.868241123 +0000 UTC m=+0.101632112 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Jan 26 09:33:12 compute-1 podman[233181]: 2026-01-26 09:33:12.868363847 +0000 UTC m=+0.092941537 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 26 09:33:12 compute-1 podman[233180]: 2026-01-26 09:33:12.874203902 +0000 UTC m=+0.104567126 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, release=1755695350, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a 
stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, maintainer=Red Hat, Inc.)
Jan 26 09:33:12 compute-1 podman[233187]: 2026-01-26 09:33:12.882204128 +0000 UTC m=+0.101606482 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 09:33:12 compute-1 podman[233178]: 2026-01-26 09:33:12.893384253 +0000 UTC m=+0.148716172 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 09:33:13 compute-1 nova_compute[183083]: 2026-01-26 09:33:13.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:33:14 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:33:14.841 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '53'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:33:16 compute-1 nova_compute[183083]: 2026-01-26 09:33:16.772 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:33:21 compute-1 nova_compute[183083]: 2026-01-26 09:33:21.774 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:33:26 compute-1 nova_compute[183083]: 2026-01-26 09:33:26.776 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:33:26 compute-1 nova_compute[183083]: 2026-01-26 09:33:26.777 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:33:26 compute-1 nova_compute[183083]: 2026-01-26 09:33:26.777 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:33:26 compute-1 nova_compute[183083]: 2026-01-26 09:33:26.777 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:33:26 compute-1 nova_compute[183083]: 2026-01-26 09:33:26.778 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:33:26 compute-1 nova_compute[183083]: 2026-01-26 09:33:26.779 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:33:26 compute-1 podman[233283]: 2026-01-26 09:33:26.829205447 +0000 UTC m=+0.081974127 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 26 09:33:31 compute-1 nova_compute[183083]: 2026-01-26 09:33:31.780 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:33:31 compute-1 nova_compute[183083]: 2026-01-26 09:33:31.782 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:33:31 compute-1 nova_compute[183083]: 2026-01-26 09:33:31.783 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:33:31 compute-1 nova_compute[183083]: 2026-01-26 09:33:31.783 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:33:31 compute-1 nova_compute[183083]: 2026-01-26 09:33:31.819 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:33:31 compute-1 nova_compute[183083]: 2026-01-26 09:33:31.820 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:33:36 compute-1 nova_compute[183083]: 2026-01-26 09:33:36.821 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:33:38 compute-1 nova_compute[183083]: 2026-01-26 09:33:38.109 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:33:38 compute-1 nova_compute[183083]: 2026-01-26 09:33:38.110 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:33:38 compute-1 nova_compute[183083]: 2026-01-26 09:33:38.110 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:33:38 compute-1 nova_compute[183083]: 2026-01-26 09:33:38.125 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 09:33:38 compute-1 nova_compute[183083]: 2026-01-26 09:33:38.962 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:33:41 compute-1 nova_compute[183083]: 2026-01-26 09:33:41.823 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:33:42 compute-1 nova_compute[183083]: 2026-01-26 09:33:42.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:33:43 compute-1 podman[233319]: 2026-01-26 09:33:43.828608158 +0000 UTC m=+0.074161376 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 09:33:43 compute-1 podman[233312]: 2026-01-26 09:33:43.828730201 +0000 UTC m=+0.078485358 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, version=9.6, managed_by=edpm_ansible, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 26 09:33:43 compute-1 podman[233311]: 2026-01-26 09:33:43.85025682 +0000 UTC m=+0.099551824 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 26 09:33:43 compute-1 podman[233313]: 2026-01-26 09:33:43.864993776 +0000 UTC m=+0.104968147 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 09:33:43 compute-1 podman[233310]: 2026-01-26 09:33:43.873316781 +0000 UTC m=+0.133286837 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible)
Jan 26 09:33:43 compute-1 nova_compute[183083]: 2026-01-26 09:33:43.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:33:44 compute-1 sshd-session[233308]: Connection closed by authenticating user root 178.62.249.31 port 55934 [preauth]
Jan 26 09:33:45 compute-1 nova_compute[183083]: 2026-01-26 09:33:45.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:33:46 compute-1 nova_compute[183083]: 2026-01-26 09:33:46.825 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:33:46 compute-1 nova_compute[183083]: 2026-01-26 09:33:46.827 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:33:46 compute-1 nova_compute[183083]: 2026-01-26 09:33:46.827 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:33:46 compute-1 nova_compute[183083]: 2026-01-26 09:33:46.828 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:33:46 compute-1 nova_compute[183083]: 2026-01-26 09:33:46.828 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:33:46 compute-1 nova_compute[183083]: 2026-01-26 09:33:46.831 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:33:48 compute-1 nova_compute[183083]: 2026-01-26 09:33:48.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:33:49 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:33:49.894 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=54, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=53) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:33:49 compute-1 nova_compute[183083]: 2026-01-26 09:33:49.895 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:33:49 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:33:49.896 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:33:50 compute-1 nova_compute[183083]: 2026-01-26 09:33:50.953 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:33:50 compute-1 nova_compute[183083]: 2026-01-26 09:33:50.954 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:33:50 compute-1 nova_compute[183083]: 2026-01-26 09:33:50.954 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:33:51 compute-1 nova_compute[183083]: 2026-01-26 09:33:51.828 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:33:51 compute-1 nova_compute[183083]: 2026-01-26 09:33:51.830 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:33:53 compute-1 nova_compute[183083]: 2026-01-26 09:33:53.947 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:33:54 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:33:54.899 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '54'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:33:56 compute-1 nova_compute[183083]: 2026-01-26 09:33:56.830 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:33:57 compute-1 podman[233416]: 2026-01-26 09:33:57.830948242 +0000 UTC m=+0.083547751 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 26 09:33:59 compute-1 nova_compute[183083]: 2026-01-26 09:33:59.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:33:59 compute-1 nova_compute[183083]: 2026-01-26 09:33:59.986 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:33:59 compute-1 nova_compute[183083]: 2026-01-26 09:33:59.987 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:33:59 compute-1 nova_compute[183083]: 2026-01-26 09:33:59.987 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:33:59 compute-1 nova_compute[183083]: 2026-01-26 09:33:59.988 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:34:00 compute-1 nova_compute[183083]: 2026-01-26 09:34:00.261 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:34:00 compute-1 nova_compute[183083]: 2026-01-26 09:34:00.263 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13646MB free_disk=113.08374786376953GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:34:00 compute-1 nova_compute[183083]: 2026-01-26 09:34:00.264 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:34:00 compute-1 nova_compute[183083]: 2026-01-26 09:34:00.264 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:34:00 compute-1 nova_compute[183083]: 2026-01-26 09:34:00.351 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:34:00 compute-1 nova_compute[183083]: 2026-01-26 09:34:00.351 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:34:00 compute-1 nova_compute[183083]: 2026-01-26 09:34:00.580 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:34:00 compute-1 nova_compute[183083]: 2026-01-26 09:34:00.598 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:34:00 compute-1 nova_compute[183083]: 2026-01-26 09:34:00.600 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:34:00 compute-1 nova_compute[183083]: 2026-01-26 09:34:00.601 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.337s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:34:01 compute-1 nova_compute[183083]: 2026-01-26 09:34:01.832 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:34:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:34:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:34:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:34:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:34:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:34:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:34:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:34:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:34:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:34:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:34:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:34:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:34:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:34:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:34:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:34:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:34:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:34:03.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:34:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:34:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:34:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:34:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:34:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:34:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:34:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:34:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:34:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:34:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:34:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:34:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:34:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:34:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:34:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:34:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:34:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:34:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:34:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:34:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:34:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:34:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:34:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:34:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:34:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:34:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:34:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:34:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:34:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:34:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:34:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:34:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:34:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:34:05.355 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:34:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:34:05.356 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:34:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:34:05.356 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:34:05 compute-1 ovn_controller[95352]: 2026-01-26T09:34:05Z|00400|pinctrl|WARN|Dropped 175 log messages in last 62 seconds (most recently, 6 seconds ago) due to excessive rate
Jan 26 09:34:05 compute-1 ovn_controller[95352]: 2026-01-26T09:34:05Z|00401|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:34:06 compute-1 nova_compute[183083]: 2026-01-26 09:34:06.834 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:34:06 compute-1 nova_compute[183083]: 2026-01-26 09:34:06.835 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:34:06 compute-1 nova_compute[183083]: 2026-01-26 09:34:06.836 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:34:06 compute-1 nova_compute[183083]: 2026-01-26 09:34:06.836 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:34:06 compute-1 nova_compute[183083]: 2026-01-26 09:34:06.836 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:34:06 compute-1 nova_compute[183083]: 2026-01-26 09:34:06.838 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:34:11 compute-1 nova_compute[183083]: 2026-01-26 09:34:11.836 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:34:11 compute-1 nova_compute[183083]: 2026-01-26 09:34:11.841 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:34:14 compute-1 podman[233441]: 2026-01-26 09:34:14.813245067 +0000 UTC m=+0.068822966 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute)
Jan 26 09:34:14 compute-1 podman[233443]: 2026-01-26 09:34:14.817370623 +0000 UTC m=+0.065714517 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 26 09:34:14 compute-1 podman[233449]: 2026-01-26 09:34:14.826160072 +0000 UTC m=+0.070117642 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 09:34:14 compute-1 podman[233442]: 2026-01-26 09:34:14.829495156 +0000 UTC m=+0.080563327 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., release=1755695350, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=, vcs-type=git, name=ubi9-minimal, version=9.6, config_id=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9)
Jan 26 09:34:14 compute-1 podman[233440]: 2026-01-26 09:34:14.858066123 +0000 UTC m=+0.117108340 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 26 09:34:16 compute-1 nova_compute[183083]: 2026-01-26 09:34:16.840 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:34:17 compute-1 sshd-session[233547]: Invalid user sol from 2.57.122.238 port 58688
Jan 26 09:34:18 compute-1 sshd-session[233547]: Connection closed by invalid user sol 2.57.122.238 port 58688 [preauth]
Jan 26 09:34:21 compute-1 nova_compute[183083]: 2026-01-26 09:34:21.843 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:34:21 compute-1 nova_compute[183083]: 2026-01-26 09:34:21.845 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:34:21 compute-1 nova_compute[183083]: 2026-01-26 09:34:21.846 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:34:21 compute-1 nova_compute[183083]: 2026-01-26 09:34:21.846 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:34:21 compute-1 nova_compute[183083]: 2026-01-26 09:34:21.847 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:34:21 compute-1 nova_compute[183083]: 2026-01-26 09:34:21.849 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:34:26 compute-1 nova_compute[183083]: 2026-01-26 09:34:26.847 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:34:26 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:34:26.955 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=55, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=54) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:34:26 compute-1 nova_compute[183083]: 2026-01-26 09:34:26.956 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:34:26 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:34:26.957 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:34:28 compute-1 podman[233549]: 2026-01-26 09:34:28.843406226 +0000 UTC m=+0.092708980 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 26 09:34:31 compute-1 nova_compute[183083]: 2026-01-26 09:34:31.849 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:34:33 compute-1 sshd-session[233572]: Connection closed by authenticating user root 178.62.249.31 port 34458 [preauth]
Jan 26 09:34:33 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:34:33.960 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '55'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:34:36 compute-1 nova_compute[183083]: 2026-01-26 09:34:36.851 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:34:38 compute-1 nova_compute[183083]: 2026-01-26 09:34:38.602 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:34:38 compute-1 nova_compute[183083]: 2026-01-26 09:34:38.603 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:34:38 compute-1 nova_compute[183083]: 2026-01-26 09:34:38.603 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:34:38 compute-1 nova_compute[183083]: 2026-01-26 09:34:38.619 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 09:34:39 compute-1 nova_compute[183083]: 2026-01-26 09:34:39.963 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:34:41 compute-1 nova_compute[183083]: 2026-01-26 09:34:41.853 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:34:41 compute-1 nova_compute[183083]: 2026-01-26 09:34:41.854 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:34:41 compute-1 nova_compute[183083]: 2026-01-26 09:34:41.854 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:34:41 compute-1 nova_compute[183083]: 2026-01-26 09:34:41.855 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:34:41 compute-1 nova_compute[183083]: 2026-01-26 09:34:41.855 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:34:41 compute-1 nova_compute[183083]: 2026-01-26 09:34:41.857 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:34:43 compute-1 nova_compute[183083]: 2026-01-26 09:34:43.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:34:44 compute-1 nova_compute[183083]: 2026-01-26 09:34:44.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:34:45 compute-1 podman[233583]: 2026-01-26 09:34:45.813018237 +0000 UTC m=+0.059982366 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 09:34:45 compute-1 podman[233577]: 2026-01-26 09:34:45.824189762 +0000 UTC m=+0.070529763 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 09:34:45 compute-1 podman[233575]: 2026-01-26 09:34:45.825777077 +0000 UTC m=+0.086336890 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 26 09:34:45 compute-1 podman[233576]: 2026-01-26 09:34:45.852517803 +0000 UTC m=+0.108186598 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.openshift.expose-services=, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, managed_by=edpm_ansible, version=9.6, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container)
Jan 26 09:34:45 compute-1 podman[233574]: 2026-01-26 09:34:45.857229936 +0000 UTC m=+0.123408998 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, 
container_name=ovn_controller, managed_by=edpm_ansible)
Jan 26 09:34:46 compute-1 nova_compute[183083]: 2026-01-26 09:34:46.856 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:34:46 compute-1 nova_compute[183083]: 2026-01-26 09:34:46.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:34:49 compute-1 nova_compute[183083]: 2026-01-26 09:34:49.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:34:50 compute-1 nova_compute[183083]: 2026-01-26 09:34:50.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:34:51 compute-1 nova_compute[183083]: 2026-01-26 09:34:51.859 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:34:52 compute-1 nova_compute[183083]: 2026-01-26 09:34:52.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:34:52 compute-1 nova_compute[183083]: 2026-01-26 09:34:52.951 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:34:56 compute-1 nova_compute[183083]: 2026-01-26 09:34:56.861 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:34:59 compute-1 podman[233677]: 2026-01-26 09:34:59.802298771 +0000 UTC m=+0.072049747 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 26 09:35:00 compute-1 nova_compute[183083]: 2026-01-26 09:35:00.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:35:00 compute-1 nova_compute[183083]: 2026-01-26 09:35:00.978 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:35:00 compute-1 nova_compute[183083]: 2026-01-26 09:35:00.978 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:35:00 compute-1 nova_compute[183083]: 2026-01-26 09:35:00.979 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:35:00 compute-1 nova_compute[183083]: 2026-01-26 09:35:00.979 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:35:01 compute-1 nova_compute[183083]: 2026-01-26 09:35:01.177 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:35:01 compute-1 nova_compute[183083]: 2026-01-26 09:35:01.178 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13640MB free_disk=113.08374786376953GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:35:01 compute-1 nova_compute[183083]: 2026-01-26 09:35:01.179 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:35:01 compute-1 nova_compute[183083]: 2026-01-26 09:35:01.179 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:35:01 compute-1 nova_compute[183083]: 2026-01-26 09:35:01.863 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:35:01 compute-1 nova_compute[183083]: 2026-01-26 09:35:01.865 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:35:01 compute-1 nova_compute[183083]: 2026-01-26 09:35:01.866 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:35:01 compute-1 nova_compute[183083]: 2026-01-26 09:35:01.866 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:35:01 compute-1 nova_compute[183083]: 2026-01-26 09:35:01.867 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:35:01 compute-1 nova_compute[183083]: 2026-01-26 09:35:01.868 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:35:02 compute-1 nova_compute[183083]: 2026-01-26 09:35:02.302 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:35:02 compute-1 nova_compute[183083]: 2026-01-26 09:35:02.302 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:35:02 compute-1 nova_compute[183083]: 2026-01-26 09:35:02.808 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:35:02 compute-1 nova_compute[183083]: 2026-01-26 09:35:02.829 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:35:02 compute-1 nova_compute[183083]: 2026-01-26 09:35:02.831 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:35:02 compute-1 nova_compute[183083]: 2026-01-26 09:35:02.832 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.653s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:35:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:35:05.356 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:35:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:35:05.357 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:35:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:35:05.357 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:35:06 compute-1 nova_compute[183083]: 2026-01-26 09:35:06.867 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:35:07 compute-1 ovn_controller[95352]: 2026-01-26T09:35:07Z|00402|pinctrl|WARN|Dropped 195 log messages in last 62 seconds (most recently, 4 seconds ago) due to excessive rate
Jan 26 09:35:07 compute-1 ovn_controller[95352]: 2026-01-26T09:35:07Z|00403|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:35:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:35:07.897 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=56, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=55) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:35:07 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:35:07.898 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:35:07 compute-1 nova_compute[183083]: 2026-01-26 09:35:07.924 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:35:09 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:35:09.901 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '56'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:35:11 compute-1 nova_compute[183083]: 2026-01-26 09:35:11.870 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:35:16 compute-1 podman[233704]: 2026-01-26 09:35:16.846017083 +0000 UTC m=+0.096324313 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., version=9.6, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, managed_by=edpm_ansible, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter)
Jan 26 09:35:16 compute-1 podman[233705]: 2026-01-26 09:35:16.853318149 +0000 UTC m=+0.091200458 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 26 09:35:16 compute-1 podman[233711]: 2026-01-26 09:35:16.858543926 +0000 UTC m=+0.090425005 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 09:35:16 compute-1 podman[233703]: 2026-01-26 09:35:16.863847866 +0000 UTC m=+0.110385099 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Jan 26 09:35:16 compute-1 nova_compute[183083]: 2026-01-26 09:35:16.872 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:35:16 compute-1 nova_compute[183083]: 2026-01-26 09:35:16.873 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:35:16 compute-1 podman[233702]: 2026-01-26 09:35:16.880503657 +0000 UTC m=+0.136872298 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 09:35:21 compute-1 nova_compute[183083]: 2026-01-26 09:35:21.874 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:35:23 compute-1 sshd-session[233804]: Connection closed by authenticating user root 178.62.249.31 port 47082 [preauth]
Jan 26 09:35:26 compute-1 nova_compute[183083]: 2026-01-26 09:35:26.875 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:35:30 compute-1 podman[233806]: 2026-01-26 09:35:30.840348479 +0000 UTC m=+0.090217709 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 26 09:35:31 compute-1 nova_compute[183083]: 2026-01-26 09:35:31.876 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:35:31 compute-1 nova_compute[183083]: 2026-01-26 09:35:31.878 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:35:36 compute-1 nova_compute[183083]: 2026-01-26 09:35:36.880 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:35:36 compute-1 nova_compute[183083]: 2026-01-26 09:35:36.882 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:35:36 compute-1 nova_compute[183083]: 2026-01-26 09:35:36.882 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:35:36 compute-1 nova_compute[183083]: 2026-01-26 09:35:36.882 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:35:36 compute-1 nova_compute[183083]: 2026-01-26 09:35:36.911 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:35:36 compute-1 nova_compute[183083]: 2026-01-26 09:35:36.911 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:35:41 compute-1 nova_compute[183083]: 2026-01-26 09:35:41.827 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:35:41 compute-1 nova_compute[183083]: 2026-01-26 09:35:41.827 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:35:41 compute-1 nova_compute[183083]: 2026-01-26 09:35:41.828 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:35:41 compute-1 nova_compute[183083]: 2026-01-26 09:35:41.828 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:35:41 compute-1 nova_compute[183083]: 2026-01-26 09:35:41.841 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 09:35:41 compute-1 nova_compute[183083]: 2026-01-26 09:35:41.912 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:35:45 compute-1 nova_compute[183083]: 2026-01-26 09:35:45.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:35:46 compute-1 nova_compute[183083]: 2026-01-26 09:35:46.913 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:35:46 compute-1 nova_compute[183083]: 2026-01-26 09:35:46.916 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:35:46 compute-1 nova_compute[183083]: 2026-01-26 09:35:46.916 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:35:46 compute-1 nova_compute[183083]: 2026-01-26 09:35:46.916 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:35:46 compute-1 nova_compute[183083]: 2026-01-26 09:35:46.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:35:46 compute-1 nova_compute[183083]: 2026-01-26 09:35:46.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:35:46 compute-1 nova_compute[183083]: 2026-01-26 09:35:46.971 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:35:46 compute-1 nova_compute[183083]: 2026-01-26 09:35:46.972 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:35:47 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:35:47.596 104632 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=57, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:72:53', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0a:09:68:d0:2c:54'}, ipsec=False) old=SB_Global(nb_cfg=56) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 09:35:47 compute-1 nova_compute[183083]: 2026-01-26 09:35:47.596 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:35:47 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:35:47.598 104632 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 09:35:47 compute-1 podman[233833]: 2026-01-26 09:35:47.941118223 +0000 UTC m=+0.066635074 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter)
Jan 26 09:35:47 compute-1 podman[233835]: 2026-01-26 09:35:47.950477287 +0000 UTC m=+0.065438590 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 09:35:47 compute-1 podman[233832]: 2026-01-26 09:35:47.979115846 +0000 UTC m=+0.095595492 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 09:35:47 compute-1 podman[233834]: 2026-01-26 09:35:47.997940668 +0000 UTC m=+0.104988177 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible)
Jan 26 09:35:48 compute-1 podman[233831]: 2026-01-26 09:35:48.046267473 +0000 UTC m=+0.172714720 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller)
Jan 26 09:35:50 compute-1 nova_compute[183083]: 2026-01-26 09:35:50.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:35:50 compute-1 nova_compute[183083]: 2026-01-26 09:35:50.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:35:51 compute-1 nova_compute[183083]: 2026-01-26 09:35:51.973 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:35:52 compute-1 nova_compute[183083]: 2026-01-26 09:35:52.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:35:52 compute-1 nova_compute[183083]: 2026-01-26 09:35:52.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:35:56 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:35:56.601 104632 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f671d48-fb23-4421-893d-f2ec1411c819, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '57'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 09:35:56 compute-1 nova_compute[183083]: 2026-01-26 09:35:56.948 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:35:56 compute-1 nova_compute[183083]: 2026-01-26 09:35:56.974 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:35:56 compute-1 nova_compute[183083]: 2026-01-26 09:35:56.976 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:36:01 compute-1 podman[233937]: 2026-01-26 09:36:01.836683259 +0000 UTC m=+0.090013344 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 26 09:36:01 compute-1 nova_compute[183083]: 2026-01-26 09:36:01.976 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:36:02 compute-1 nova_compute[183083]: 2026-01-26 09:36:02.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:36:02 compute-1 nova_compute[183083]: 2026-01-26 09:36:02.977 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:36:02 compute-1 nova_compute[183083]: 2026-01-26 09:36:02.978 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:36:02 compute-1 nova_compute[183083]: 2026-01-26 09:36:02.978 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:36:02 compute-1 nova_compute[183083]: 2026-01-26 09:36:02.979 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:36:03 compute-1 nova_compute[183083]: 2026-01-26 09:36:03.173 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:36:03 compute-1 nova_compute[183083]: 2026-01-26 09:36:03.175 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13650MB free_disk=113.0837287902832GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:36:03 compute-1 nova_compute[183083]: 2026-01-26 09:36:03.175 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:36:03 compute-1 nova_compute[183083]: 2026-01-26 09:36:03.175 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:36:03 compute-1 nova_compute[183083]: 2026-01-26 09:36:03.443 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:36:03 compute-1 nova_compute[183083]: 2026-01-26 09:36:03.444 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:36:03 compute-1 nova_compute[183083]: 2026-01-26 09:36:03.481 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Refreshing inventories for resource provider 5203935e-446c-4e03-93fa-4c60d651e045 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 26 09:36:03 compute-1 nova_compute[183083]: 2026-01-26 09:36:03.527 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Updating ProviderTree inventory for provider 5203935e-446c-4e03-93fa-4c60d651e045 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 26 09:36:03 compute-1 nova_compute[183083]: 2026-01-26 09:36:03.528 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Updating inventory in ProviderTree for provider 5203935e-446c-4e03-93fa-4c60d651e045 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 26 09:36:03 compute-1 nova_compute[183083]: 2026-01-26 09:36:03.561 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Refreshing aggregate associations for resource provider 5203935e-446c-4e03-93fa-4c60d651e045, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 26 09:36:03 compute-1 nova_compute[183083]: 2026-01-26 09:36:03.590 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Refreshing trait associations for resource provider 5203935e-446c-4e03-93fa-4c60d651e045, traits: COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE4A,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SVM,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AVX,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX2,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_FMA3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,HW_CPU_X86_SSE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE41,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_BMI,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AESNI,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 26 09:36:03 compute-1 sshd-session[220012]: Connection closed by 38.102.83.66 port 42482
Jan 26 09:36:03 compute-1 sshd-session[221586]: Connection closed by 38.102.83.66 port 50638
Jan 26 09:36:03 compute-1 sshd-session[222804]: Connection closed by 38.102.83.66 port 38992
Jan 26 09:36:03 compute-1 sshd-session[220782]: Connection closed by 38.102.83.66 port 40512
Jan 26 09:36:03 compute-1 sshd-session[224522]: Connection closed by 38.102.83.66 port 55022
Jan 26 09:36:03 compute-1 sshd-session[225860]: Connection closed by 38.102.83.66 port 40082
Jan 26 09:36:03 compute-1 sshd-session[220239]: Connection closed by 38.102.83.66 port 47044
Jan 26 09:36:03 compute-1 sshd-session[226997]: Connection closed by 38.102.83.66 port 49320
Jan 26 09:36:03 compute-1 sshd-session[219928]: Connection closed by 38.102.83.66 port 42392
Jan 26 09:36:03 compute-1 sshd-session[222430]: Connection closed by 38.102.83.66 port 46626
Jan 26 09:36:03 compute-1 sshd-session[226881]: Connection closed by 38.102.83.66 port 44204
Jan 26 09:36:03 compute-1 sshd-session[222615]: Connection closed by 38.102.83.66 port 54446
Jan 26 09:36:03 compute-1 sshd-session[226041]: Connection closed by 38.102.83.66 port 51974
Jan 26 09:36:03 compute-1 sshd-session[223533]: Connection closed by 38.102.83.66 port 50876
Jan 26 09:36:03 compute-1 sshd-session[219793]: Connection closed by 38.102.83.66 port 37816
Jan 26 09:36:03 compute-1 sshd-session[226935]: Connection closed by 38.102.83.66 port 32994
Jan 26 09:36:03 compute-1 sshd-session[222514]: Connection closed by 38.102.83.66 port 43694
Jan 26 09:36:03 compute-1 sshd-session[224952]: Connection closed by 38.102.83.66 port 38228
Jan 26 09:36:03 compute-1 sshd-session[223967]: Connection closed by 38.102.83.66 port 43704
Jan 26 09:36:03 compute-1 sshd-session[221891]: Connection closed by 38.102.83.66 port 52516
Jan 26 09:36:03 compute-1 sshd-session[220042]: Connection closed by 38.102.83.66 port 40004
Jan 26 09:36:03 compute-1 sshd-session[222544]: Connection closed by 38.102.83.66 port 43706
Jan 26 09:36:03 compute-1 sshd-session[224922]: Connection closed by 38.102.83.66 port 48602
Jan 26 09:36:03 compute-1 sshd-session[221696]: Connection closed by 38.102.83.66 port 46830
Jan 26 09:36:03 compute-1 sshd-session[227214]: Connection closed by 38.102.83.66 port 39738
Jan 26 09:36:03 compute-1 sshd-session[225035]: Connection closed by 38.102.83.66 port 38238
Jan 26 09:36:03 compute-1 sshd-session[221795]: Connection closed by 38.102.83.66 port 46834
Jan 26 09:36:03 compute-1 sshd-session[223715]: Connection closed by 38.102.83.66 port 52818
Jan 26 09:36:03 compute-1 sshd-session[219686]: Connection closed by 38.102.83.66 port 59518
Jan 26 09:36:03 compute-1 sshd-session[226508]: Connection closed by 38.102.83.66 port 36110
Jan 26 09:36:03 compute-1 sshd-session[227130]: Connection closed by 38.102.83.66 port 41572
Jan 26 09:36:03 compute-1 sshd-session[224164]: Connection closed by 38.102.83.66 port 44610
Jan 26 09:36:03 compute-1 nova_compute[183083]: 2026-01-26 09:36:03.623 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:36:03 compute-1 sshd-session[220009]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 sshd-session[220208]: Connection closed by 38.102.83.66 port 51180
Jan 26 09:36:03 compute-1 sshd-session[224414]: Connection closed by 38.102.83.66 port 44992
Jan 26 09:36:03 compute-1 sshd-session[222678]: Connection closed by 38.102.83.66 port 54458
Jan 26 09:36:03 compute-1 sshd-session[227211]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 sshd-session[224519]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 sshd-session[222427]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 sshd-session[223530]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 sshd-session[224919]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 sshd-session[219925]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 sshd-session[226718]: Connection closed by 38.102.83.66 port 37448
Jan 26 09:36:03 compute-1 sshd-session[226994]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 sshd-session[226878]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 sshd-session[225032]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 sshd-session[222801]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 sshd-session[220236]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 sshd-session[221888]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 sshd-session[219780]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 sshd-session[226932]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 sshd-session[225857]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 sshd-session[220039]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 sshd-session[226340]: Connection closed by 38.102.83.66 port 38856
Jan 26 09:36:03 compute-1 sshd-session[224949]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 sshd-session[222600]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 sshd-session[226038]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 sshd-session[219683]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 sshd-session[221583]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 sshd-session[223964]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 sshd-session[223837]: Connection closed by 38.102.83.66 port 43424
Jan 26 09:36:03 compute-1 sshd-session[220779]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 systemd[1]: session-130.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 sshd-session[220333]: Connection closed by 38.102.83.66 port 57988
Jan 26 09:36:03 compute-1 sshd-session[220649]: Connection closed by 38.102.83.66 port 51690
Jan 26 09:36:03 compute-1 systemd[1]: session-48.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd[1]: session-116.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd[1]: session-44.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd[1]: session-56.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd[1]: session-104.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 nova_compute[183083]: 2026-01-26 09:36:03.643 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:36:03 compute-1 systemd[1]: session-42.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 sshd-session[221976]: Connection closed by 38.102.83.66 port 33264
Jan 26 09:36:03 compute-1 nova_compute[183083]: 2026-01-26 09:36:03.645 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:36:03 compute-1 nova_compute[183083]: 2026-01-26 09:36:03.646 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.470s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:36:03 compute-1 systemd[1]: session-93.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd[1]: session-62.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd[1]: session-47.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 sshd-session[221861]: Connection closed by 38.102.83.66 port 52502
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 130 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd[1]: session-124.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 47 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd[1]: session-73.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 sshd-session[223997]: Connection closed by 38.102.83.66 port 43708
Jan 26 09:36:03 compute-1 sshd-session[224754]: Connection closed by 38.102.83.66 port 50202
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 44 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd[1]: session-78.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 116 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd[1]: session-114.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd[1]: session-101.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd[1]: session-125.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd[1]: session-51.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 42 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 124 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd[1]: session-107.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd[1]: session-45.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd[1]: session-127.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd[1]: session-105.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd[1]: session-68.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd[1]: session-81.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd[1]: session-88.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 104 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 62 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 48 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 56 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 93 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 73 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 78 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 114 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 125 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 101 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 51 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 45 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 107 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 68 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 88 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 81 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 127 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 105 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 sshd-session[223712]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 sshd-session[227127]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 sshd-session[221792]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 sshd-session[222511]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 sshd-session[222541]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 sshd-session[221693]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 sshd-session[226505]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 130.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 48.
Jan 26 09:36:03 compute-1 systemd[1]: session-75.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd[1]: session-65.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd[1]: session-90.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 75 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 65 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd[1]: session-120.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd[1]: session-76.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd[1]: session-64.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 90 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd[1]: session-128.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 120 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 76 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 128 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 64 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 116.
Jan 26 09:36:03 compute-1 sshd-session[222668]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 sshd-session[224355]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 sshd-session[220616]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 44.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 56.
Jan 26 09:36:03 compute-1 systemd[1]: session-79.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd[1]: session-54.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 79 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 54 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 104.
Jan 26 09:36:03 compute-1 systemd[1]: session-99.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 42.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 99 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 93.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 62.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 47.
Jan 26 09:36:03 compute-1 sshd-session[220205]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 124.
Jan 26 09:36:03 compute-1 sshd-session[226337]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 sshd-session[224161]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 systemd[1]: session-118.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 sshd-session[221858]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 sshd-session[221973]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 systemd[1]: session-96.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd[1]: session-67.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd[1]: session-70.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 73.
Jan 26 09:36:03 compute-1 systemd[1]: session-50.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 118 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 96 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 70 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 67 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 50 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 sshd-session[223994]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 sshd-session[223807]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 78.
Jan 26 09:36:03 compute-1 sshd-session[220330]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 systemd[1]: session-91.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd[1]: session-94.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd[1]: session-53.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 114.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 94 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 91 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 53 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 101.
Jan 26 09:36:03 compute-1 sshd-session[226715]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 sshd-session[224751]: pam_unix(sshd:session): session closed for user zuul
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 125.
Jan 26 09:36:03 compute-1 systemd[1]: session-102.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 systemd[1]: session-122.scope: Deactivated successfully.
Jan 26 09:36:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:36:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:36:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:36:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:36:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:36:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:36:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:36:03.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:36:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:36:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:36:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:36:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:36:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:36:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:36:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:36:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:36:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:36:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 51.
Jan 26 09:36:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:36:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:36:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:36:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:36:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:36:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:36:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:36:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:36:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:36:03.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:36:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:36:03.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:36:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:36:03.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:36:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:36:03.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:36:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:36:03.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:36:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:36:03.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:36:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:36:03.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:36:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:36:03.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:36:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:36:03.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:36:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:36:03.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:36:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:36:03.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:36:03 compute-1 ceilometer_agent_compute[192784]: 2026-01-26 09:36:03.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 122 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Session 102 logged out. Waiting for processes to exit.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 107.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 45.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 127.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 105.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 68.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 81.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 88.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 75.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 65.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 90.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 120.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 76.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 64.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 128.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 79.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 54.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 99.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 118.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 96.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 67.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 70.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 50.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 91.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 94.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 53.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 102.
Jan 26 09:36:03 compute-1 systemd-logind[788]: Removed session 122.
Jan 26 09:36:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:36:05.358 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:36:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:36:05.358 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:36:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:36:05.359 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:36:06 compute-1 nova_compute[183083]: 2026-01-26 09:36:06.981 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:36:06 compute-1 nova_compute[183083]: 2026-01-26 09:36:06.984 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:36:06 compute-1 nova_compute[183083]: 2026-01-26 09:36:06.984 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5006 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:36:06 compute-1 nova_compute[183083]: 2026-01-26 09:36:06.984 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:36:07 compute-1 nova_compute[183083]: 2026-01-26 09:36:07.025 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:36:07 compute-1 nova_compute[183083]: 2026-01-26 09:36:07.026 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:36:12 compute-1 nova_compute[183083]: 2026-01-26 09:36:12.028 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:36:12 compute-1 nova_compute[183083]: 2026-01-26 09:36:12.032 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:36:12 compute-1 nova_compute[183083]: 2026-01-26 09:36:12.032 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:36:12 compute-1 nova_compute[183083]: 2026-01-26 09:36:12.033 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:36:12 compute-1 nova_compute[183083]: 2026-01-26 09:36:12.064 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:36:12 compute-1 nova_compute[183083]: 2026-01-26 09:36:12.066 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:36:13 compute-1 sshd-session[233962]: Connection closed by authenticating user root 178.62.249.31 port 33364 [preauth]
Jan 26 09:36:13 compute-1 ovn_controller[95352]: 2026-01-26T09:36:13Z|00404|pinctrl|WARN|Dropped 339 log messages in last 66 seconds (most recently, 10 seconds ago) due to excessive rate
Jan 26 09:36:13 compute-1 ovn_controller[95352]: 2026-01-26T09:36:13Z|00405|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Jan 26 09:36:17 compute-1 nova_compute[183083]: 2026-01-26 09:36:17.067 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:36:18 compute-1 podman[233965]: 2026-01-26 09:36:18.84423873 +0000 UTC m=+0.091999530 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 09:36:18 compute-1 podman[233968]: 2026-01-26 09:36:18.844550309 +0000 UTC m=+0.073038075 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 09:36:18 compute-1 podman[233967]: 2026-01-26 09:36:18.856352442 +0000 UTC m=+0.094000476 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 26 09:36:18 compute-1 podman[233966]: 2026-01-26 09:36:18.870565224 +0000 UTC m=+0.121386871 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.expose-services=, release=1755695350, vcs-type=git, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.6)
Jan 26 09:36:18 compute-1 podman[233964]: 2026-01-26 09:36:18.908336551 +0000 UTC m=+0.157410558 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 09:36:20 compute-1 sshd-session[234069]: Invalid user sol from 2.57.122.238 port 58848
Jan 26 09:36:21 compute-1 sshd-session[234069]: Connection closed by invalid user sol 2.57.122.238 port 58848 [preauth]
Jan 26 09:36:22 compute-1 nova_compute[183083]: 2026-01-26 09:36:22.069 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:36:22 compute-1 nova_compute[183083]: 2026-01-26 09:36:22.072 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:36:22 compute-1 nova_compute[183083]: 2026-01-26 09:36:22.072 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:36:22 compute-1 nova_compute[183083]: 2026-01-26 09:36:22.072 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:36:22 compute-1 nova_compute[183083]: 2026-01-26 09:36:22.102 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:36:22 compute-1 nova_compute[183083]: 2026-01-26 09:36:22.103 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:36:27 compute-1 nova_compute[183083]: 2026-01-26 09:36:27.104 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:36:27 compute-1 nova_compute[183083]: 2026-01-26 09:36:27.106 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:36:27 compute-1 nova_compute[183083]: 2026-01-26 09:36:27.106 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:36:27 compute-1 nova_compute[183083]: 2026-01-26 09:36:27.106 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:36:27 compute-1 nova_compute[183083]: 2026-01-26 09:36:27.145 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:36:27 compute-1 nova_compute[183083]: 2026-01-26 09:36:27.146 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:36:32 compute-1 nova_compute[183083]: 2026-01-26 09:36:32.146 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:36:32 compute-1 nova_compute[183083]: 2026-01-26 09:36:32.148 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:36:32 compute-1 nova_compute[183083]: 2026-01-26 09:36:32.148 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:36:32 compute-1 nova_compute[183083]: 2026-01-26 09:36:32.149 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:36:32 compute-1 nova_compute[183083]: 2026-01-26 09:36:32.184 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:36:32 compute-1 nova_compute[183083]: 2026-01-26 09:36:32.185 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:36:32 compute-1 podman[234071]: 2026-01-26 09:36:32.827359659 +0000 UTC m=+0.082768959 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 26 09:36:37 compute-1 nova_compute[183083]: 2026-01-26 09:36:37.186 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:36:37 compute-1 nova_compute[183083]: 2026-01-26 09:36:37.217 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:36:37 compute-1 nova_compute[183083]: 2026-01-26 09:36:37.218 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5032 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:36:37 compute-1 nova_compute[183083]: 2026-01-26 09:36:37.218 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:36:37 compute-1 nova_compute[183083]: 2026-01-26 09:36:37.218 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:36:40 compute-1 nova_compute[183083]: 2026-01-26 09:36:40.642 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:36:40 compute-1 nova_compute[183083]: 2026-01-26 09:36:40.642 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:36:40 compute-1 nova_compute[183083]: 2026-01-26 09:36:40.643 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 09:36:40 compute-1 nova_compute[183083]: 2026-01-26 09:36:40.643 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 09:36:40 compute-1 nova_compute[183083]: 2026-01-26 09:36:40.666 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 09:36:42 compute-1 nova_compute[183083]: 2026-01-26 09:36:42.250 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:36:42 compute-1 nova_compute[183083]: 2026-01-26 09:36:42.252 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:36:42 compute-1 nova_compute[183083]: 2026-01-26 09:36:42.252 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5032 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:36:42 compute-1 nova_compute[183083]: 2026-01-26 09:36:42.252 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:36:42 compute-1 nova_compute[183083]: 2026-01-26 09:36:42.252 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:36:46 compute-1 nova_compute[183083]: 2026-01-26 09:36:46.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:36:47 compute-1 nova_compute[183083]: 2026-01-26 09:36:47.254 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:36:47 compute-1 nova_compute[183083]: 2026-01-26 09:36:47.291 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:36:47 compute-1 nova_compute[183083]: 2026-01-26 09:36:47.292 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5038 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:36:47 compute-1 nova_compute[183083]: 2026-01-26 09:36:47.292 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:36:47 compute-1 nova_compute[183083]: 2026-01-26 09:36:47.292 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:36:47 compute-1 nova_compute[183083]: 2026-01-26 09:36:47.293 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:36:47 compute-1 nova_compute[183083]: 2026-01-26 09:36:47.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:36:48 compute-1 nova_compute[183083]: 2026-01-26 09:36:48.950 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:36:49 compute-1 podman[234098]: 2026-01-26 09:36:49.807687091 +0000 UTC m=+0.067226420 container health_status 1a259d9372633a90c6c1900ee6ddd68d7838f1e2b418051d1cc2a02699157a67 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 26 09:36:49 compute-1 podman[234099]: 2026-01-26 09:36:49.81756332 +0000 UTC m=+0.076803920 container health_status a008e47842e370c4799d1c9ae6b2864fdd5aa2e624ffb1fe71ea5f0bb077ae7e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter)
Jan 26 09:36:49 compute-1 podman[234101]: 2026-01-26 09:36:49.820049411 +0000 UTC m=+0.070224545 container health_status d5d034a3d0efbd00ce36995760e8dbc4151b8c85e67ba2205b88cf8fb06a1364 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 09:36:49 compute-1 podman[234100]: 2026-01-26 09:36:49.820181694 +0000 UTC m=+0.073677952 container health_status c9ec0704832cb220f06958ba3a2877ca46e8e1074865a3bd3ff177bbdfe5d2ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 26 09:36:49 compute-1 podman[234097]: 2026-01-26 09:36:49.86602694 +0000 UTC m=+0.128771839 container health_status 17ffdd1d9790f3fd2023ac60ec2209417927b2faaf5866e3a5291caeefe674b0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f4efb477b10d5de60ec32e71d7df4e2ac2d562e4dc5889fde33fa69c94ab1896-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, 
org.label-schema.schema-version=1.0, container_name=ovn_controller)
Jan 26 09:36:50 compute-1 nova_compute[183083]: 2026-01-26 09:36:50.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:36:50 compute-1 nova_compute[183083]: 2026-01-26 09:36:50.953 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_shelved_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:36:51 compute-1 nova_compute[183083]: 2026-01-26 09:36:51.952 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:36:52 compute-1 nova_compute[183083]: 2026-01-26 09:36:52.294 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:36:52 compute-1 nova_compute[183083]: 2026-01-26 09:36:52.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:36:52 compute-1 nova_compute[183083]: 2026-01-26 09:36:52.952 183087 DEBUG nova.compute.manager [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 09:36:57 compute-1 nova_compute[183083]: 2026-01-26 09:36:57.296 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:36:57 compute-1 nova_compute[183083]: 2026-01-26 09:36:57.298 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:36:57 compute-1 nova_compute[183083]: 2026-01-26 09:36:57.299 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:36:57 compute-1 nova_compute[183083]: 2026-01-26 09:36:57.299 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:36:57 compute-1 nova_compute[183083]: 2026-01-26 09:36:57.320 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:36:57 compute-1 nova_compute[183083]: 2026-01-26 09:36:57.321 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:37:00 compute-1 sshd-session[234201]: Connection closed by authenticating user root 178.62.249.31 port 55168 [preauth]
Jan 26 09:37:02 compute-1 nova_compute[183083]: 2026-01-26 09:37:02.322 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:37:02 compute-1 nova_compute[183083]: 2026-01-26 09:37:02.324 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:37:02 compute-1 nova_compute[183083]: 2026-01-26 09:37:02.325 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:37:02 compute-1 nova_compute[183083]: 2026-01-26 09:37:02.325 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:37:02 compute-1 nova_compute[183083]: 2026-01-26 09:37:02.346 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:37:02 compute-1 nova_compute[183083]: 2026-01-26 09:37:02.347 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:37:03 compute-1 sshd-session[234203]: Accepted publickey for zuul from 192.168.122.10 port 45002 ssh2: ECDSA SHA256:450/9vpr0ceOvZVOctbpkKs61WVhLJEBHyM4WO3O6A4
Jan 26 09:37:03 compute-1 systemd-logind[788]: New session 149 of user zuul.
Jan 26 09:37:03 compute-1 systemd[1]: Started Session 149 of User zuul.
Jan 26 09:37:03 compute-1 sshd-session[234203]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 09:37:03 compute-1 podman[234205]: 2026-01-26 09:37:03.13313877 +0000 UTC m=+0.072410407 container health_status 56a044a666a3189db1d05aecadfeac84045e044fc8c908118833f07bb37f1bf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6d577236301a1c208fb2b3c6764e9f3be55901c7d86b2336ddbfca087db2a95e-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 09:37:03 compute-1 sudo[234229]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Jan 26 09:37:03 compute-1 sudo[234229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 09:37:04 compute-1 nova_compute[183083]: 2026-01-26 09:37:04.951 183087 DEBUG oslo_service.periodic_task [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 09:37:05 compute-1 nova_compute[183083]: 2026-01-26 09:37:05.268 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:37:05 compute-1 nova_compute[183083]: 2026-01-26 09:37:05.269 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:37:05 compute-1 nova_compute[183083]: 2026-01-26 09:37:05.270 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:37:05 compute-1 nova_compute[183083]: 2026-01-26 09:37:05.270 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 09:37:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:37:05.358 104632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:37:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:37:05.359 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:37:05 compute-1 ovn_metadata_agent[104627]: 2026-01-26 09:37:05.359 104632 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:37:05 compute-1 nova_compute[183083]: 2026-01-26 09:37:05.471 183087 WARNING nova.virt.libvirt.driver [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 09:37:05 compute-1 nova_compute[183083]: 2026-01-26 09:37:05.472 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13754MB free_disk=113.08368301391602GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 09:37:05 compute-1 nova_compute[183083]: 2026-01-26 09:37:05.473 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 09:37:05 compute-1 nova_compute[183083]: 2026-01-26 09:37:05.473 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 09:37:05 compute-1 nova_compute[183083]: 2026-01-26 09:37:05.559 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 09:37:05 compute-1 nova_compute[183083]: 2026-01-26 09:37:05.559 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 09:37:05 compute-1 nova_compute[183083]: 2026-01-26 09:37:05.601 183087 DEBUG nova.compute.provider_tree [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed in ProviderTree for provider: 5203935e-446c-4e03-93fa-4c60d651e045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 09:37:05 compute-1 nova_compute[183083]: 2026-01-26 09:37:05.625 183087 DEBUG nova.scheduler.client.report [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Inventory has not changed for provider 5203935e-446c-4e03-93fa-4c60d651e045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 09:37:05 compute-1 nova_compute[183083]: 2026-01-26 09:37:05.628 183087 DEBUG nova.compute.resource_tracker [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 09:37:05 compute-1 nova_compute[183083]: 2026-01-26 09:37:05.629 183087 DEBUG oslo_concurrency.lockutils [None req-fd8e5a92-f388-4d90-8b22-69d3fe61c77c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 09:37:07 compute-1 nova_compute[183083]: 2026-01-26 09:37:07.348 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:37:07 compute-1 nova_compute[183083]: 2026-01-26 09:37:07.349 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:37:07 compute-1 nova_compute[183083]: 2026-01-26 09:37:07.349 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 26 09:37:07 compute-1 nova_compute[183083]: 2026-01-26 09:37:07.349 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:37:07 compute-1 nova_compute[183083]: 2026-01-26 09:37:07.350 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 26 09:37:07 compute-1 nova_compute[183083]: 2026-01-26 09:37:07.351 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 09:37:08 compute-1 ovs-vsctl[234405]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 26 09:37:08 compute-1 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 234256 (sos)
Jan 26 09:37:08 compute-1 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Jan 26 09:37:08 compute-1 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Jan 26 09:37:09 compute-1 virtqemud[182752]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 26 09:37:09 compute-1 virtqemud[182752]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 26 09:37:09 compute-1 virtqemud[182752]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 26 09:37:10 compute-1 crontab[234793]: (root) LIST (root)
Jan 26 09:37:12 compute-1 nova_compute[183083]: 2026-01-26 09:37:12.351 183087 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 09:37:12 compute-1 systemd[1]: Starting Hostname Service...
Jan 26 09:37:12 compute-1 systemd[1]: Started Hostname Service.
Jan 26 09:37:13 compute-1 ovn_controller[95352]: 2026-01-26T09:37:13Z|00406|pinctrl|WARN|Dropped 31 log messages in last 60 seconds (most recently, 15 seconds ago) due to excessive rate
Jan 26 09:37:13 compute-1 ovn_controller[95352]: 2026-01-26T09:37:13Z|00407|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
